Most of them were found and fixed using codespell.
Signed-off-by: Stefan Weil <sw@weilnetz.de>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Signed-off-by: Jonathan Druart <jonathan.druart@koha-community.org>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
# RETURN:
-# A 4-dimensionnal array in $self->{top_terms}
+# A 4-dimensional array in $self->{top_terms}
# [0] term
-# [1] term number of occurences
+# [1] term number of occurrences
# [2] term proportional relative weight in terms set E[0-1]
# [3] term logarithmic relative weight E [0-levels_cloud]
#
# This array is sorted alphabetically by terms ([0])
-# It can be easily sorted by occurences:
+# It can be easily sorted by occurrences:
-# @t = sort { $a[1] <=> $a[1] } @{$self->{top_terms}};
+# @t = sort { $a->[1] <=> $b->[1] } @{$self->{top_terms}};
#
sub scan {
#
-# Returns a HTML version of index top terms formated
+# Returns an HTML version of index top terms formatted
# as a 'tag cloud'.
#
sub html_cloud {
=head1 IMPROVEMENTS
-Generated top terms have more informations than those outputted from
-the time beeing. Some parameters could be easily added to improve
+Generated top terms have more information than is currently output.
+Some parameters could easily be added to improve
this script:
=over
=item B<WithCount>
-In order to output terms with the number of occurences they
-have been found in Koha Catalogue by Zebra.
+In order to output terms with the number of occurrences
+found in the Koha Catalogue by Zebra.
=item B<CloudLevels>
# This script loops through each overdue item, determines the fine,
# and updates the total amount of fines due by each user. It relies on
# the existence of /tmp/fines, which is created by ???
-# Doesnt really rely on it, it relys on being able to write to /tmp/
+# Doesn't really rely on it, it relies on being able to write to /tmp/
# It creates the fines file
#
# This script is meant to be run nightly out of cron.
-Produces html data. If patron does not have an email address or
+Produces HTML data. If the patron does not have an email address or the
-n (no mail) flag is set, an HTML file is generated in the specified
-directory. This can be downloaded or futher processed by library staff.
+directory. This can be downloaded or further processed by library staff.
The file will be called notices-YYYY-MM-DD.html and placed in the directory
specified.
-Produces plain text data. If patron does not have an email address or
+Produces plain text data. If the patron does not have an email address or the
-n (no mail) flag is set, a text file is generated in the specified
-directory. This can be downloaded or futher processed by library staff.
+directory. This can be downloaded or further processed by library staff.
The file will be called notices-YYYY-MM-DD.txt and placed in the directory
specified.
=head1 SEE ALSO
The F<misc/cronjobs/advance_notices.pl> program allows you to send
-messages to patrons in advance of thier items becoming due, or to
+messages to patrons in advance of their items becoming due, or to
alert them of items that have just become due.
=cut
config file and the template file.
A config file is divided into three sections; channel, image, and
-config. A section begins with the name of the section occuring alone
+config. A section begins with the name of the section occurring alone
on a line, and ends with the beginning of the next section (or the end
-of the file). Each of these sections contains series of configuration
+of the file). Each of these sections contains a series of configuration
options in the form:
# This script loops through each overdue item, determines the fine,
# and updates the total amount of fines due by each user. It relies on
# the existence of /tmp/fines, which is created by ???
-# Doesnt really rely on it, it relys on being able to write to /tmp/
+# Doesn't really rely on it, it relies on being able to write to /tmp/
# It creates the fines file
#
# This script is meant to be run nightly out of cron.
--db_passwd=db-pass ...
-The command in usually called from the root directory for the Koha source tree.
+The command is usually called from the root directory for the Koha source tree.
-If you are runing from another directory, use the --path switch to specify
+If you are running from another directory, use the --path switch to specify
a different path.
=head1 OPTIONS
print "--------------\n";
print "Koha circulation benchmarking utility\n";
print "--------------\n";
-print "Benchmarking with $max_tries occurences of each operation and $concurrency concurrent sessions \n";
+print "Benchmarking with $max_tries occurrences of each operation and $concurrency concurrent sessions\n";
print "Load testing staff client dashboard page";
for (my $i=1;$i<=$max_tries;$i++) {
push @mainpage,"$baseurl/mainpage.pl";
=item B<--offset=N>
-Like an OFFSET statement in SQL, this tells the script to skip N of the targetted records.
+Like an OFFSET statement in SQL, this tells the script to skip N of the targeted records.
The default is 0, i.e. skip none of them.
=back
}
-# Disable logging for the biblios and authorities import operation. It would unnecesarily
+# Disable logging for the biblios and authorities import operation. It would unnecessarily
# slow the import
# Disable the syspref cache so we can change logging settings
=item B<-k, -keepids>=<FIELD>
-Field store ids in I<FIELD> (usefull for authorities, where 001 contains the
-authid for Koha, that can contain a very valuable info for authorities coming
-from LOC or BNF. useless for biblios probably)
+Store ids in I<FIELD> (useful for authorities, where 001 contains the
+authid for Koha, which can contain very valuable info for authorities coming
+from LOC or BNF; probably useless for biblios)
=item B<-keepids>
-Store ids in 009 (usefull for authorities, where 001 contains the authid for
-Koha, that can contain a very valuable info for authorities coming from LOC or
-BNF. useless for biblios probably)
+Store ids in 009 (useful for authorities, where 001 contains the authid for
+Koha, which can contain very valuable info for authorities coming from LOC or
+BNF; probably useless for biblios)
require "koha-svc.pl"
-at begining of script. Rest of API is described below. Example of it's usage is at beginning of this script.
+at the beginning of the script. The rest of the API is described below. An example of its usage is at the beginning of this script.
=head2 new
$ENV{MEMCACHED_SERVERS} = "localhost:11211";
#$ENV{MEMCACHED_DEBUG} = 0;
-$ENV{PROFILE_PER_PAGE} = 1; # reset persistant and profile counters after each page, like CGI
+$ENV{PROFILE_PER_PAGE} = 1; # reset persistent and profile counters after each page, like CGI
#$ENV{INTRANET} = 1; # usually passed from script
#$ENV{DBI_AUTOPROXY}='dbi:Gofer:transport=null;cache=DBI::Util::CacheMemory'