Because they use the Perl Getopt::Mixed module, all of the LinkController command line programs accept standard POSIX-style command line options. At least two options, `--help' and `--version', are implemented by every program.
You can use the `--help' option to get help on each program, for example:
extract-links --help
will give something like
extract-links [arguments] [url-base [file-base]]

 -V --version                 Give version information for this program
 -h --help --usage            Describe usage of this program.
    --help-opt=OPTION         Give help information for a given option
 -v --verbose[=VERBOSITY]     Give information about what the program is doing.
                              Set value to control what information is given.
 -e --exclude-regex=REGEX     Exclude expression for excluding files.
 -p --prune-regex=REGEX       Regular expression for excluding entire directories.
 -d --default-infostrucs      Handle all default infostrucs (as well as ones
                              listed on command line)
 -l --link-database=FILENAME  Database to create link records into.
 -c --config-file=FILENAME    Load in an additional configuration file
 -o --out-url-list=FILENAME   File to output the URL of each link found to
 -i --in-url-list=FILENAME    File to input URLs from to create links

Extract the link and index information from a directory containing HTML files or from a set of WWW pages with URLs which begin with the given URL and which can be found by starting from that URL and searching other such pages.
You can then use that information to get the program to do what you want.
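The `--version' option likewise works the same way with every program and just reports version information, for instance:

  link-report --version
  test-link --version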
F.1 Invoking link-report      Command line options for the link-report program
F.2 Invoking test-link        Command line options for the test-link program
F.3 Invoking extract-links    Command line options for the extract-links program
F.4 Invoking fix-link         Command line options for the fix-link program
F.5 Invoking build-schedule   Command line options for the build-schedule program
F.1 Invoking link-report
The `link-report' program prints out status information about links, allowing the user to see what needs to be fixed. By default it prints out all of the links that currently occur on the user's web pages and that are either broken or redirected.
Before running `link-report' you should probably use `test-link' (see section F.2 Invoking test-link) to check which links are broken. That may not be needed if your system administrator does it for you. After you have identified broken links you may want to use `fix-link' (see section F.4 Invoking fix-link) to repair them.
FIXME this section should give a better description of each option.
link-report [options]

 -V --version                  Give version information for this program
 -h --help --usage             Describe usage of this program.
    --help-opt=OPTION          Give help information for a given option
 -v --verbose[=VERBOSITY]      Give information about what the program is doing.
                               Set value to control what information is given.
 -U --uri=URIs                 Give URIs which are to be reported on.
 -f --uri-file=FILENAME        Read all URIs in a file (one URI per line).
 -E --uri-exclude=EXCLUDE-RE   Add a regular expression for URIs to ignore.
 -I --uri-include=INCLUDE-RE   Give regular expression for URIs to check (if this
                               option is given others aren't checked).
 -e --page-exclude=EXCLUDE-RE  Add a regular expression for pages to ignore.
 -i --page-include=INCLUDE-RE  Give regular expression for pages to check (if this
                               option is given others aren't checked).
 -a --all-links                Report information about every URI.
 -b --broken                   Report links which are considered broken.
 -n --not-perfect              Report any URI which wasn't okay at last test.
 -r --redirected               Report links which are redirected.
 -o --okay                     Report links which have been tested okay.
 -d --disallowed               Report links for which testing isn't allowed.
 -u --unsupported              Report links which we don't know how to test.
 -m --ignore-missing           Don't complain about links which aren't in the database.
 -g --good                     Report links which are probably worth listing.
 -N --no-pages                 Report without page list.
    --config-file=FILENAME     Load in an additional configuration file
    --link-index=FILENAME      Use the given file as the index of which file has what link.
    --link-database=FILENAME   Use the given file as the dbm containing links.
 -l --long-list                Where possible, identify the file and long list it
                               (implies infostructure).  This is used for emacs
                               check-all-dired.
 -R --uri-report               Print URIs on separate lines for each link.
 -H --html                     Report status of links in HTML format.

Report on the status of links, getting the links either from the database, from the index file or from the command line.
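For example, to list just the broken and redirected links, or to write an HTML status report covering every link in the database, invocations along the following lines should work (the output filename is only an illustration):

  link-report --broken --redirected
  link-report --all-links --html > link-status.html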
F.2 Invoking test-link
The `test-link' program tests all of the links in the LinkController database, storing information about any problems found. It works as a robot, contacting the servers where the target of each link is stored and verifying that the resource the link points to is really there.
Before running `test-link' you should probably use `extract-links' (see section F.3 Invoking extract-links) to collect all of the links you want to test, and then `build-schedule' (see section F.5 Invoking build-schedule) to schedule them for testing.
FIXME this section should give a better description of each option.
test-link [arguments]

 -V --version                 Give version information for this program
 -h --help --usage            Describe usage of this program.
    --help-opt=OPTION         Give help information for a given option
 -v --verbose[=VERBOSITY]     Give information about what the program is doing.
                              Set value to control what information is given.
 -q --quiet --silent          Program should generate no output except in case of error.
    --no-warn                 Avoid issuing warnings about non-fatal problems.
 -c --config-file=FILENAME    Load in an additional configuration file
 -u --user-address=STRING     Email address for user running link testing.
 -H --halt-time=MINUTES       Stop after given number of minutes.
    --never-stop              Keep running without stopping.
    --no-robot                Don't follow robot rules.  Dangerous!!!
 -w --no-waitre=NETLOC-REGEX  Home HOST regex: no robot rules (danger?)!!!
    --test-now                Test links now, not when scheduled (testing only).
    --untested                Test all links which have not been tested.
    --sequential              Put links into schedule in order tested (for testing).
 -m --max-links=INTEGER       Maximum number of links to test (-1 = no limit).

Read the link database and test those links which are due to have been tested, exiting when the next link to be tested is due after the program start time.

Don't use --no-robot, except when you are doing local testing (that is, you aren't connected to the internet proper). Don't use --never-stop or --test-now except when you are watching what is happening.
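As an illustration, a run that identifies itself with your email address and stops after an hour, or one that only tests links which have never been tested, might look like this (the address and the numbers are examples only):

  test-link --user-address=webmaster@example.com --halt-time=60
  test-link --untested --max-links=100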
F.3 Invoking extract-links
The `extract-links' program walks through the user's web pages, collecting all of the links from those pages and storing them in a database for later checking by the `test-link' program (see section F.2 Invoking test-link). It can also list the links found into a given file.
After running `extract-links' you should use `build-schedule' (see section F.5 Invoking build-schedule), which will make sure that any new links discovered are scheduled for checking.
FIXME this section should give a better description of each option.
extract-links [arguments] [url-base [file-base]]

 -V --version                 Give version information for this program
 -h --help --usage            Describe usage of this program.
    --help-opt=OPTION         Give help information for a given option
 -v --verbose[=VERBOSITY]     Give information about what the program is doing.
                              Set value to control what information is given.
 -q --quiet --silent          Program should generate no output except in case of error.
 -e --exclude-regex=REGEX     Exclude expression for excluding files.
 -p --prune-regex=REGEX       Regular expression for excluding entire directories.
 -d --default-infostrucs      Handle all default infostrucs (as well as ones
                              listed on command line)
 -l --link-database=FILENAME  Database to create link records into.
 -c --config-file=FILENAME    Load in an additional configuration file
 -o --out-url-list=FILENAME   File to output the URL of each link found to
 -i --in-url-list=FILENAME    File to input URLs from to create links

Extract the link and index information from a directory containing HTML files or from a set of WWW pages with URLs which begin with the given URL and which can be found by starting from that URL and searching other such pages.
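For example, assuming your pages are served as http://www.example.com/ and live in the directory /var/www/html (both purely illustrative), you might extract links with something like:

  extract-links http://www.example.com/ /var/www/html
  extract-links --out-url-list=found-urls.txt http://www.example.com/ /var/www/html

The second form also writes the URL of each link found to the named file.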
F.4 Invoking fix-link
The `fix-link' program is designed to repair a broken link across all of the files which LinkController is managing. It does this by looking up the index files to see which files contain the broken link, then doing a textual substitution in each of those files. This makes it much faster than searching through all of the files in a set of web pages to find which pages have the broken link.
In order to work properly, `extract-links' (see section F.3 Invoking extract-links) must have been run first to build up the index databases used by `fix-link'.
fix-link [options] old-link new-link

 -V --version               Give version information for this program
 -h --help --usage          Describe usage of this program.
    --help-opt=OPTION       Give help information for a given option
 -v --verbose[=VERBOSITY]   Give information about what the program is doing.
                            Set value to control what information is given.
 -q --quiet --silent        Program should generate no output except in case of error.
    --no-warn               Avoid issuing warnings about non-fatal problems.
    --directory=DIRNAME     Correct all files in the given directory.
 -r --relative              Fix relative links (expensive??).
 -t --tree                  Fix the link and any others based on it.
 -b --base=FILENAME         Base URI of the document or directory to be fixed.
    --config-file=FILENAME  Load in an additional configuration file

Replace any occurrences of OLD-LINK with NEW-LINK, using the link index file to locate which files OLD-LINK occurs in.
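For example, to replace every occurrence of one URL with another across the files listed in the index (the URLs here are purely illustrative):

  fix-link http://old.example.com/page.html http://www.example.com/new-page.html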
F.5 Invoking build-schedule
The `build-schedule' program makes a schedule for testing links. If run with no options it will make sure that all the links in the LinkController database will be checked at some point in the future.
Before running `build-schedule' you should probably use `extract-links' (see section F.3 Invoking extract-links) to collect all of the links you want to test. Afterwards you should use `test-link' (see section F.2 Invoking test-link) to check which ones are broken.
build-schedule [options]

 -V --version                Give version information for this program
 -h --help --usage           Describe usage of this program.
    --help-opt=OPTION        Give help information for a given option
 -v --verbose[=VERBOSITY]    Give information about what the program is doing.
                             Set value to control what information is given.
 -q --quiet --silent         Program should generate no output except in case of error.
    --no-warn                Avoid issuing warnings about non-fatal problems.
 -l --url-list=FILENAME      File with complete list of URLs to schedule
 -s --schedule=FILENAME      Override location of the schedule
 -t --spread-time=SECONDS    Time over which to spread checking; default 10 days
 -S --start-offset=SECONDS   Time offset from now for starting work (can be negative)
 -d --ignore-db              Set the time with no regard to current setting
 -i --ignore-link            Set the time with no regard to link status
    --config-file=FILENAME   Load in an additional configuration file

Examine a database of link objects and build a schedule, which can be used by test-link, to check those links.
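For example, to schedule everything with the defaults, or to spread the checking over one week (604800 seconds) rather than the default ten days:

  build-schedule
  build-schedule --spread-time=604800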