Documentation for packages that contain /usr/share/doc/libwww-robotrules-perl/copyright:

Package: libwww-robotrules-perl

Description: database of robots.txt-derived permissions
WWW::RobotRules parses /robots.txt files as specified in "A Standard for Robot Exclusion", at <http://www.robotstxt.org/wc/norobots.html>. Webmasters can use the /robots.txt file to forbid conforming robots from accessing parts of their web site.
The parsed files are kept in a WWW::RobotRules object, and this object provides methods to check if access to a given URL is prohibited. The same WWW::RobotRules object can be used for one or more parsed /robots.txt files on any number of hosts.
Homepage: https://metacpan.org/release/WWW-RobotRules
copyright | changelog | Debian changelog
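
A minimal usage sketch, based on the methods named in the module's
documentation (new, parse, and allowed); the user-agent name and URLs
below are placeholders, not values taken from this page:

    use WWW::RobotRules;
    use LWP::Simple qw(get);

    # Identify the robot by its user-agent name; robots.txt rules
    # that target this name (or "*") will apply to it.
    my $rules = WWW::RobotRules->new('MyRobot/1.0');

    # Fetch and parse the robots.txt of a host we intend to visit.
    my $url = 'http://example.com/robots.txt';
    my $robots_txt = get($url);
    $rules->parse($url, $robots_txt) if defined $robots_txt;

    # Later, check any URL on that host before fetching it.
    my $target = 'http://example.com/private/page.html';
    if ($rules->allowed($target)) {
        # safe to fetch $target
    }

The same $rules object can then be fed further /robots.txt files from
other hosts via additional parse() calls.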

Manual pages:

WWW::RobotRules(3pm) WWW::RobotRules::AnyDBM_File(3pm)
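
WWW::RobotRules::AnyDBM_File is a subclass that keeps the parsed rules
in a DBM file so they persist between runs. A brief sketch, assuming a
writable cache file name of your choosing ('robot-cache' here is
illustrative):

    use WWW::RobotRules::AnyDBM_File;

    # Same interface as WWW::RobotRules, but backed by a DBM file;
    # parsed rules persist between program runs.
    my $rules = WWW::RobotRules::AnyDBM_File->new('MyRobot/1.0', 'robot-cache');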

Other documents:

/usr/share/doc/libwww-robotrules-perl

Generated by dwww version 1.15 on Sun Jun 16 21:16:33 CEST 2024.