path: root/perl/perl-www-robotrules/README
author     ponce <matteo.bernardini@gmail.com>  2012-08-25 18:36:51 +0200
committer  ponce <matteo.bernardini@gmail.com>  2012-08-25 18:36:51 +0200
commit     c26e2faf5d476ddb4d9a8ca317c83aa3f5c540f8 (patch)
tree       86381178199713dce58fb22ad83d4446be91f4f0  /perl/perl-www-robotrules/README
parent     cb3e51a2b562d7256d92150a7fa61f69c840635b (diff)
perl/perl-uri-escape: Fixed dep information
Diffstat (limited to 'perl/perl-www-robotrules/README')
-rw-r--r--  perl/perl-www-robotrules/README  2
1 file changed, 0 insertions, 2 deletions
diff --git a/perl/perl-www-robotrules/README b/perl/perl-www-robotrules/README
index 54915b3f0b47b..7dee780d0f40a 100644
--- a/perl/perl-www-robotrules/README
+++ b/perl/perl-www-robotrules/README
@@ -6,5 +6,3 @@ The parsed files are kept in a WWW::RobotRules object, and this
object provides methods to check if access to a given URL is
prohibited. The same WWW::RobotRules object can be used for one
or more parsed /robots.txt files on any number of hosts.
-
-This requires perl-uri-escape.
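
For context, a minimal sketch of the API the README above describes (this is not part of the commit; it assumes the standard WWW::RobotRules methods new/parse/allowed and LWP::Simple for fetching, and the URLs are placeholders):

    use WWW::RobotRules;
    use LWP::Simple qw(get);

    # Create a rules object identified by our robot's user agent name.
    my $rules = WWW::RobotRules->new('MyBot/1.0');

    # Fetch and parse a site's /robots.txt; the same object can hold
    # parsed rules for any number of hosts.
    my $robots_url = 'http://example.com/robots.txt';
    my $robots_txt = get($robots_url);
    $rules->parse($robots_url, $robots_txt) if defined $robots_txt;

    # Check whether access to a specific URL is prohibited.
    if ($rules->allowed('http://example.com/some/page.html')) {
        print "allowed\n";
    } else {
        print "disallowed\n";
    }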