<HTML><HEAD><TITLE>Recipe 20.11. Creating a Robot (Perl Cookbook)</TITLE>
<META NAME="DC.title" CONTENT="Perl Cookbook">
<META NAME="DC.creator" CONTENT="Tom Christiansen & Nathan Torkington">
<META NAME="DC.publisher" CONTENT="O'Reilly & Associates, Inc.">
<META NAME="DC.date" CONTENT="1999-07-02T01:46:02Z">
<META NAME="DC.type" CONTENT="Text.Monograph">
<META NAME="DC.format" CONTENT="text/html" SCHEME="MIME">
<META NAME="DC.source" CONTENT="1-56592-243-3" SCHEME="ISBN">
<META NAME="DC.language" CONTENT="en-US">
<META NAME="generator" CONTENT="Jade 1.1/O'Reilly DocBook 3.0 to HTML 4.0">
<LINK REV="made" HREF="mailto:online-books@oreilly.com" TITLE="Online Books Comments">
<LINK REL="up" HREF="ch20_01.htm" TITLE="20. Web Automation">
<LINK REL="prev" HREF="ch20_11.htm" TITLE="20.10. Mirroring Web Pages">
<LINK REL="next" HREF="ch20_13.htm" TITLE="20.12. Parsing a Web Server Log File">
</HEAD><BODY BGCOLOR="#FFFFFF">
<img alt="Book Home" border="0" src="gifs/smbanner.gif" usemap="#banner-map" />
<map name="banner-map"><area shape="rect" coords="1,-2,616,66" href="index.htm" alt="Perl Cookbook"><area shape="rect" coords="629,-11,726,25" href="jobjects/fsearch.htm" alt="Search this book" /></map>
<div class="navbar"><p><TABLE WIDTH="684" BORDER="0" CELLSPACING="0" CELLPADDING="0"><TR>
<TD ALIGN="LEFT" VALIGN="TOP" WIDTH="228"><A CLASS="sect1" HREF="ch20_11.htm" TITLE="20.10. Mirroring Web Pages"><IMG SRC="../gifs/txtpreva.gif" ALT="Previous: 20.10. Mirroring Web Pages" BORDER="0"></A></TD>
<TD ALIGN="CENTER" VALIGN="TOP" WIDTH="228"><B><FONT FACE="ARIEL,HELVETICA,HELV,SANSERIF" SIZE="-1"><A CLASS="chapter" REL="up" HREF="ch20_01.htm" TITLE="20. Web Automation">20. Web Automation</A></FONT></B></TD>
<TD ALIGN="RIGHT" VALIGN="TOP" WIDTH="228"><A CLASS="sect1" HREF="ch20_13.htm" TITLE="20.12. Parsing a Web Server Log File"><IMG SRC="../gifs/txtnexta.gif" ALT="Next: 20.12. Parsing a Web Server Log File" BORDER="0"></A></TD>
</TR></TABLE></DIV>
<DIV CLASS="sect1"><H2 CLASS="sect1"><A CLASS="title" NAME="ch20-chap20_creating_1">20.11. 
Creating a Robot</A></H2>
<DIV CLASS="sect2"><H3 CLASS="sect2"><A CLASS="title" NAME="ch20-pgfId-1241">Problem<A CLASS="indexterm" NAME="ch20-idx-1000002667-0"></A><A CLASS="indexterm" NAME="ch20-idx-1000002667-1"></A><A CLASS="indexterm" NAME="ch20-idx-1000002667-2"></A><A CLASS="indexterm" NAME="ch20-idx-1000002667-3"></A></A></H3>
<P CLASS="para">You want to create a script that navigates the Web on its own (i.e., a robot), and you'd like to respect the remote sites' wishes.</P></DIV>
<DIV CLASS="sect2"><H3 CLASS="sect2"><A CLASS="title" NAME="ch20-pgfId-1247">Solution</A></H3>
<P CLASS="para">Instead of writing your robot to use LWP::UserAgent, have it use <A CLASS="indexterm" NAME="ch20-idx-1000002668-0"></A>LWP::RobotUA:</P>
<PRE CLASS="programlisting">use LWP::RobotUA;
$ua = LWP::RobotUA->new('websnuffler/0.1', 'me@wherever.com');</PRE></DIV>
<DIV CLASS="sect2"><H3 CLASS="sect2"><A CLASS="title" NAME="ch20-pgfId-1257">Discussion</A></H3>
<P CLASS="para">To avoid having marauding robots and web crawlers hammer their servers, sites are encouraged to create a file with access rules called robots.txt. If you're fetching only one document with your script, this is no big deal, but if your script is going to fetch many documents, probably from the same server, you could easily exhaust that site's bandwidth.</P>
<P CLASS="para">When you create your own scripts to run around the Web, it's important to be a good net citizen. That means two things: don't request documents from the same server too often, and heed the advisory access rules in their robots.txt file.</P>
<P CLASS="para">The easiest way to handle this is to use the LWP::RobotUA module to create agents instead of LWP::UserAgent. This agent automatically knows to pull things slowly when repeatedly calling the same server. It also checks each site's robots.txt file to see whether you're trying to grab a file that is off limits. 
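</P><P CLASS="para">Putting the pieces together, a minimal polite robot might look like the following sketch (the target URL and the 10-second pause are illustrative assumptions, not part of the recipe; the <CODE CLASS="literal">delay</CODE> method takes its argument in minutes):</P>
<PRE CLASS="programlisting">use LWP::RobotUA;
use HTTP::Request;

my $ua = LWP::RobotUA->new('websnuffler/0.1', 'me@wherever.com');
$ua->delay(10/60);    # minimum time between requests to one host, in minutes

# robots.txt is fetched and honored for you behind the scenes
my $req = HTTP::Request->new(GET => 'http://www.example.com/');
my $res = $ua->request($req);
if ($res->is_success) {
    print $res->content;
} else {
    print "Couldn't fetch: ", $res->status_line, "\n";
}</PRE><P CLASS="para">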
If you do, you'll get back a response like this:</P>
<PRE CLASS="programlisting">403 (Forbidden) Forbidden by robots.txt</PRE>
<P CLASS="para">Here's an example robots.txt file, fetched using the GET program that comes with the LWP module suite:</P>
<PRE CLASS="programlisting">% GET http://www.webtechniques.com/robots.txt
<CODE CLASS="userinput"><B><CODE CLASS="replaceable"><I>User-agent: *</I></CODE></B></CODE>
<CODE CLASS="userinput"><B><CODE CLASS="replaceable"><I>Disallow: /stats</I></CODE></B></CODE>
<CODE CLASS="userinput"><B><CODE CLASS="replaceable"><I>Disallow: /db</I></CODE></B></CODE>
<CODE CLASS="userinput"><B><CODE CLASS="replaceable"><I>Disallow: /logs</I></CODE></B></CODE>
<CODE CLASS="userinput"><B><CODE CLASS="replaceable"><I>Disallow: /store</I></CODE></B></CODE>
<CODE CLASS="userinput"><B><CODE CLASS="replaceable"><I>Disallow: /forms</I></CODE></B></CODE>
<CODE CLASS="userinput"><B><CODE CLASS="replaceable"><I>Disallow: /gifs</I></CODE></B></CODE>
<CODE CLASS="userinput"><B><CODE CLASS="replaceable"><I>Disallow: /wais-src</I></CODE></B></CODE>
<CODE CLASS="userinput"><B><CODE CLASS="replaceable"><I>Disallow: /scripts</I></CODE></B></CODE>
<CODE CLASS="userinput"><B><CODE CLASS="replaceable"><I>Disallow: /config</I></CODE></B></CODE></PRE>
<P CLASS="para">A more interesting and extensive example is at <A CLASS="systemitem.url" HREF="http://www.cnn.com/robots.txt">http://www.cnn.com/robots.txt</A>. This file is so big, they even keep it under RCS control!</P>
<PRE CLASS="programlisting">% GET http://www.cnn.com/robots.txt | head
<CODE CLASS="userinput"><B><CODE CLASS="replaceable"><I># robots, scram</I></CODE></B></CODE>
<CODE CLASS="userinput"><B><CODE CLASS="replaceable"><I># $Id: robots.txt,v 1.2 1998/03/10 18:27:01 mreed Exp $</I></CODE></B></CODE>
<CODE CLASS="userinput"><B><CODE CLASS="replaceable"><I>User-agent: *</I></CODE></B></CODE>
<CODE CLASS="userinput"><B><CODE CLASS="replaceable"><I>Disallow: 
/</I></CODE></B></CODE>
<CODE CLASS="userinput"><B><CODE CLASS="replaceable"><I>User-agent: Mozilla/3.01 (hotwired-test/0.1)</I></CODE></B></CODE>
<CODE CLASS="userinput"><B><CODE CLASS="replaceable"><I>Disallow: /cgi-bin</I></CODE></B></CODE>
<CODE CLASS="userinput"><B><CODE CLASS="replaceable"><I>Disallow: /TRANSCRIPTS</I></CODE></B></CODE>
<CODE CLASS="userinput"><B><CODE CLASS="replaceable"><I>Disallow: /development</I></CODE></B></CODE>
<A CLASS="indexterm" NAME="ch20-idx-1000002670-0"></A><A CLASS="indexterm" NAME="ch20-idx-1000002670-1"></A><A CLASS="indexterm" NAME="ch20-idx-1000002670-2"></A><A CLASS="indexterm" NAME="ch20-idx-1000002670-3"></A></PRE></DIV>
<DIV CLASS="sect2"><H3 CLASS="sect2"><A CLASS="title" NAME="ch20-pgfId-1317">See Also</A></H3>
<P CLASS="para">The documentation for the CPAN module LWP::RobotUA(3); <A CLASS="systemitem.url" HREF="http://info.webcrawler.com/mak/projects/robots/robots.html">http://info.webcrawler.com/mak/projects/robots/robots.html</A> for a description of how well-behaved robots act</P></DIV></DIV>
<DIV CLASS="htmlnav"><P></P><HR ALIGN="LEFT" WIDTH="684" TITLE="footer">
<TABLE WIDTH="684" BORDER="0" CELLSPACING="0" CELLPADDING="0"><TR>
<TD ALIGN="LEFT" VALIGN="TOP" WIDTH="228"><A CLASS="sect1" HREF="ch20_11.htm" TITLE="20.10. Mirroring Web Pages"><IMG SRC="../gifs/txtpreva.gif" ALT="Previous: 20.10. Mirroring Web Pages" BORDER="0"></A></TD>
<TD ALIGN="CENTER" VALIGN="TOP" WIDTH="228"><A CLASS="book" HREF="index.htm" TITLE="Perl Cookbook"><IMG SRC="../gifs/txthome.gif" ALT="Perl Cookbook" BORDER="0"></A></TD>
<TD ALIGN="RIGHT" VALIGN="TOP" WIDTH="228"><A CLASS="sect1" HREF="ch20_13.htm" TITLE="20.12. Parsing a Web Server Log File"><IMG SRC="../gifs/txtnexta.gif" ALT="Next: 20.12. Parsing a Web Server Log File" BORDER="0"></A></TD>
</TR><TR><TD ALIGN="LEFT" VALIGN="TOP" WIDTH="228">20.10. 
Mirroring Web Pages</TD>
<TD ALIGN="CENTER" VALIGN="TOP" WIDTH="228"><A CLASS="index" HREF="index/index.htm" TITLE="Book Index"><IMG SRC="../gifs/index.gif" ALT="Book Index" BORDER="0"></A></TD>
<TD ALIGN="RIGHT" VALIGN="TOP" WIDTH="228">20.12. Parsing a Web Server Log File</TD>
</TR></TABLE><HR ALIGN="LEFT" WIDTH="684" TITLE="footer"><FONT SIZE="-1"></DIV>
<!-- LIBRARY NAV BAR -->
<img src="../gifs/smnavbar.gif" usemap="#library-map" border="0" alt="Library Navigation Links">
<p> <a href="copyrght.htm">Copyright © 2002</a> O'Reilly & Associates. All rights reserved.</font> </p>
<map name="library-map"> <area shape="rect" coords="1,0,85,94" href="../index.htm"><area shape="rect" coords="86,1,178,103" href="../lwp/index.htm"><area shape="rect" coords="180,0,265,103" href="../lperl/index.htm"><area shape="rect" coords="267,0,353,105" href="../perlnut/index.htm"><area shape="rect" coords="354,1,446,115" href="../prog/index.htm"><area shape="rect" coords="448,0,526,132" href="../tk/index.htm"><area shape="rect" coords="528,1,615,119" href="../cookbook/index.htm"><area shape="rect" coords="617,0,690,135" href="../pxml/index.htm"></map>
</BODY></HTML>