
📄 crawlserver.html

📁 Written in Java; left over from an experiment. I originally meant to delete it, but I'm uploading it here so everyone can share it.
💻 HTML
📖 Page 1 of 3
<a name="121" href="#121">121</a> <em>     * @throws IOException</em><a name="122" href="#122">122</a> <em>     */</em><a name="123" href="#123">123</a>     <strong>public</strong> <strong>void</strong> updateRobots(<a href="../../../../org/archive/crawler/datamodel/CrawlURI.html">CrawlURI</a> curi) {<a name="124" href="#124">124</a>         <a href="../../../../org/archive/crawler/datamodel/RobotsHonoringPolicy.html">RobotsHonoringPolicy</a> honoringPolicy =<a name="125" href="#125">125</a>             settingsHandler.getOrder().getRobotsHonoringPolicy();<a name="126" href="#126">126</a> <a name="127" href="#127">127</a>         robotsFetched = System.currentTimeMillis();<a name="128" href="#128">128</a> <a name="129" href="#129">129</a>         <strong>boolean</strong> gotSomething = curi.getFetchStatus() > 0<a name="130" href="#130">130</a>                 &amp;&amp; curi.isHttpTransaction();<a name="131" href="#131">131</a>         <strong>if</strong> (!gotSomething &amp;&amp; curi.getFetchAttempts() &lt; MIN_ROBOTS_RETRIES) {<a name="132" href="#132">132</a>             <em class="comment">// robots.txt lookup failed, no reason to consider IGNORE yet</em><a name="133" href="#133">133</a>             validRobots = false;<a name="134" href="#134">134</a>             <strong>return</strong>;<a name="135" href="#135">135</a>         }<a name="136" href="#136">136</a>         <a name="137" href="#137">137</a>         <a href="../../../../org/archive/crawler/settings/CrawlerSettings.html">CrawlerSettings</a> settings = getSettings(curi);<a name="138" href="#138">138</a>         <strong>int</strong> type = honoringPolicy.getType(settings);<a name="139" href="#139">139</a>         <strong>if</strong> (type == RobotsHonoringPolicy.IGNORE) {<a name="140" href="#140">140</a>             <em class="comment">// IGNORE = ALLOWALL</em><a name="141" href="#141">141</a>             robots = RobotsExclusionPolicy.ALLOWALL;<a name="142" href="#142">142</a>             validRobots = <strong>true</strong>;<a name="143" href="#143">143</a>             <strong>return</strong>;<a name="144" href="#144">144</a>         }<a name="145" href="#145">145</a>         <a name="146" href="#146">146</a>         <strong>if</strong>(!gotSomething) {<a name="147" href="#147">147</a>             <em class="comment">// robots.txt lookup failed and policy not IGNORE</em><a name="148" href="#148">148</a>             validRobots = false;<a name="149" href="#149">149</a>             <strong>return</strong>;<a name="150" href="#150">150</a>         }<a name="151" href="#151">151</a>         <a name="152" href="#152">152</a>         <strong>if</strong> (!curi.is2XXSuccess()) {<a name="153" href="#153">153</a>             <em class="comment">// Not found or anything but a status code in the 2xx range is</em><a name="154" href="#154">154</a>             <em class="comment">// treated as giving access to all of a sites' content.</em><a name="155" href="#155">155</a>             <em class="comment">// This is the prevailing practice of Google, since 4xx</em><a name="156" href="#156">156</a>             <em class="comment">// responses on robots.txt are usually indicative of a </em><a name="157" href="#157">157</a>             <em class="comment">// misconfiguration or blanket-block, not an intentional</em><a name="158" href="#158">158</a>             <em class="comment">// indicator of partial blocking. 
</em><a name="159" href="#159">159</a>             <em class="comment">// TODO: consider handling server errors, redirects differently</em><a name="160" href="#160">160</a>             robots = RobotsExclusionPolicy.ALLOWALL;<a name="161" href="#161">161</a>             validRobots = <strong>true</strong>;<a name="162" href="#162">162</a>             <strong>return</strong>;<a name="163" href="#163">163</a>         }<a name="164" href="#164">164</a> <a name="165" href="#165">165</a>         <a href="../../../../org/archive/io/ReplayInputStream.html">ReplayInputStream</a> contentBodyStream = <strong>null</strong>;<a name="166" href="#166">166</a>         <strong>try</strong> {<a name="167" href="#167">167</a>             <strong>try</strong> {<a name="168" href="#168">168</a>                 BufferedReader reader;<a name="169" href="#169">169</a>                 <strong>if</strong> (type == RobotsHonoringPolicy.CUSTOM) {<a name="170" href="#170">170</a>                     reader = <strong>new</strong> BufferedReader(<strong>new</strong> StringReader(honoringPolicy<a name="171" href="#171">171</a>                             .getCustomRobots(settings)));<a name="172" href="#172">172</a>                 } <strong>else</strong> {<a name="173" href="#173">173</a>                     contentBodyStream = curi.getHttpRecorder()<a name="174" href="#174">174</a>                             .getRecordedInput().getContentReplayInputStream();<a name="175" href="#175">175</a> <a name="176" href="#176">176</a>                     contentBodyStream.setToResponseBodyStart();<a name="177" href="#177">177</a>                     reader = <strong>new</strong> BufferedReader(<strong>new</strong> InputStreamReader(<a name="178" href="#178">178</a>                             contentBodyStream));<a name="179" href="#179">179</a>                 }<a name="180" href="#180">180</a>                 robots = RobotsExclusionPolicy.policyFor(settings,<a name="181" href="#181">181</a>                         reader, honoringPolicy);<a name="182" href="#182">182</a>                 validRobots = <strong>true</strong>;<a name="183" href="#183">183</a>             } <strong>finally</strong> {<a name="184" href="#184">184</a>                 <strong>if</strong> (contentBodyStream != <strong>null</strong>) {<a name="185" href="#185">185</a>                     contentBodyStream.close();<a name="186" href="#186">186</a>                 }<a name="187" href="#187">187</a>             }<a name="188" href="#188">188</a>         } <strong>catch</strong> (IOException e) {<a name="189" href="#189">189</a>             robots = RobotsExclusionPolicy.ALLOWALL;<a name="190" href="#190">190</a>             validRobots = <strong>true</strong>;<a name="191" href="#191">191</a>             curi.addLocalizedError(getName(), e,<a name="192" href="#192">192</a>                     <span class="string">"robots.txt parsing IOException"</span>);<a name="193" href="#193">193</a>         }<a name="194" href="#194">194</a>     }<a name="195" href="#195">195</a> <a name="196" href="#196">196</a>     <em>/**<em>*</em></em><a name="197" href="#197">197</a> <em>     * @return Returns the time when robots.txt was fetched.</em><a name="198" href="#198">198</a> <em>     */</em><a name="199" href="#199">199</a>     <strong>public</strong> <strong>long</strong> getRobotsFetchedTime() {<a name="200" href="#200">200</a>         <strong>return</strong> robotsFetched;<a name="201" href="#201">201</a>     }<a name="202" href="#202">202</a> <a name="203" 
href="#203">203</a>     <em>/**<em>*</em></em><a name="204" href="#204">204</a> <em>     * @return The server string which might include a port number.</em><a name="205" href="#205">205</a> <em>     */</em><a name="206" href="#206">206</a>     <strong>public</strong> String getName() {<a name="207" href="#207">207</a>        <strong>return</strong> server;<a name="208" href="#208">208</a>     }<a name="209" href="#209">209</a> <a name="210" href="#210">210</a>     <em>/**<em>* Get the port number for this server.</em></em><a name="211" href="#211">211</a> <em>     *</em><a name="212" href="#212">212</a> <em>     * @return the port number or -1 if not known (uses default for protocol)</em><a name="213" href="#213">213</a> <em>     */</em><a name="214" href="#214">214</a>     <strong>public</strong> <strong>int</strong> getPort() {<a name="215" href="#215">215</a>         <strong>return</strong> port;<a name="216" href="#216">216</a>     }<a name="217" href="#217">217</a> <a name="218" href="#218">218</a>     <em>/**<em>* </em></em><a name="219" href="#219">219</a> <em>     * Called when object is being deserialized.</em><a name="220" href="#220">220</a> <em>     * In addition to the default java deserialization, this method</em><a name="221" href="#221">221</a> <em>     * re-establishes the references to settings handler and robots honoring</em><a name="222" href="#222">222</a> <em>     * policy.</em><a name="223" href="#223">223</a> <em>     *</em><a name="224" href="#224">224</a> <em>     * @param stream the stream to deserialize from.</em><a name="225" href="#225">225</a> <em>     * @throws IOException if I/O errors occur</em><a name="226" href="#226">226</a> <em>     * @throws ClassNotFoundException If the class for an object being restored</em><a name="227" href="#227">227</a> <em>     *         cannot be found.</em><a name="228" href="#228">228</a> <em>     */</em><a name="229" href="#229">229</a>     <strong>private</strong> <strong>void</strong> readObject(ObjectInputStream stream)<a name="230" href="#230">230</a>             throws IOException, ClassNotFoundException {<a name="231" href="#231">231</a>         stream.defaultReadObject();<a name="232" href="#232">232</a>         Thread t = Thread.currentThread();<a name="233" href="#233">233</a>         <strong>if</strong> (t instanceof Checkpointer.CheckpointingThread) {<a name="234" href="#234">234</a>             settingsHandler = ((Checkpointer.CheckpointingThread)t)<a name="235" href="#235">235</a>         		.getController().getSettingsHandler();<a name="236" href="#236">236</a>         } <strong>else</strong> <strong>if</strong> (t instanceof ToeThread) {<a name="237" href="#237">237</a>             settingsHandler = ((ToeThread) Thread.currentThread())<a name="238" href="#238">238</a>                 .getController().getSettingsHandler();<a name="239" href="#239">239</a>         } <strong>else</strong> {<a name="240" href="#240">240</a>             <em class="comment">// TODO: log differently? 
(if no throw here</em><a name="241" href="#241">241</a>             <em class="comment">// NPE is inevitable)</em><a name="242" href="#242">242</a>             <strong>throw</strong> <strong>new</strong> RuntimeException(<span class="string">"CrawlServer must deserialize "</span> +<a name="243" href="#243">243</a>                     <span class="string">"in a ToeThread or CheckpointingThread"</span>);<a name="244" href="#244">244</a>         }<a name="245" href="#245">245</a>         postDeserialize();<a name="246" href="#246">246</a>     }<a name="247" href="#247">247</a>     <a name="248" href="#248">248</a>     <strong>private</strong> <strong>void</strong> postDeserialize() {<a name="249" href="#249">249</a>     	<strong>if</strong> (<strong>this</strong>.robots != <strong>null</strong>) {
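The decision flow in updateRobots above boils down to: a failed robots.txt fetch is retried before any verdict is made, an IGNORE honoring policy or any non-2xx response is treated as allow-all, and only a 2xx response body is actually parsed into an exclusion policy. The standalone sketch below restates that flow outside of Heritrix; the class name, enum, and retry threshold value here are hypothetical illustrations, not part of the Heritrix API.

// A minimal, self-contained sketch of the robots.txt decision flow shown above.
// All names here (SimpleRobotsDecision, RobotsVerdict) are hypothetical.
public class SimpleRobotsDecision {

    /** Outcome of evaluating a robots.txt fetch attempt. */
    public enum RobotsVerdict { ALLOW_ALL, PARSE_BODY, RETRY_LATER, INVALID }

    // Assumed retry threshold, standing in for the MIN_ROBOTS_RETRIES constant above.
    private static final int MIN_ROBOTS_RETRIES = 3;

    /**
     * @param ignorePolicy   true if the crawl is configured to ignore robots.txt
     * @param fetchSucceeded true if an HTTP transaction completed with a status > 0
     * @param statusCode     the HTTP status code of the robots.txt response
     * @param fetchAttempts  how many times the fetch has been attempted so far
     */
    public static RobotsVerdict evaluate(boolean ignorePolicy, boolean fetchSucceeded,
            int statusCode, int fetchAttempts) {
        if (!fetchSucceeded && fetchAttempts < MIN_ROBOTS_RETRIES) {
            // Too early to decide anything; keep the cached policy invalid and retry.
            return RobotsVerdict.RETRY_LATER;
        }
        if (ignorePolicy) {
            // An IGNORE honoring policy is equivalent to allowing everything.
            return RobotsVerdict.ALLOW_ALL;
        }
        if (!fetchSucceeded) {
            // Retries exhausted and the policy is not IGNORE: stay invalid.
            return RobotsVerdict.INVALID;
        }
        if (statusCode < 200 || statusCode >= 300) {
            // Any non-2xx robots.txt response is treated as granting full access.
            return RobotsVerdict.ALLOW_ALL;
        }
        // Only a 2xx response body is actually parsed into an exclusion policy.
        return RobotsVerdict.PARSE_BODY;
    }

    public static void main(String[] args) {
        // A 404 on robots.txt: the crawler falls back to allow-all.
        System.out.println(evaluate(false, true, 404, 1));  // ALLOW_ALL
        // A successful 200 response: the body is parsed.
        System.out.println(evaluate(false, true, 200, 1));  // PARSE_BODY
        // A failed fetch on the first attempt: retry before deciding.
        System.out.println(evaluate(false, false, -1, 1));  // RETRY_LATER
    }
}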
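The readObject method above also shows a common Java serialization pattern: call defaultReadObject() to restore the serializable state, then rebuild any transient references from the surrounding context (here, the settings handler is re-fetched from the current ToeThread or CheckpointingThread). A minimal sketch of that pattern, with hypothetical names unrelated to Heritrix, might look like this:

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class RestoreOnReadExample implements Serializable {
    private static final long serialVersionUID = 1L;

    private final String name;               // serialized normally
    private transient StringBuilder scratch; // not serialized; must be rebuilt

    public RestoreOnReadExample(String name) {
        this.name = name;
        this.scratch = new StringBuilder();
    }

    private void readObject(ObjectInputStream in)
            throws IOException, ClassNotFoundException {
        in.defaultReadObject();             // restore the serializable fields
        this.scratch = new StringBuilder(); // re-establish the transient reference,
                                            // analogous to re-fetching settingsHandler
    }

    public static void main(String[] args) throws Exception {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
            out.writeObject(new RestoreOnReadExample("example.org:80"));
        }
        try (ObjectInputStream in = new ObjectInputStream(
                new ByteArrayInputStream(bytes.toByteArray()))) {
            RestoreOnReadExample restored = (RestoreOnReadExample) in.readObject();
            // The transient field is usable again after deserialization.
            System.out.println(restored.name + " scratch=" + restored.scratch);
        }
    }
}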
