
📄 domainsensitivefrontier.html

📁 Written in Java; left over from when I was running experiments. I originally meant to delete it, but I'm uploading it here instead so everyone can share it.
💻 Java source (DomainSensitiveFrontier.java from the Heritrix crawler, extracted from an HTML cross-reference page)
📖 Page 1 of 2
/* DomainSensitiveFrontier
 *
 * $Id: DomainSensitiveFrontier.java,v 1.13 2006/09/07 19:21:50 stack-sf Exp $
 *
 * Created on 2004-may-06
 *
 * Copyright (C) 2004 Royal Library of Sweden.
 *
 * This file is part of the Heritrix web crawler (crawler.archive.org).
 *
 * Heritrix is free software; you can redistribute it and/or modify
 * it under the terms of the GNU Lesser Public License as published by
 * the Free Software Foundation; either version 2.1 of the License, or
 * any later version.
 *
 * Heritrix is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
 * GNU Lesser Public License for more details.
 *
 * You should have received a copy of the GNU Lesser Public License
 * along with Heritrix; if not, write to the Free Software
 * Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA  02111-1307  USA
 */
package org.archive.crawler.frontier;

import java.io.IOException;
import java.util.Hashtable;
import java.util.logging.Logger;

import javax.management.AttributeNotFoundException;
import javax.management.MBeanException;
import javax.management.ReflectionException;

import org.archive.crawler.datamodel.CrawlURI;
import org.archive.crawler.event.CrawlURIDispositionListener;
import org.archive.crawler.filter.OrFilter;
import org.archive.crawler.filter.URIRegExpFilter;
import org.archive.crawler.framework.CrawlController;
import org.archive.crawler.framework.exceptions.FatalConfigurationException;
import org.archive.crawler.prefetch.QuotaEnforcer;
import org.archive.crawler.scope.ClassicScope;
import org.archive.crawler.settings.CrawlerSettings;
import org.archive.crawler.settings.SimpleType;
import org.archive.crawler.settings.Type;

/**
 * Behaves like {@link BdbFrontier} (i.e., a basic mostly breadth-first
 * frontier), but with the addition that you can set the number of documents
 * to download on a per site basis.
 *
 * Useful for case of frequent revisits of a site of frequent changes.
 *
 * <p>Choose the number of docs you want to download and specify
 * the count in <code>max-docs</code>.  If <code>count-per-host</code> is
 * true, the default, then the crawler will download <code>max-docs</code>
 * per host.  If you create an override,  the overridden <code>max-docs</code>
 * count will be downloaded instead, whether it is higher or lower.
 * <p>If <code>count-per-host</code> is false, then <code>max-docs</code>
 * acts like the the crawl order <code>max-docs</code> and the crawler will
 * download this total amount of docs only.  Overrides will
 * download <code>max-docs</code> total in the overridden domain.
 *
 * @author Oskar Grenholm <oskar dot grenholm at kb dot se>
 * @deprecated As of release 1.10.0.  Replaced by {@link BdbFrontier} and
 * {@link QuotaEnforcer}.
 */
public class DomainSensitiveFrontier extends BdbFrontier
implements CrawlURIDispositionListener {
    private static final Logger logger =
        Logger.getLogger(DomainSensitiveFrontier.class.getName());

    public static final String ATTR_MAX_DOCS = "max-docs";
    public static final String ATTR_COUNTER_MODE = "counter-mode";
    public static final String COUNT_OVERRIDE = "count-per-override";
    public static final String COUNT_HOST = "count-per-host";
    public static final String COUNT_DOMAIN = "count-per-domain";
    public static final String[] ATTR_AVAILABLE_MODES = new String[] {
        COUNT_OVERRIDE, COUNT_HOST, COUNT_DOMAIN };
    public static final String DEFAULT_MODE = COUNT_OVERRIDE;

    // TODO: Make this a BigMap.
    private Hashtable hostCounters = new Hashtable();
    private boolean countPerOverride = true;
    private String counterMode;

    public DomainSensitiveFrontier(String name) {
        super(ATTR_NAME, "DomainSensitiveFrontier. *Deprecated* Use " +
            "BdbFrontier+QuotaEnforcer instead. " +
            "Overrides BdbFrontier to add specification of number of " +
            "documents to download (Expects 'exclude-filter' " +
            "to be part of CrawlScope).");
        Type e = addElementToDefinition(new SimpleType(ATTR_MAX_DOCS,
            "Maximum number of documents to download for host or domain" +
            " (Zero means no limit).", new Long(0)));
        e.setOverrideable(true);
        e = addElementToDefinition(new SimpleType(ATTR_COUNTER_MODE,
            "If " + COUNT_OVERRIDE + ", acts like the crawl " +
            "order maximum download count and the crawler will download " +
            "this total amount of docs only. Override to change the max " +
            "count for the overridden domain or host. " +
            "Else if " + COUNT_HOST + " the crawler will download " +
            ATTR_MAX_DOCS + " per host. Add an override to change " +
            "max count on a per-domain or a per-host basis.For " +
            "example, if you set " + ATTR_MAX_DOCS + " to 30 in " +
            "this mode, the crawler will download 30 docs from " +
            "each host in scope. If you  override for kb.se setting " +
            ATTR_MAX_DOCS +
            " to 20, it will instead download only 20 docs from each " +
            "host of kb.se. (It can be a larger as well as a smaller " +
            "value here.). " +
            "Finally " + COUNT_DOMAIN + " behaves similar to " +
            COUNT_HOST +
            ", but instead sets max on a per-domain basis." +
            "Here you can do overrides on the domain-level, but " +
            "not on the host-level. So if you here set " +
            ATTR_MAX_DOCS +
            " to 30 the crawler will download 30 docs from each " +
            "domain in scope. If you  override for kb.se setting " +
            ATTR_MAX_DOCS + " to 20, it will instead download only " +
            "20 docs in total from the whole kb.se domain. (It can be " +
            "a larger as well as a smaller value here.)",
            DEFAULT_MODE, ATTR_AVAILABLE_MODES));
        e.setOverrideable(false);
    }

    public void initialize(CrawlController c)
    throws FatalConfigurationException, IOException {
        super.initialize(c);
        this.controller.addCrawlURIDispositionListener(this);
        try {
            counterMode = ((String)getAttribute(ATTR_COUNTER_MODE));
            if (counterMode.equalsIgnoreCase(COUNT_DOMAIN) ||

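The page-1 listing stops mid-statement inside initialize(); the rest of that method and the per-host accounting logic are on page 2. To make the idea behind the hostCounters table and the counter-mode attribute concrete, here is a minimal, self-contained sketch of that bookkeeping using only the JDK. The class PerSiteQuotaSketch, its method names, and the naive last-two-labels domain heuristic are illustrative assumptions, not Heritrix API; the real frontier reads max-docs and counter-mode through the crawler settings framework.

import java.net.URI;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical sketch (not Heritrix code): count successfully fetched
// documents per host or per naive domain, and report when a URI's key
// has reached the max-docs quota. Zero means "no limit", matching the
// max-docs attribute description in the listing above.
public class PerSiteQuotaSketch {

    public enum CounterMode { PER_HOST, PER_DOMAIN }

    private final Map<String, Long> counters = new ConcurrentHashMap<>();
    private final long maxDocs;
    private final CounterMode mode;

    public PerSiteQuotaSketch(long maxDocs, CounterMode mode) {
        this.maxDocs = maxDocs;
        this.mode = mode;
    }

    // Record one successfully downloaded document for the URI's host/domain.
    public void recordSuccess(String uri) {
        counters.merge(keyFor(uri), 1L, Long::sum);
    }

    // True once the quota for this URI's host/domain has been used up.
    public boolean overQuota(String uri) {
        if (maxDocs <= 0) {
            return false; // zero means unlimited
        }
        return counters.getOrDefault(keyFor(uri), 0L) >= maxDocs;
    }

    private String keyFor(String uri) {
        String host = URI.create(uri).getHost();
        if (host == null) {
            return "";
        }
        if (mode == CounterMode.PER_HOST) {
            return host;
        }
        // Naive "domain" = last two labels (www.kb.se -> kb.se). A real
        // crawler would consult a public-suffix list instead.
        String[] labels = host.split("\\.");
        int n = labels.length;
        return n <= 2 ? host : labels[n - 2] + "." + labels[n - 1];
    }

    public static void main(String[] args) {
        PerSiteQuotaSketch quota = new PerSiteQuotaSketch(2, CounterMode.PER_HOST);
        String[] fetched = {
            "http://www.kb.se/a", "http://www.kb.se/b", "http://www.kb.se/c"
        };
        for (String uri : fetched) {
            if (quota.overQuota(uri)) {
                System.out.println("skip (quota reached): " + uri);
            } else {
                System.out.println("fetch: " + uri);
                quota.recordSuccess(uri);
            }
        }
    }
}

With max-docs set to 2 in per-host mode, the third www.kb.se URI is skipped. Judging by the imports of OrFilter and URIRegExpFilter and the constructor's note that an 'exclude-filter' must be part of the CrawlScope, the real frontier appears to achieve the same effect by excluding a host or domain from the crawl scope once its quota is reached.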