
📄 start_vldb_after_2001.pl

📁 Written using lwp::get
💻 PL
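
The note above says the page fetching is done with LWP's get. As a point of reference before the script itself, here is a minimal, self-contained sketch of that kind of fetch, assuming LWP::UserAgent; MyCrawler and MyGrabber are not shown on this page, so the helper name below is made up and only illustrates what a call like MyCrawler::toc presumably does internally.

use strict;
use warnings;
use LWP::UserAgent;

# fetch_page is a hypothetical stand-in for whatever MyCrawler does internally.
sub fetch_page {
    my ($url) = @_;
    my $ua = LWP::UserAgent->new(timeout => 30);
    my $resp = $ua->get($url);
    die "Failed to fetch $url: " . $resp->status_line . "\n" unless $resp->is_success;
    return $resp->decoded_content;
}

my $html = fetch_page('http://www.vldb.org/dblp/db/conf/vldb/vldb2001.html');
print length($html), " bytes fetched\n";

The actual crawler script follows.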
#!/usr/bin/perl -w
use strict;

#########################
# Start VLDB 2001-
# Yeni, 2006/11
# yeni@yueds.com
#########################
# Define the crawler's rules for fetching VLDB papers from the years after 2000 (2001 onward).

use MyCrawler;
use MyGrabber;   # the grabber's rules, callbacks and link list are referenced below

# Go to main entry
&main;
exit;

############### PERSONALIZE PART BEGIN ###############

# Main entry
sub main {
    # install the sigmod_* search rules (DBLP uses the same page layout for VLDB)
    sigmod_rules();
    
    $MyGrabber::rulefunc = \&sigmod_rulefunc;          # post-process each matched field
    $MyCrawler::links_filter = \&sigmod_links_filter;  # choose which discovered links to follow
    
    $MyCrawler::pagebase = '';
    
    # begin collecting from TOCs
    MyCrawler::toc('VLDB', '2001', 'http://www.vldb.org/dblp/db/conf/vldb/vldb2001.html');
    MyCrawler::toc('VLDB', '2002', 'http://www.vldb.org/dblp/db/conf/vldb/vldb2002.html');
    MyCrawler::toc('VLDB', '2003', 'http://www.vldb.org/dblp/db/conf/vldb/vldb2003.html');
    MyCrawler::toc('VLDB', '2004', 'http://www.vldb.org/dblp/db/conf/vldb/vldb2004.html');
    MyCrawler::toc('VLDB', '2005', 'http://www.vldb.org/dblp/db/conf/vldb/vldb2005.html');
    MyCrawler::toc('VLDB', '2006', 'http://www.vldb.org/dblp/db/conf/vldb/vldb2006.html');
}

sub sigmod_rules {
    # add page search rules for content pages
    MyGrabber::newSearchRule();
    MyGrabber::addSearchRule('Author',  # rule name
                             '  author    = {',
                             '},',
                             1,         # appear once
                             1,         # continued with last search
                            );
    MyGrabber::addSearchRule('Title',   # rule name
                             '  title     = {',
                             '},',
                             1,         # appear once
                             1,         # continued with last search
                            );
    MyGrabber::addSearchRule('Year',  # rule name
                             '  year      = {',
                             '},',
                             1,         # appear once
                             1,         # continued with last search
                            );
    # Note: DBLP's 'ee' field holds the electronic-edition URL, not abstract text.
    MyGrabber::addSearchRule('Abstract',  # rule name
                             '  ee        = {',
                             '},',
                             1,         # appear once
                             1,         # continued with last search
                            );
}

# Post-process each field matched by the search rules above before it is stored.
sub sigmod_rulefunc {
    my ($resstr, $rulename) = @_;
    if ($rulename eq 'Author') {
        # DBLP separates authors with " and" at the end of a line.
        my @authors = split(/ and\n/, $resstr);
        foreach my $authorname (@authors) {
            $authorname =~ s/^\s+|\s+$//g;   # strip the continuation-line indentation
            my $institute = '';              # affiliation is not present in DBLP BibTeX
            my @authortuple = ($authorname, $institute);
            push(@MyCrawler::authors, \@authortuple);
        }
    } elsif ($rulename eq 'Title') {
        $resstr =~ s/\s*\n\s+/ /g;           # rejoin titles wrapped onto an indented continuation line
        $resstr =~ s/\(abstract\)//isg;      # drop the "(abstract)" marker some entries carry
        $MyCrawler::props{$rulename} = $resstr;
    } else {
        $MyCrawler::props{$rulename} = $resstr;
    }
    $MyCrawler::props{'Conference'} = 'International Conference on Very Large Data Bases';
}

# Keep only per-paper BibTeX record links from DBLP; links whose URL still
# contains "200" (such as the proceedings volume record .../conf/vldb/2001) are skipped.
sub sigmod_links_filter {
    foreach my $url (@MyGrabber::links) {
        if ($url =~ m{http://dblp\.uni-trier\.de/rec/bibtex/conf/vldb/} && $url !~ /200/) {
            push(@MyCrawler::availlinks, $url);
        }
    }
}
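
For reference, the delimiters in sigmod_rules target DBLP's BibTeX export layout, where every field sits between a fixed-width '  field     = {' prefix and a '},' terminator. The short standalone sketch below uses a made-up entry, and the extraction is a plain regex rather than MyGrabber's own matching; it only shows what each rule ends up capturing.

use strict;
use warnings;

# A made-up entry in the layout DBLP's BibTeX export uses.
my $bibtex = <<'END';
@inproceedings{DBLP:conf/vldb/Example01,
  author    = {Alice Example and
               Bob Sample},
  title     = {An Example Paper Title},
  year      = {2001},
  ee        = {http://www.vldb.org/conf/2001/P001.pdf},
}
END

# Emulate addSearchRule(name, start-marker, end-marker): everything between
# the start marker and the next '},' is taken as the field value.
for my $field ('author', 'title', 'year', 'ee') {
    my $start = sprintf('  %-10s= {', $field);    # e.g. '  title     = {'
    if ($bibtex =~ /\Q$start\E(.*?)\},/s) {
        print "$field: $1\n";
    }
}

Note that the 'ee' value captured by the 'Abstract' rule is DBLP's electronic-edition URL rather than abstract text, and that the captured author value still contains the " and" separator plus continuation-line indentation, which is what sigmod_rulefunc splits on and trims away.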
