
📄 types.h

📁 Larbin internet spider indexing system
// Larbin
// Sebastien Ailleret
// 12-01-00 -> 13-07-00

#ifndef TYPES_H
#define TYPES_H

#include <stdlib.h>

// Size of the hash (max number of urls that can be fetched)
#define hashSize 32000000

// Size of the array of Sites in main memory
#define siteListSize 100000

// Max number of urls in ram
#define ramUrls 100000

// Max number of urls per site in Url
#define maxUrlsBySite 60

// time out when reading a page (in sec)
#define timeoutPage 45

// How long do we keep dns answers and robots.txt
#define dnsValidTime 2*24*3600

// Maximum size of a page
#define maxPageSize 100000

// Maximum size of a robots.txt that is read
#define maxRobotsSize 10000

// How many forbidden items do we accept in a robots.txt
#define maxRobotsItem 100

// file name used for storing urls on disk
#define fifoFile "fifo"

// number of urls per file on disk
// should be equal to ramUrls
#define urlByFile ramUrls

// Size of the buffer used to read sockets
#define BUF_SIZE 16384

// Max size for a url
#define MAX_URL_SIZE 512

// Standard size of a fifo in a Site
#define fifoSiteSize 16

#define StdVectSize 8

// Various reasons of error when getting a page
#define nbAnswers 9
enum FetchError {
  success,
  noDNS,
  noConnection,
  forbiddenRobots,
  timeout,
  badType,
  tooBig,
  err40X,
  earlyStop
};

#endif // TYPES_H
