
📄 output.cc

// Larbin
// Sebastien Ailleret
// 03-02-00 -> 06-06-00

#include <iostream.h>

#include "types.h"
#include "global.h"
#include "xfetcher/file.h"
#include "xinterf/output.h"
#include "xutils/debug.h"

/** A page has been loaded successfully
 * @param page the page that has been fetched
 */
void loaded (html *page) {
  // Here should be the code for managing everything
  // cout << "fetched : ";
  // page->getUrl()->print();
}

/** The fetch failed (but this page is interesting)
 * @param u the URL of the doc
 * @param reason reason of the failure
 */
void fetchFailInteresting (url *u, FetchError reason) {
  // Here should be the code for managing everything
}

/** The fetch failed
 * @param u the URL of the doc
 * @param reason reason of the failure
 */
void fetchFail (url *u, FetchError reason) {
  // Here should be the code for managing everything
}

/** It's over with this file:
 * report the situation (and make some stats)
 */
static void endOfLoad (html *parser, FetchError err) {
  answers(err);
  switch (err) {
  case success:
    if (parser->isInteresting()) {
      interestingPage();
      loaded(parser);
    }
    break;
  default:
    if (parser->isInteresting()) {
      fetchFailInteresting(parser->getUrl(), err);
    } else {
      fetchFail(parser->getUrl(), err);
    }
    break;
  }
}

/** In this thread, the end user manages the result of the crawl */
void *startOutput (void *none) {
  crash("Output on");
  for (;;) {
    Connexion *conn = global::userConns->get();
    endOfLoad((html *)conn->parser, (enum FetchError) conn->pos);
    conn->recycle();
    global::freeConns->put(conn);
  }
  return NULL;
}
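The three callbacks above (loaded, fetchFailInteresting, fetchFail) are deliberately left empty: Larbin expects the end user to fill them with application-specific handling, and the consumer loop in startOutput then drives them for every finished connection. As a minimal sketch, the commented-out hint inside loaded() can be turned into a working body that logs each successfully fetched URL. This assumes only html::getUrl() and url::print(), which are referenced in the original comments; the pagesSeen counter is a hypothetical addition for illustration, not part of Larbin.

/* Illustrative sketch only: a possible body for the empty loaded() callback.
 * It would replace the stub above inside output.cc. */
static int pagesSeen = 0;   // hypothetical local statistic, not in Larbin

void loaded (html *page) {
  pagesSeen++;                 // count successfully fetched pages
  cout << "fetched : ";
  page->getUrl()->print();     // print the URL of the fetched page (as hinted in the original)
}

A real handler would typically also store the page body or extract data here; the sketch keeps to the calls the original file already mentions.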
