// lugd.h -- part of "a simple and flexible data-mining experiment platform"
#ifndef _LUGD_H_
#define _LUGD_H_
/************************************************************************
        Linear Unit Gradient Descent (LUGD) Algorithm Description
        (for more information see the ML book, Tom Mitchell, p68)

Lugd(train_set, learning_rate)
      : for each training instance <x, t>, where x is the input vector,
        t is the target output vector, and learning_rate is the learning rate.

STEP 1. Initialize each weight between the input and output layers with a
        small random value.
STEP 2. Until the stopping condition is met, do:
        1) initialize each delta weight between the input and output layers
           to 0
        2) for each training instance <x, t> do:
           <1> compute the output of x, say o
           <2> for each delta weight do:
                 delta weight = delta weight + learning_rate * (t - o) * x
        3) for each weight do:
                 weight = weight + delta weight
************************************************************************/
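/*  Worked example of one pass of STEP 2 (hypothetical numbers, not taken
    from the implementation below): a single unit with two inputs,
        x = (1, 2), t = 1, w = (0.1, 0.3), learning_rate = 0.5
    2) <1>  o = 0.1*1 + 0.3*2                   = 0.7
       <2>  delta weight = 0.5*(1 - 0.7)*(1, 2) = (0.15, 0.30)
    3)      w = (0.1, 0.3) + (0.15, 0.30)       = (0.25, 0.60)
*/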
#include "../Core/Data.h"  // forward slashes are portable across compilers
#include <vector>
using namespace std;

//: Parameter struct for LUGD algorithm
struct LUGD_PARAM
{
	int input;   //: input layer node number, required!
	int output;  //: output layer node number, required!
	double learning_rate; //: learning rate during the training phase, with default value
	double random_range;  //: random range for the initial weights between the input and output layers, with default value
	int max_epoch;        //: maximum number of epochs in the training phase, with default value
	LUGD_PARAM()
	{
		input         = 0;
		output        = 0;
		learning_rate = 0.5;
		random_range  = 0.05;
		max_epoch     = 2000;
	}
};
class LUGD
{
public:
    //: constructs the LUGD algorithm with the specified training set and parameters, see parameter struct above
	LUGD(vector<Data>* train_set, LUGD_PARAM params);
	//: frees the memory allocated during initialization
	~LUGD();
	//: trains the LUGD algorithm with the training set and parameters given at construction
	void train();

	//: calculates the output for the testing data; the result is stored in the data itself (target positions)
	void work(Data& data);
	//: loads a trained LUGD algorithm from the specified file (file format should be compatible with the save function)
	static LUGD* load(string file);
	//: saves a trained LUGD algorithm to the specified file (which is human readable)
	void save(string file);
private:
	//: forward-computes the output for the input data and accumulates the delta weights
	void forward(Data data);
	//: updates the weights between the input and output layers
	void update_weight();
private:
	//: validates the input parameters
	void check();
	//: initializes the internal parameters
	void initialize();
	//: returns a random value in the initial weight range, used to initialize the weights between the input and output layers
	inline double random();
private:
	vector<Data>* _train_set; //: training set
	int _ni;                 //: input layer neuron node count
	int _no;				 //: output layer neuron node count
	double _range;           //: random range for weight initialization
	double _enta;            //: learning rate (eta)
	int _max_epoch;          //: maximum epoch count

	double* _wio;            //: weights between input and output layer
	double* _dwio;           //: delta weights between input and output layer
    double* _o;              //: output array of an instance input
};
#endif
