
📄 tclass.java

📁 For multivariate time series classification
💻 JAVA
        // So now we have the raw data in the correct form for each
        // attributor. And now, we can construct a learner for each case.
        // Well, for now, I'm going to do something completely crazy.
        // Let's run each classifier nonetheless over the whole data
        // ... and see what the hell happens. Maybe some voting scheme
        // is possible!! This is a strange form of ensemble classifier.
        // Each naive bayes algorithm only gets one
        Debug.setDebugLevel(Debug.PROGRESS);
        int[] selectedIndices = null;

        String[] classifierSpec = Utils.splitOptions(thisExp.learnerStuff);
        if (classifierSpec.length == 0) {
            throw new Exception("Invalid classifier specification string");
        }
        String classifierName = classifierSpec[0];
        classifierSpec[0] = "";
        Classifier learner = Classifier.forName(classifierName, classifierSpec);
        Debug.dp(Debug.PROGRESS, "PROGRESS: Beginning format conversion for class ");
        Instances data = WekaBridge.makeInstances(trainAtts, "Train ");
        Debug.dp(Debug.PROGRESS, "PROGRESS: Conversion complete. Starting learning");
        if (thisExp.featureSel) {
            Debug.dp(Debug.PROGRESS, "PROGRESS: Doing feature selection");
            BestFirst bfs = new BestFirst();
            CfsSubsetEval cfs = new CfsSubsetEval();
            cfs.buildEvaluator(data);
            selectedIndices = bfs.search(cfs, data);
            // Now extract the features.
            System.err.print("Selected features: ");
            String featureString = new String();
            for (int j = 0; j < selectedIndices.length; j++) {
                featureString += (selectedIndices[j] + 1) + ",";
            }
            featureString += "last";
            System.err.println(featureString);
            // Now apply the filter.
            AttributeFilter af = new AttributeFilter();
            af.setInvertSelection(true);
            af.setAttributeIndices(featureString);
            af.inputFormat(data);
            data = af.useFilter(data, af);
        }
        learner.buildClassifier(data);
        Debug.dp(Debug.PROGRESS, "Learnt classifier: \n" + learner.toString());

        WekaClassifier wekaClassifier;
        wekaClassifier = new WekaClassifier(learner);
        if (thisExp.makeDesc) {
            // Section for making the description more readable. Assumes that
            // learner.toString() returns a string with things that look like
            // feature names.
            String concept = learner.toString();
            StringTokenizer st = new StringTokenizer(concept, " \t\r\n", true);
            while (st.hasMoreTokens()) {
                boolean appendColon = false;
                String curTok = st.nextToken();
                GClust clust = (GClust) ((ClusterVec) clusters).elCalled(curTok);
                if (clust != null) {
                    // Skip the spaces.
                    st.nextToken();
                    // Get a < or >.
                    String cmp = st.nextToken();
                    String qual = "";
                    if (cmp.equals("<=")) {
                        qual = " HAS NO ";
                    } else {
                        qual = " HAS ";
                    }
                    // Skip spaces.
                    st.nextToken();
                    // Get the number.
                    String conf = st.nextToken();
                    if (conf.endsWith(":")) {
                        conf = conf.substring(0, conf.length() - 1);
                        appendColon = true;
                    }
                    float minconf = Float.valueOf(conf).floatValue();
                    EventI[] res = clust.getBounds(minconf);
                    String name = clust.getName();
                    int dashPos = name.indexOf('-');
                    int undPos = name.indexOf('_');
                    String chan = name.substring(0, dashPos);
                    String evType = name.substring(dashPos + 1, undPos);
                    EventDescI edi = clust.eventDesc();
                    System.out.print("Channel " + chan + qual + evType + " ");
                    int numParams = edi.numParams();
                    for (int i = 0; i < numParams; i++) {
                        System.out.print(edi.paramName(i) + " in [" + res[0].valOf(i) + "," + res[1].valOf(i) + "] ");
                    }
                    if (appendColon) {
                        System.out.print(":");
                    }
                } else {
                    System.out.print(curTok);
                }
            }
            // This is going to be messy. What do we need? Well,
            // we need to read in the data; look up some info that we
            // assume came from a GainClusterer ...
            // Sanity check.
            // GClust clust = (GClust) ((ClusterVec) clusters).elCalled("alpha-inc_0");
            // System.out.println("INSANE!: " + clust.getDescription());
            // EventI[] res = clust.getBounds(1);
            // System.out.println("For clust settings: min event = " + res[0].toString() + " and max event = " + res[1].toString());
        }
        Debug.dp(Debug.PROGRESS, "PROGRESS: Learning complete.");

        int numCorrect = 0;
        ClassificationVecI classns;
        if (thisExp.trainResults) {
            System.err.println(">>> Training performance <<<");
            classns = (ClassificationVecI) trainAtts.getClassVec().clone();
            for (int j = 0; j < numTrainStreams; j++) {
                wekaClassifier.classify(data.instance(j), classns.elAt(j));
            }
            for (int j = 0; j < numTrainStreams; j++) {
                // System.out.print(classns.elAt(j).toString());
                if (classns.elAt(j).getRealClass() == classns.elAt(j).getPredictedClass()) {
                    numCorrect++;
                    String realClassName = domDesc.getClassDescVec().getClassLabel(classns.elAt(j).getRealClass());
                    System.err.println("Class " + realClassName + " CORRECTLY classified.");
                } else {
                    String realClassName = domDesc.getClassDescVec().getClassLabel(classns.elAt(j).getRealClass());
                    String predictedClassName = domDesc.getClassDescVec().getClassLabel(classns.elAt(j).getPredictedClass());
                    System.err.println("Class " + realClassName + " INCORRECTLY classified as " + predictedClassName + ".");
                }
            }
            System.err.println("Training results for classifier: " + numCorrect + " of " + numTrainStreams + " ("
                               + numCorrect * 100.0 / numTrainStreams + "%)");
        }

        System.err.println(">>> Testing stage <<<");
        // First, print the results of using the straight testers.
        classns = (ClassificationVecI) testAtts.getClassVec().clone();
        StreamAttValVecI savvi = testAtts.getStreamAttValVec();
        data = WekaBridge.makeInstances(testAtts, "Test ");
        if (thisExp.featureSel) {
            String featureString = new String();
            for (int j = 0; j < selectedIndices.length; j++) {
                featureString += (selectedIndices[j] + 1) + ",";
            }
            featureString += "last";
            // Now apply the filter.
            AttributeFilter af = new AttributeFilter();
            af.setInvertSelection(true);
            af.setAttributeIndices(featureString);
            af.inputFormat(data);
            data = af.useFilter(data, af);
        }
        for (int j = 0; j < numTestStreams; j++) {
            wekaClassifier.classify(data.instance(j), classns.elAt(j));
        }
        System.err.println(">>> Learner <<<");
        numCorrect = 0;
        for (int j = 0; j < numTestStreams; j++) {
            // System.out.print(classns.elAt(j).toString());
            if (classns.elAt(j).getRealClass() == classns.elAt(j).getPredictedClass()) {
                numCorrect++;
                String realClassName = domDesc.getClassDescVec().getClassLabel(classns.elAt(j).getRealClass());
                System.err.println("Class " + realClassName + " CORRECTLY classified.");
            } else {
                String realClassName = domDesc.getClassDescVec().getClassLabel(classns.elAt(j).getRealClass());
                String predictedClassName = domDesc.getClassDescVec().getClassLabel(classns.elAt(j).getPredictedClass());
                System.err.println("Class " + realClassName + " INCORRECTLY classified as " + predictedClassName + ".");
            }
        }
        System.err.println("Test accuracy for classifier: " + numCorrect + " of " + numTestStreams + " ("
                           + numCorrect * 100.0 / numTestStreams + "%)");
    }
}
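A detail worth noting in the listing above: the feature-selection search returns 0-based attribute indices, but the attribute-index string handed to the filter is 1-based and ends with "last" so that the class attribute survives the inverted selection. A minimal self-contained sketch of that conversion (the index values are made up for illustration; this is not TClass code):

```java
public class FeatureIndexSketch {
    // Build a 1-based attribute-index string such as "2,5,9,last" from
    // 0-based selected indices; "last" keeps the final (class) attribute.
    static String toAttributeIndices(int[] selectedIndices) {
        StringBuilder sb = new StringBuilder();
        for (int j = 0; j < selectedIndices.length; j++) {
            sb.append(selectedIndices[j] + 1).append(",");
        }
        sb.append("last");
        return sb.toString();
    }

    public static void main(String[] args) {
        int[] selected = {1, 4, 8}; // hypothetical output of a feature search
        System.out.println(toAttributeIndices(selected)); // prints "2,5,9,last"
    }
}
```

Because the filter is built with `setInvertSelection(true)`, every attribute *not* named in this string is removed, which is why forgetting "last" would silently drop the class attribute.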

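The readable-description pass in the listing assumes cluster names follow a `channel-eventType_number` pattern (e.g. "alpha-inc_0") and splits them with `indexOf('-')` and `indexOf('_')`. A minimal sketch of just that name split, under the same naming assumption (the class and name are illustrative, not part of TClass):

```java
public class ClusterNameSketch {
    // Split a cluster name like "alpha-inc_0" into channel ("alpha") and
    // event type ("inc"), mirroring the indexOf('-') / indexOf('_') logic.
    static String[] splitName(String name) {
        int dashPos = name.indexOf('-');
        int undPos = name.indexOf('_');
        String chan = name.substring(0, dashPos);
        String evType = name.substring(dashPos + 1, undPos);
        return new String[]{chan, evType};
    }

    public static void main(String[] args) {
        String[] parts = splitName("alpha-inc_0");
        System.out.println("Channel " + parts[0] + " HAS " + parts[1]);
        // prints "Channel alpha HAS inc"
    }
}
```

Note that a name without both a '-' and a '_' would make `indexOf` return -1 and `substring` throw, which is why the original code only runs this on tokens that resolved to a known cluster.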