Fajri Koto

Reputation: 163

Using Neural Network Class in WEKA in Java code

Hi, I want to do simple training and testing using a neural network in the WEKA library.

But I find it is not trivial, and it's different from the NaiveBayes class in the library.

Does anyone have an example of how to use this class in Java code?

Upvotes: 5

Views: 11136

Answers (2)

Fajri Koto

Reputation: 163

I read some sources on the internet and just realized that if you want to use a neural network classifier from the WEKA library, the approach is NOT to use the given NeuralNetwork class, but the "MultilayerPerceptron" class.

It's a little bit tricky and cost me hours.

I hope this is useful for anyone who is struggling with the same thing.

http://weka.8497.n7.nabble.com/Multi-layer-perception-td2896.html

PS: Please correct me if I am wrong!
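For anyone landing here, a minimal sketch of what I mean (assuming Weka's standard weka.classifiers.functions package and a placeholder train.arff file; adjust to your own data):

    import java.io.FileReader;

    import weka.classifiers.functions.MultilayerPerceptron;
    import weka.core.Instances;

    public class MlpExample {
        public static void main(String[] args) throws Exception {
            // "train.arff" is just a placeholder path
            Instances data = new Instances(new FileReader("train.arff"));
            data.setClassIndex(data.numAttributes() - 1);

            // WEKA's "neural network" classifier is MultilayerPerceptron
            MultilayerPerceptron mlp = new MultilayerPerceptron();
            mlp.buildClassifier(data);
        }
    }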

Upvotes: 1

Hitanshu Tiwari

Reputation: 310

The following steps might help you:

  1. Add Weka libraries

Download Weka from http://www.cs.waikato.ac.nz/ml/weka/downloading.html.

From the package, find 'Weka.jar' and add it to the project.
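Once the jar is on the classpath, the snippets below assume imports along these lines (package names as in Weka 3.x; verify against your version):

    import java.io.BufferedReader;
    import java.io.BufferedWriter;
    import java.io.FileReader;
    import java.io.FileWriter;
    import java.util.Random;

    import weka.classifiers.Evaluation;
    import weka.classifiers.functions.MultilayerPerceptron;
    import weka.core.Instances;
    import weka.core.Utils;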

Java Code Snippet

  2. Building a Neural Classifier

    public void simpleWekaTrain(String filepath)
    {
        try {
            // Reading training arff or csv file
            FileReader trainreader = new FileReader(filepath);
            Instances train = new Instances(trainreader);
            train.setClassIndex(train.numAttributes() - 1);

            // Instance of NN
            MultilayerPerceptron mlp = new MultilayerPerceptron();

            // Setting Parameters
            mlp.setLearningRate(0.1);
            mlp.setMomentum(0.2);
            mlp.setTrainingTime(2000);
            mlp.setHiddenLayers("3");

            mlp.buildClassifier(train);
        } catch (Exception ex) {
            ex.printStackTrace();
        }
    }
    

Another way to set the parameters:

    mlp.setOptions(Utils.splitOptions("-L 0.1 -M 0.2 -N 2000 -V 0 -S 0 -E 20 -H 3"));

Where,

L = Learning Rate
M = Momentum
N = Training Time or Epochs
H = Hidden Layers
etc.
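As a side note (based on the MultilayerPerceptron option documentation, so treat it as something to verify against your Weka version), the -H / setHiddenLayers value is a comma-separated list of layer sizes, and wildcards such as 'a' are also accepted. A small sketch, reusing the imports from step 1:

    // Different ways to specify the hidden layer structure (sketch)
    MultilayerPerceptron mlp = new MultilayerPerceptron();

    mlp.setHiddenLayers("3");     // one hidden layer with 3 nodes
    mlp.setHiddenLayers("4,2");   // two hidden layers: 4 nodes, then 2 nodes
    mlp.setHiddenLayers("a");     // wildcard 'a' = (attributes + classes) / 2

    // Or through the option string, as above
    mlp.setOptions(Utils.splitOptions("-L 0.1 -M 0.2 -N 2000 -H 4,2"));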
  3. Neural Classifier Training Validation

For evaluation on the training data:

    Evaluation eval = new Evaluation(train);
    eval.evaluateModel(mlp, train);
    System.out.println(eval.errorRate());    // Training error rate (root mean squared error for a numeric class)
    System.out.println(eval.toSummaryString()); // Summary of training

To apply k-fold cross-validation:

    eval.crossValidateModel(mlp, train, kfolds, new Random(1));
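A slightly fuller sketch of the same call (kfolds, the number of folds, is not defined above; 10 is a common choice, and new Random(1) just fixes the shuffling seed):

    // k-fold cross-validation on the training set (sketch)
    int kfolds = 10;
    Evaluation eval = new Evaluation(train);
    eval.crossValidateModel(mlp, train, kfolds, new Random(1));
    System.out.println(eval.toSummaryString("\n=== Cross-validation results ===\n", false));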
  4. Evaluating/Predicting unlabelled data

    Instances datapredict = new Instances(
            new BufferedReader(
                    new FileReader(<Predictdatapath>)));
    datapredict.setClassIndex(datapredict.numAttributes() - 1);
    Instances predicteddata = new Instances(datapredict);

    // Predict part
    for (int i = 0; i < datapredict.numInstances(); i++) {
        double clsLabel = mlp.classifyInstance(datapredict.instance(i));
        predicteddata.instance(i).setClassValue(clsLabel);
    }

    // Storing the predictions again in an arff file
    BufferedWriter writer = new BufferedWriter(
            new FileWriter(<Output File Path>));
    writer.write(predicteddata.toString());
    writer.newLine();
    writer.flush();
    writer.close();
    

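One practical note for stringing the steps together: in simpleWekaTrain above, mlp is a local variable, so the evaluation and prediction snippets cannot see it. A minimal variation (my own adjustment, not part of the original code) is to return the trained model:

    // Sketch: return the trained model so the evaluation/prediction steps can reuse it
    public MultilayerPerceptron simpleWekaTrain(String filepath) throws Exception {
        Instances train = new Instances(new FileReader(filepath));
        train.setClassIndex(train.numAttributes() - 1);

        MultilayerPerceptron mlp = new MultilayerPerceptron();
        mlp.setLearningRate(0.1);
        mlp.setMomentum(0.2);
        mlp.setTrainingTime(2000);
        mlp.setHiddenLayers("3");
        mlp.buildClassifier(train);
        return mlp;
    }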
Upvotes: 9
