Thomas Clancy

Reputation: 61

DeepLearning4j - neural network configuration

Over the last few days I have been working with the deeplearning4j library and I have run into a problem.

My training and test data consist of rows of 25 binary values. The training set contains 40 rows. The network has 4 output values. My goal is to train the network so that it makes as few errors as possible.

I have tried different configurations (including the ones presented in the deeplearning4j examples), but I still cannot get the network to a satisfactory accuracy. What is more, the classification results look odd: for instance, the network's output values are like [0.31, 0.12, 0.24, 0.33].

To my mind, the proper values should look like [0, 0, 0, 1] etc.

My Neural network configuration:

private static final int SEED = 123;
private static final int ITERATIONS = 1;
private static final int NUMBER_OF_INPUT_NODES = 25; 
private static final int NUMBER_OF_OUTPUT_NODES = 4; 
private static final int EPOCHS = 10;

public static MultiLayerNetwork getNeuralNetwork() {
    StatsStorage storage = configureUI();
    MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder().seed(SEED).iterations(ITERATIONS).learningRate(1e-1)
            .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT)
            .weightInit(WeightInit.RELU).updater(Updater.ADADELTA).list()
            .layer(0, new DenseLayer.Builder().nIn(NUMBER_OF_INPUT_NODES).nOut(60)
                    .activation(Activation.RELU).build())
            .layer(1, new DenseLayer.Builder().nIn(60).nOut(50)
                    .activation(Activation.RELU).build())
            .layer(2, new DenseLayer.Builder().nIn(50).nOut(50)
                    .activation(Activation.RELU).build())
            .layer(3, new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT).nIn(50).nOut(NUMBER_OF_OUTPUT_NODES)
                    .activation(Activation.SOFTMAX).build()).backprop(true).build();

    MultiLayerNetwork network = new MultiLayerNetwork(conf);
    network.init();
    network.setListeners(new StatsListener(storage), new ScoreIterationListener(1));
    DataSetIterator iterator = new ListDataSetIterator(createTrainingSet());
    for (int i = 0; i < EPOCHS; i++) {
        network.fit(iterator);
    }
    return network;
}

I would be really grateful for any help. Regards,

Upvotes: 2

Views: 504

Answers (1)

reden

Reputation: 1003

Method 1:

It seems that this is the expected behavior for the SOFTMAX activation: the outputs are class probabilities that sum to 1, so you pick the class with the largest value rather than expecting a one-hot vector. This is from the PredictGenderTest example:

INDArray predicted = model.output(features);
//System.out.println("output : " + predicted);
if (predicted.getDouble(0) > predicted.getDouble(1))
   gender.setText("Female");
else if (predicted.getDouble(0) < predicted.getDouble(1))
   gender.setText("Male");
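For a four-class network like the one in the question, the same idea generalizes to taking the index of the largest softmax output (ND4J also has an argMax operation on INDArray that does this). A minimal plain-Java sketch, assuming the output row has already been copied into a `double[]`:

```java
public class ArgMaxDemo {

    // The index of the largest softmax value is the predicted class.
    static int argMax(double[] softmaxOutput) {
        int best = 0;
        for (int i = 1; i < softmaxOutput.length; i++) {
            if (softmaxOutput[i] > softmaxOutput[best]) {
                best = i;
            }
        }
        return best;
    }

    public static void main(String[] args) {
        // The output row from the question; the probabilities sum to ~1.0
        double[] output = {0.31, 0.12, 0.24, 0.33};
        System.out.println(argMax(output)); // prints 3
    }
}
```

So even though [0.31, 0.12, 0.24, 0.33] does not look like [0, 0, 0, 1], both decide for class 3.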

If you want to evaluate the model, it might be easier to use this pattern:

Evaluation eval = new Evaluation(numOutputs);
while(testIter.hasNext()){
   DataSet t = testIter.next();
   INDArray features = t.getFeatureMatrix();
   INDArray labels = t.getLabels();
   INDArray predicted = network.output(features, false);
   eval.eval(labels, predicted);
}
System.out.println(eval.stats());
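Conceptually, the accuracy that Evaluation reports compares the arg-max of each predicted row against the arg-max of the corresponding one-hot label row. A plain-Java sketch of that calculation, using hypothetical in-memory arrays in place of a DataSetIterator:

```java
public class AccuracyDemo {

    // Fraction of rows where the predicted class matches the label class.
    static double accuracy(double[][] labels, double[][] predictions) {
        int correct = 0;
        for (int row = 0; row < labels.length; row++) {
            if (argMax(predictions[row]) == argMax(labels[row])) {
                correct++;
            }
        }
        return (double) correct / labels.length;
    }

    static int argMax(double[] values) {
        int best = 0;
        for (int i = 1; i < values.length; i++) {
            if (values[i] > values[best]) {
                best = i;
            }
        }
        return best;
    }

    public static void main(String[] args) {
        // Two hypothetical test rows: one-hot labels vs. softmax outputs
        double[][] labels      = {{0, 0, 0, 1},             {1, 0, 0, 0}};
        double[][] predictions = {{0.31, 0.12, 0.24, 0.33}, {0.20, 0.40, 0.25, 0.15}};
        System.out.println(accuracy(labels, predictions)); // prints 0.5
    }
}
```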

Then you get a human-readable result.

Method 2:

I've found out another way to achieve this, which might be more desirable in some cases.

  1. Set label names on your data:

    DataSet verifyData = iterator.next();
    List<String> labelNames = new ArrayList<>();
    labelNames.add("Label 1");
    labelNames.add("Label 2");
    verifyData.setLabelNames(labelNames);

  2. Instead of model.output, use predict:

ArrayList<String> labels = (ArrayList<String>) model.predict(verifyData);

Upvotes: 2
