Thanesh Prabaghan

Reputation: 1

JavaScript : Simple perceptron predicting wrong for XOR gate

This JavaScript code implements a simple perceptron, the basic unit of a neural network. The code below predicts correctly for every truth table except the XOR table. Please run it in your browser's console and help me find what is wrong.

Since this is a single neuron, I didn't add any hidden layers. I'm training it for 10,000 iterations to get a better result.

//XOR GATE
var X1 = [0,1,1,0];
var X2 = [0,1,0,1];
var OUT = [0,0,1,1];

/*

//AND GATE
var X1 = [0,1,1,0];
var X2 = [0,1,0,1];
var OUT = [0,1,0,0];

//OR GATE
var X1 = [0,1,1,0];
var X2 = [0,1,0,1];
var OUT = [0,1,1,1];

//NAND GATE
var X1 = [0,1,1,0];
var X2 = [0,1,0,1];
var OUT = [1,0,1,1];

//NOR GATE
var X1 = [0,1,1,0];
var X2 = [0,1,0,1];
var OUT = [1,0,0,0];

//XOR GATE
var X1 = [0,1,1,0];
var X2 = [0,1,0,1];
var OUT = [0,0,1,1];

*/

var LR = 0.01; //Learning rate to speed up the learning process.
var BIAS = 1; //Bias input so the weighted sum does not get stuck at zero.
var TRAIN = 10000; //Number of epochs to run for an accurate result
var WEIGHTS = [Math.random(),Math.random(),Math.random()]; //3 random weights: 2 for the inputs and 1 for the bias

//console.log("Initial Weights : "+WEIGHTS);

function neuron(x1,x2,out){

    var sum = 0;
    var error = 0;

    //Sum of weighted x1, x2 and bias
    sum = x1*WEIGHTS[0] + x2*WEIGHTS[1] + BIAS*WEIGHTS[2];

    //Heaviside step function as activation function
    if(sum>1){
        sum = 1;
    }else{
        sum = 0;
    }

    //Calculate the error
    error = out - sum;

    //Adjust weights
    WEIGHTS[0] = WEIGHTS[0] + error * x1 * LR;
    WEIGHTS[1] = WEIGHTS[1] + error * x2 * LR;
    WEIGHTS[2] = WEIGHTS[2] + error * BIAS * LR;

    //console.log("Weights adjust : "+WEIGHTS);

}

function Train(){
    //Epoch iterations, e.g. 10000 is good
    for(var k=1;k<=TRAIN;k++){
        //Train on the four rows of the truth table
        for(var i=0;i<X1.length;i++){
            neuron(X1[i],X2[i],OUT[i]);
        }
    }
}

function Predict(x1,x2){
    var predict = 0;
    predict = x1*WEIGHTS[0] + x2*WEIGHTS[1] + BIAS*WEIGHTS[2];

    if(predict>1){
        predict = 1;
    }else{
        predict = 0;
    }
    //Print the prediction for the given input
    console.log("The prediction for "+(x1+","+x2)+" is "+predict);
}

//First train the perceptron 
Train();
//Predict for given input
Predict(1,1);
Predict(0,0);
Predict(1,0);
Predict(0,1);

The output for XOR gate is

The prediction for 1,1 is 1
The prediction for 0,0 is 1
The prediction for 1,0 is 1
The prediction for 0,1 is 1

Upvotes: 0

Views: 164

Answers (1)

Daemon Painter

Reputation: 3520

Some sources state that it is not possible to solve the XOR gate with a single perceptron.

Other sources explain that you need a higher-order perceptron to solve XOR with a single unit; a rough sketch of that idea follows.
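To make the idea concrete, here is a minimal sketch of my own (not code from either link): if you feed the product x1*x2 to the perceptron as an extra input, the four XOR points become linearly separable, and essentially the same update rule from your question can learn them.

//Sketch: single perceptron with an extra "higher order" input (x1*x2).
//Same training idea as in the question, but with the step threshold at 0.
var X1 = [0,1,1,0];
var X2 = [0,1,0,1];
var OUT = [0,0,1,1]; //XOR
var LR = 0.1;
var W = [Math.random(),Math.random(),Math.random(),Math.random()]; //x1, x2, x1*x2, bias

function step(sum){ return sum > 0 ? 1 : 0; }

function trainOnce(x1,x2,target){
    var y = step(x1*W[0] + x2*W[1] + (x1*x2)*W[2] + W[3]);
    var error = target - y;
    W[0] += error * x1 * LR;
    W[1] += error * x2 * LR;
    W[2] += error * x1 * x2 * LR;
    W[3] += error * LR;
}

for(var k=0;k<10000;k++){
    for(var i=0;i<X1.length;i++){
        trainOnce(X1[i],X2[i],OUT[i]);
    }
}

[[0,0],[1,1],[1,0],[0,1]].forEach(function(p){
    console.log("XOR("+p[0]+","+p[1]+") = " + step(p[0]*W[0] + p[1]*W[1] + p[0]*p[1]*W[2] + W[3]));
});

This is still a single unit; the extra product term just gives it a curved decision boundary in the original (x1, x2) plane.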

I quote from the second link:

Everyone who has ever studied about neural networks has probably already read that a single perceptron can’t represent the boolean XOR function. The book Artificial Intelligence: A Modern Approach, the leading textbook in AI, says: “[XOR] is not linearly separable so the perceptron cannot learn it” (p.730).
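To sketch why (my own summary, taking the step threshold as 0 so the bias weight b absorbs it; the same argument goes through for the threshold of 1 in your code): XOR needs w1 + b > 0 and w2 + b > 0, but also b <= 0 and w1 + w2 + b <= 0. Adding the first two inequalities gives w1 + w2 + 2b > 0, i.e. w1 + w2 + b > -b >= 0, which contradicts the last condition. So no choice of the three weights can fit all four rows, no matter how many epochs you train; the weights never settle, and in your run the final snapshot happens to output 1 for everything.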

I hope this points you in the right direction. This kind of question is not new here.
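As one concrete next step (just a sketch, not the only possible architecture): once you allow a second layer, XOR can be written as AND(OR(a,b), NAND(a,b)), and each of those gates is learnable by your existing neuron() routine. With hand-picked weights and a step threshold at 0 it looks like this:

//Sketch: XOR as a tiny two-layer network of step units.
//XOR(a,b) = AND( OR(a,b), NAND(a,b) ). Weights are hard-wired here just
//to show the structure; each unit could equally be trained on its own.
function step(sum){ return sum > 0 ? 1 : 0; }

function xor(a,b){
    var orOut   = step( 1*a + 1*b - 0.5);       //OR
    var nandOut = step(-1*a - 1*b + 1.5);       //NAND
    return step(1*orOut + 1*nandOut - 1.5);     //AND of the two hidden outputs
}

console.log(xor(0,0), xor(1,1), xor(1,0), xor(0,1)); //0 0 1 1

The two hidden units carve the plane with two lines instead of one, which is exactly what a single perceptron cannot do.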

Upvotes: 0
