Chandler Freeman

Reputation: 899

Why is my neural network giving negative outputs for positive inputs?

I am using FANN, a neural network library written in C++, through a Node.js wrapper to learn and predict mathematical sequences. In this particular instance, I am trying to make the network learn the Fibonacci sequence by giving it the position as input and the value at that position as output. My code for the network is as follows:

// This neural network calculates the Fibonacci sequence
var fanntom = require('fanntom'); // Node.js wrapper around the C++ FANN library
var net = new fanntom.standard(1, 3, 1); // 1 input, 3 hidden, 1 output neuron

// Training pairs: [[position], [fibonacci value]]
var data = [
  [[0], [0]],
  [[1], [1]],
  [[2], [1]],
  [[3], [2]],
  [[4], [3]],
  [[5], [5]],
  [[6], [8]],
  [[7], [13]],
  [[8], [21]],
  [[9], [34]]
];

net.activation_function_hidden('FANN_LINEAR');
net.activation_function_output('FANN_LINEAR');
net.train(data, { error: 0.00001 });

[0, 1, 2, 3, 4, 5, 6, 7, 8, 9].forEach(function(a) {
    var c = net.run([a]);
    console.log("fibonacci sequence position " + a + " -> " + c);
});

Here is a sample of the output I receive:

Max epochs   100000. Desired error: 0.0000100000.
Epochs            1. Current error: 187.3569030762. Bit fail 9.
Epochs         1000. Current error: 34.0731391907. Bit fail 8.
[... 96 similar lines elided; the error hovers between roughly 34.06 and 34.09 for the rest of training ...]
Epochs       100000. Current error: 34.0652198792. Bit fail 8.
fibonacci sequence position 0 -> -3.7995970795170027
fibonacci sequence position 1 -> -1.3996559488192886
fibonacci sequence position 2 -> 1.0002851818784273
fibonacci sequence position 3 -> 3.4002263125761414
fibonacci sequence position 4 -> 5.800167443273858
fibonacci sequence position 5 -> 8.200108573971574
fibonacci sequence position 6 -> 10.60004970466929
fibonacci sequence position 7 -> 12.999990835367003
fibonacci sequence position 8 -> 15.39993196606472
fibonacci sequence position 9 -> 17.799873096762436

My question is: how can the neural network produce negative outputs when all of the inputs are positive? And why is the error so large, especially on the first epoch?

Upvotes: 0

Views: 1327

Answers (1)

MSalters

Reputation: 179897

The output can be negative because it is a weighted combination of the inputs passed through the transfer functions, and the weights themselves can be negative. Weights are randomly initialized with mean 0, so about half of them start out negative. That is also why the error on the first epoch is huge: before any training, the network's output is literally a random guess.
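A tiny sketch of that point, in plain JavaScript rather than FANN (the uniform [-1, 1) initialization here is just an illustrative assumption, not FANN's actual scheme):

```javascript
// Illustration only: symmetric random initialization around 0 means roughly
// half of all weights start out negative, so an untrained linear neuron can
// easily map a positive input to a negative output.
function randWeight() {
  return Math.random() * 2 - 1; // uniform in [-1, 1), mean 0
}

var w = randWeight(); // input-to-output weight
var b = randWeight(); // bias
var out = w * 5 + b;  // untrained linear neuron applied to the positive input 5

// out < 0 whenever w * 5 < -b, which happens about half the time
console.log("w = " + w + ", b = " + b + ", output = " + out);
```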

BTW, your error has already stabilized by epoch 1000. Considering the size of the problem domain, it probably stabilized within the first 50 epochs, so you likely spent about 2000x more time training than necessary.
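There is also a reason the error plateaus at ~34.08 in particular. With FANN_LINEAR on every layer, a 1-3-1 network collapses to a single affine function y = m*a + b, so the best it can ever do is fit a straight line through the Fibonacci points. Reading the coefficients off the printed outputs (consecutive outputs differ by ~2.4, and position 0 gives ~-3.8), a quick check shows that line's mean squared error is exactly the plateau value in the log:

```javascript
// The printed outputs lie (approximately) on the line y = 2.4*a - 3.8.
// A network with linear activations everywhere can only represent such an
// affine map, so its MSE cannot drop below the MSE of the best-fitting line.
var data = [[0, 0], [1, 1], [2, 1], [3, 2], [4, 3],
            [5, 5], [6, 8], [7, 13], [8, 21], [9, 34]];

var mse = data.reduce(function(sum, p) {
  var predicted = 2.4 * p[0] - 3.8;   // the line the net converged to
  var err = predicted - p[1];
  return sum + err * err;
}, 0) / data.length;

console.log(mse.toFixed(2)); // 34.08, the plateau in the training log
```

So no amount of extra epochs would have helped; a nonlinear hidden activation (or a different problem encoding) is needed to get past that floor.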

Upvotes: 2
