user4911648

caffe loss is nan or 0

I am training a network and I have changed the learning rate from 0.1 down to 0.00001. The output always remains the same. No mean image is subtracted during training. What could be the reasons for such a weird loss?

I1107 15:07:28.381621 12333 solver.cpp:404]     Test net output #0: loss = 3.37134e+11 (* 1 = 3.37134e+11 loss)
I1107 15:07:28.549142 12333 solver.cpp:228] Iteration 0, loss = 1.28092e+11
I1107 15:07:28.549201 12333 solver.cpp:244]     Train net output #0: loss = 1.28092e+11 (* 1 = 1.28092e+11 loss)
I1107 15:07:28.549211 12333 sgd_solver.cpp:106] Iteration 0, lr = 1e-07
I1107 15:07:59.490077 12333 solver.cpp:228] Iteration 50, loss = -nan
I1107 15:07:59.490170 12333 solver.cpp:244]     Train net output #0: loss = 0 (* 1 = 0 loss)
I1107 15:07:59.490176 12333 sgd_solver.cpp:106] Iteration 50, lr = 1e-07
I1107 15:08:29.177093 12333 solver.cpp:228] Iteration 100, loss = -nan
I1107 15:08:29.177119 12333 solver.cpp:244]     Train net output #0: loss = 0 (* 1 = 0 loss)
I1107 15:08:29.177125 12333 sgd_solver.cpp:106] Iteration 100, lr = 1e-07
I1107 15:08:59.758381 12333 solver.cpp:228] Iteration 150, loss = -nan
I1107 15:08:59.758513 12333 solver.cpp:244]     Train net output #0: loss = 0 (* 1 = 0 loss)
I1107 15:08:59.758545 12333 sgd_solver.cpp:106] Iteration 150, lr = 1e-07
I1107 15:09:30.210208 12333 solver.cpp:228] Iteration 200, loss = -nan
I1107 15:09:30.210304 12333 solver.cpp:244]     Train net output #0: loss = 0 (* 1 = 0 loss)
I1107 15:09:30.210310 12333 sgd_solver.cpp:106] Iteration 200, lr = 1e-07
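For context, the learning rate in Caffe is controlled by base_lr in solver.prototxt. A minimal sketch of such a file is shown below; the values and file names are only illustrative, not the actual configuration from this question:

    # solver.prototxt (illustrative sketch, not the asker's real config)
    net: "train_val.prototxt"   # hypothetical path to the network definition
    base_lr: 0.00001            # the learning rate being lowered
    lr_policy: "fixed"          # keep the rate constant during training
    momentum: 0.9
    max_iter: 10000
    snapshot_prefix: "snapshots/net"
    solver_mode: GPU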

Upvotes: 0

Views: 1042

Answers (1)

Shai

Reputation: 114976

Your loss is not 0, not even close. You start at ~3.37e+11 (that is, on the order of 10^11), and it seems that soon after it explodes and you get nan. You need to drastically scale down your loss values. If you are using "EuclideanLoss", you might want to average the loss by the size of the depth map, scale the predicted values to the [-1, 1] range, or apply any other scaling that prevents your loss from exploding.
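As a rough sketch of what that scaling could look like in the network definition (the layer and blob names "prediction", "label", etc. are hypothetical, not taken from the question), one option is to shrink the labels with a "Power" layer before the loss and/or down-weight the loss itself with loss_weight:

    # pre-scale the labels (e.g. the depth map) so targets live in a small range;
    # the scale factor here is a guess and the prediction must be trained to match it
    layer {
      name: "scale_label"
      type: "Power"
      bottom: "label"
      top: "label_scaled"
      power_param { power: 1 scale: 0.001 shift: 0 }
    }
    # down-weight the Euclidean loss so its raw magnitude does not explode
    layer {
      name: "loss"
      type: "EuclideanLoss"
      bottom: "prediction"
      bottom: "label_scaled"
      top: "loss"
      loss_weight: 0.0001
    }

Either mechanism (or both) keeps the reported loss in a numerically sane range; the exact factors depend on the magnitude of your data.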

Upvotes: 2
