Reputation: 1101
I am setting the random seed for both random and numpy.random at the beginning of my main file:
import random
import numpy as np
np.random.seed(42)
random.seed(42)
import torch
Nevertheless, when I create a Net() object with randomly initialized parameters, it gives a completely different result every time:
net=neuralnet.Net()
print ("initialized params: ", net.fc1.weight)
Note that neuralnet.Net() is defined in a different file and is a class that extends torch.nn.Module. It is torch.nn.Module that randomly initializes net.fc1.weight, not my own code.
How is it possible that the initialization is completely different on every run, even though I set the seeds?
Upvotes: 0
Views: 4177
Reputation: 101
try:
import torch
torch.manual_seed(0)
For further information: https://pytorch.org/docs/stable/notes/randomness.html
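A minimal sketch of the effect (the Net class below is a hypothetical stand-in for the asker's neuralnet.Net, which lives in another file): resetting torch's seed before each construction makes the initial weights identical.

```python
import torch
import torch.nn as nn

class Net(nn.Module):
    """Stand-in for the asker's neuralnet.Net."""
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(4, 3)  # weights are initialized from torch's RNG

torch.manual_seed(0)
a = Net().fc1.weight.detach().clone()
torch.manual_seed(0)  # reset torch's RNG before the second construction
b = Net().fc1.weight.detach().clone()
print(torch.equal(a, b))  # same seed, same initial weights
```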
Upvotes: 3
Reputation: 1052
Have you looked at: https://github.com/pytorch/pytorch/issues/7068?
There are some recommendations on how to reproduce the results.
Example:
import sys
import random
import datetime as dt
import numpy as np
import torch
torch.manual_seed(42)
torch.cuda.manual_seed(42)
np.random.seed(42)
random.seed(42)
torch.backends.cudnn.deterministic = True
features = torch.randn(2, 5)
# Print the sampled values with full precision, plus version info.
fnp = features.view(-1).numpy()
print("Time: {}".format(dt.datetime.now()))
for el in fnp:
    print("{:.20f}".format(el))
print("Python: {}".format(sys.version))
print("Numpy: {}".format(np.__version__))
print("Pytorch: {}".format(torch.__version__))
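The key point behind the seed lines above: PyTorch keeps its own RNG state, separate from random and numpy.random, so seeding only those two leaves torch's generator untouched. A quick sketch:

```python
import numpy as np
import torch

# Seeding NumPy does not reset torch's generator,
# so two torch.randn calls still draw different values.
np.random.seed(42)
a = torch.randn(3)
np.random.seed(42)
b = torch.randn(3)
print(torch.equal(a, b))  # False: consecutive draws from torch's own RNG stream
```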
Upvotes: 0