Begiiiner

Reputation: 57

How can I fix the weights of 'torch.nn.Linear'?

  1. Why are there two tensors in the parameter list of nn.Linear?

  2. I tried to set the parameters, but it didn't work. How can I fix them?

import torch
import torch.nn as nn
import torch.nn.functional as F

# X, Y, Yt are numpy arrays defined elsewhere
XX = torch.from_numpy(X)
YY = torch.from_numpy(Y)
Ytt = torch.from_numpy(Yt)
XX = XX.view(100, 1)
YY = YY.view(100, 1)
Ytt = Ytt.view(100, 1)
class model(torch.nn.Module):
  def __init__(self):
    super(model,self).__init__()
    self.linear1 = torch.nn.Linear(1,2).double()
    self.linear2 = torch.nn.Linear(2,2).double()
    self.linear3 = torch.nn.Linear(2,1).double()
  
  def forward(self,x):
    x=F.relu(self.linear1(x))
    x=F.relu(self.linear2(x))
    x=self.linear3(x)
    return x

M=model()
L=nn.MSELoss()
print(list(M.linear1.parameters()))
list(M.linear1.parameters())[0] = torch.Tensor([[-0.1], [0.2]])  # attempt to set the weight

print(list(M.linear1.parameters()))

Then the output is:

[Parameter containing:
tensor([[-0.2288],
        [ 0.2211]], dtype=torch.float64, requires_grad=True), Parameter containing:
tensor([-0.9185, -0.2458], dtype=torch.float64, requires_grad=True)]
[Parameter containing:
tensor([[-0.2288],
        [ 0.2211]], dtype=torch.float64, requires_grad=True), Parameter containing:
tensor([-0.9185, -0.2458], dtype=torch.float64, requires_grad=True)]

Upvotes: 1

Views: 3114

Answers (1)

Shai

Reputation: 114786

You have two parameter tensors in each nn.Linear: one for the weight matrix and the other for the bias. The function this layer implements is

y = Wx + b
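To see those two tensors concretely, you can inspect the shapes of a freshly constructed layer (a minimal sketch; the layer sizes here match the asker's `linear1`):

```python
import torch

layer = torch.nn.Linear(1, 2)  # in_features=1, out_features=2

# The two parameters: weight has shape (out_features, in_features),
# bias has shape (out_features,)
print(layer.weight.shape)  # torch.Size([2, 1])
print(layer.bias.shape)    # torch.Size([2])
print(len(list(layer.parameters())))  # 2
```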

Your assignment failed because `list(M.linear1.parameters())[0] = ...` only rebinds an element of a temporary Python list; it never touches the layer. Instead, set the values of the parameter tensor in-place, without autograd tracking:

with torch.no_grad():
    M.linear1.weight[...] = torch.tensor([[-0.1], [0.2]], dtype=torch.float64)
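Putting that together in a self-contained sketch (a standalone layer rather than the asker's `M`, and zeroing the bias purely for illustration). If "fix" also means keeping the values frozen during training, you can additionally turn off gradients for the layer:

```python
import torch

layer = torch.nn.Linear(1, 2).double()

# Copy the desired values into the parameter tensors in-place.
with torch.no_grad():
    layer.weight.copy_(torch.tensor([[-0.1], [0.2]], dtype=torch.float64))
    layer.bias.zero_()

# Optionally freeze the layer so an optimizer leaves the values untouched:
layer.weight.requires_grad_(False)
layer.bias.requires_grad_(False)

print(layer.weight)  # now holds [[-0.1], [0.2]]
```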

Upvotes: 3
