Reputation: 179
PyTorch code is giving an error about a missing positional argument, while I have already given x as an input argument.
Code:
import torch
import torch.nn as nn
import torch.nn.functional as F

class Network(nn.Module):
    def __init__(self):
        super().__init__()
        self.hidden = nn.Linear(8, 5)
        self.output = nn.Linear(5, 1)

    def forward(self, x):
        x = 2*F.sigmoid(self.hidden(x))
        x = F.softmax(self.output(x), dim=0)
        return x

x = torch.tensor([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0], dtype=torch.float32)
f = Network()
print(f(x))
tensor([1.], grad_fn=<SoftmaxBackward0>)
Network.forward(x)
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
Input In [98], in <cell line: 1>()
----> 1 Network.forward(x)
TypeError: forward() missing 1 required positional argument: 'x'
Upvotes: 2
Views: 765
Reputation: 319
Network.forward(x)
In this line you are calling the method on the class itself, not on an instance. Called this way, it requires two positional arguments: self and x.
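This behavior is plain Python, not PyTorch-specific. A minimal illustrative sketch (the class and method names here are hypothetical, chosen only to mirror the error):

```python
# Accessed on the class, a method is an unbound function:
# `self` is not filled in automatically and must be passed explicitly.
class Greeter:
    def greet(self, name):
        return f"hello {name}"

g = Greeter()
print(g.greet("world"))           # instance call: self is bound automatically
print(Greeter.greet(g, "world"))  # class call: works only if self is passed

try:
    Greeter.greet("world")        # mirrors Network.forward(x): "world" becomes self
except TypeError as e:
    print(e)                      # missing 1 required positional argument: 'name'
```

In `Network.forward(x)`, the tensor x is consumed as `self`, so Python reports that the parameter x itself is missing.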
There is no need to call the forward method directly. The following lines perform the forward call implicitly, and this is the proper way to use a module:
f = Network()
print(f(x))
UPD
f = Network()
network_output = f(x)  # <-- this calls the `forward` method implicitly
It is described here: torch.nn.Module
Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.
Instead of f.forward(x) you should just write f(x). Otherwise the module's functionality will be incomplete, because any hooks registered on the module will not run. This holds for all modules in PyTorch; for example, your own code uses self.hidden(x), not self.hidden.forward(x).
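The hook difference is easy to demonstrate. A minimal sketch (the module below is hypothetical, just large enough to register a forward hook on):

```python
import torch
import torch.nn as nn

class Tiny(nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(3, 2)

    def forward(self, x):
        return self.linear(x)

calls = []
m = Tiny()
# Register a forward hook; __call__ is responsible for running it.
m.register_forward_hook(lambda mod, inp, out: calls.append("hook"))

x = torch.ones(3)
m(x)          # __call__ runs forward AND the registered hook
m.forward(x)  # calling forward directly silently skips the hook
print(calls)  # the hook fired only once, for the m(x) call
```

Both calls compute the same output, but only `m(x)` triggers the hook, which is exactly why the documentation says to call the module instance rather than forward.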
Upvotes: 1