Python version: 3.8.5
PyTorch version: 1.6.0
I am defining an LSTM class, a subclass of nn.Module. When I try to create an optimizer for it, I get the following error:

    torch.nn.modules.module.ModuleAttributeError: 'LSTM' object has no attribute 'paramters'

I have two code files, train.py and lstm_class.py (which contains the LSTM class). I have tried to reduce this to a minimal working example; let me know if any other information would be helpful.
The code in lstm_class.py:

    import torch.nn as nn

    class LSTM(nn.Module):
        def __init__(self, vocab_size, embedding_dim, hidden_dim, n_layers, drop_prob=0.2):
            super(LSTM, self).__init__()
            # network size parameters
            self.n_layers = n_layers
            self.hidden_dim = hidden_dim
            self.vocab_size = vocab_size
            self.embedding_dim = embedding_dim

            # the layers of the network
            self.embedding = nn.Embedding(self.vocab_size, self.embedding_dim)
            self.lstm = nn.LSTM(self.embedding_dim, self.hidden_dim, self.n_layers,
                                dropout=drop_prob, batch_first=True)
            self.dropout = nn.Dropout(drop_prob)
            self.fc = nn.Linear(self.hidden_dim, self.vocab_size)

        def forward(self, input, hidden):
            # defines the forward pass; probably isn't relevant here
            ...

        def init_hidden(self, batch_size):
            # initializes the hidden state; probably isn't relevant here
            ...
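In case the omitted methods matter, here is a simplified, self-contained version of the class with representative forward and init_hidden bodies (the real ones differ only in details; the small sizes in the shape check at the bottom are made up for illustration):

```python
import torch
import torch.nn as nn

class LSTM(nn.Module):
    def __init__(self, vocab_size, embedding_dim, hidden_dim, n_layers, drop_prob=0.2):
        super(LSTM, self).__init__()
        self.n_layers = n_layers
        self.hidden_dim = hidden_dim
        self.embedding = nn.Embedding(vocab_size, embedding_dim)
        self.lstm = nn.LSTM(embedding_dim, hidden_dim, n_layers,
                            dropout=drop_prob, batch_first=True)
        self.dropout = nn.Dropout(drop_prob)
        self.fc = nn.Linear(hidden_dim, vocab_size)

    def forward(self, input, hidden):
        embeds = self.embedding(input)                # (batch, seq, embedding_dim)
        lstm_out, hidden = self.lstm(embeds, hidden)  # (batch, seq, hidden_dim)
        out = self.fc(self.dropout(lstm_out))         # (batch, seq, vocab_size)
        return out, hidden

    def init_hidden(self, batch_size):
        # zero-initialized (h, c) tensors on the same device/dtype as the weights
        weight = next(self.parameters())
        shape = (self.n_layers, batch_size, self.hidden_dim)
        return (weight.new_zeros(shape), weight.new_zeros(shape))

# quick shape check with dummy sizes
net = LSTM(vocab_size=50, embedding_dim=8, hidden_dim=16, n_layers=2)
hidden = net.init_hidden(batch_size=3)
out, hidden = net(torch.zeros(3, 5, dtype=torch.long), hidden)
print(out.shape)  # torch.Size([3, 5, 50])
```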
The code in train.py:

    import torch
    import torch.optim
    import torch.nn as nn
    import lstm_class

    vocab_size = 1000
    embedding_dim = 256
    hidden_dim = 256
    n_layers = 2
    learning_rate = 0.001

    net = lstm_class.LSTM(vocab_size, embedding_dim, hidden_dim, n_layers)
    optimizer = torch.optim.Adam(net.paramters(), lr=learning_rate)
I am getting the error on the last line above. The full error message:

    Traceback (most recent call last):
      File "train.py", line 58, in <module>
        optimizer = torch.optim.Adam(net.paramters(), lr=learning_rate)
      File "/usr/local/lib/python3.8/site-packages/torch/nn/modules/module.py", line 771, in __getattr__
        raise ModuleAttributeError("'{}' object has no attribute '{}'".format(
    torch.nn.modules.module.ModuleAttributeError: 'LSTM' object has no attribute 'paramters'
Any tips on how to fix this would be appreciated. As mentioned above, let me know if anything else would be relevant. Thanks!