Reputation: 43
I found a piece of Python code that runs perfectly, but I can't understand how it works. I'd appreciate it if you could explain the parts between the dashed comment lines; I have no idea what they do.
import torch.nn as nn

class BoxHead(nn.Module):  # pending
    def __init__(self, lengths, num_classes):
        super(BoxHead, self).__init__()
        # -------------------------------------------------------
        self.cls_score = nn.Sequential(*tuple([
            module for i in range(len(lengths) - 1)
            for module in (nn.Linear(lengths[i], lengths[i + 1]), nn.ReLU())]
            + [nn.Linear(lengths[-1], num_classes)]))
        # -----------------------------------------------------------------------
        self.bbox_pred = nn.Sequential(*tuple([
            module for i in range(len(lengths) - 1)
            for module in (nn.Linear(lengths[i], lengths[i + 1]), nn.ReLU())]
            + [nn.Linear(lengths[-1], 4)]))
Upvotes: 2
Views: 72
Reputation: 44
After reading the code, I take lengths to be a list of numbers.
This is the input to nn.Sequential for cls_score:
# lengths = [num1, num2, num3, ..., numN-1, numN]
# numN-1 denotes the (N-1)-th number; it is not valid Python syntax
[
nn.Linear(num1, num2),
nn.ReLU(),
nn.Linear(num2, num3),
nn.ReLU(),
nn.Linear(num3, num4),
nn.ReLU(),
...,
nn.Linear(numN-1, numN),
nn.ReLU(),
nn.Linear(numN, num_classes)
]
The input to nn.Sequential for bbox_pred is the same, except that its last item is nn.Linear(numN, 4).
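You can verify this expansion with plain Python by substituting strings for the layers. Here make_linear and make_relu are hypothetical stand-ins for nn.Linear and nn.ReLU, and the sizes are made up, so the comprehension can be inspected without PyTorch installed:

```python
# Stand-ins for nn.Linear / nn.ReLU so the expansion is visible as strings.
def make_linear(n_in, n_out):
    return f"Linear({n_in}, {n_out})"

def make_relu():
    return "ReLU()"

lengths = [256, 128, 64]  # hypothetical hidden sizes
num_classes = 10

# Same comprehension shape as in cls_score.
layers = [
    module for i in range(len(lengths) - 1)
    for module in (make_linear(lengths[i], lengths[i + 1]), make_relu())
] + [make_linear(lengths[-1], num_classes)]

print(layers)
# ['Linear(256, 128)', 'ReLU()', 'Linear(128, 64)', 'ReLU()', 'Linear(64, 10)']
```

Each step through i contributes one (Linear, ReLU) pair, and the final classification layer is appended outside the comprehension.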
Upvotes: 1
Reputation: 448
nn.Sequential accepts *args, which means you can pass any number of positional arguments, each of which is supposed to be a PyTorch layer. In this case it creates len(lengths) - 1 blocks of nn.Linear, each followed by nn.ReLU, plus one final linear layer at the end.
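The star in nn.Sequential(*tuple([...])) is ordinary Python argument unpacking, and the tuple(...) call is redundant since * unpacks a list just as well. A minimal sketch, using a hypothetical Sequential stand-in (not the real PyTorch class) and string layers:

```python
class Sequential:
    """Stand-in that, like nn.Sequential, accepts layers as *args."""
    def __init__(self, *args):
        self.layers = list(args)

modules = ["Linear(8, 4)", "ReLU()", "Linear(4, 2)"]

# All three calls are equivalent:
a = Sequential(*tuple(modules))  # what the original code does
b = Sequential(*modules)         # tuple() is unnecessary
c = Sequential("Linear(8, 4)", "ReLU()", "Linear(4, 2)")

print(a.layers == b.layers == c.layers)
# True
```

So the list comprehension builds the full layer list first, and the * simply spreads it out into positional arguments.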
Upvotes: 2