user836026

Reputation: 11350

Saving the weights of one layer in PyTorch

I would like to save a model's weights, but not the whole model as in:

torch.save(model, 'model.pth')

Rather, I want to save just one layer. For example, suppose I have defined a layer like this:

self.conv_up3 = convrelu(256 + 512, 512, 3, 1)

How do I save the weights of only this layer, and how do I load them back into this layer?

Upvotes: 0

Views: 1402

Answers (1)

Umang Gupta

Reputation: 16450

You can do the following to save/get the parameters of a specific layer:

specific_params = self.conv_up3.state_dict()
# save/manipulate `specific_params` as you want

And similarly, to load the parameters back into that specific layer:

self.conv_up3.load_state_dict(params)

You can do this because each layer is itself a neural network (an nn.Module instance), so it has its own state_dict and load_state_dict methods.
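A minimal end-to-end sketch of the idea, assuming `convrelu` is a small conv + ReLU block (a plain Conv2d stands in for it here, since `convrelu` is not shown in the question):

```python
import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        # Stand-in for self.conv_up3 = convrelu(256 + 512, 512, 3, 1)
        self.conv_up3 = nn.Conv2d(256 + 512, 512, 3, padding=1)

model = Net()

# Save only this layer's parameters (a state_dict, not the whole model).
torch.save(model.conv_up3.state_dict(), 'conv_up3.pth')

# Later: load them back into the same layer of a freshly built model.
model2 = Net()
model2.conv_up3.load_state_dict(torch.load('conv_up3.pth'))
```

After loading, `model2.conv_up3` has exactly the same weights as `model.conv_up3`, while every other layer of `model2` keeps its own (freshly initialized) parameters.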

Upvotes: 2
