Reputation: 6476
Is there a way to specify custom kernel values for a convolutional neural network in PyTorch? Something like kernel_initializer in TensorFlow? E.g. I want a 3x3 kernel in nn.Conv2d
initialized so that it acts as an identity kernel -
0 0 0
0 1 0
0 0 0
(this will effectively return the same output as my input in the very first iteration)
My non-exhaustive research on the subject -
I could use nn.init, but it only has some pre-defined kernel initialization schemes (a quick sketch of what it does provide is included at the end of this question).
I tried to follow the discussion on their official thread but it doesn't suit my needs.
I might have missed something in my research; please feel free to point it out.
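For context, a minimal sketch (not part of the original question) of what nn.init does provide: scheme-based initializers that fill an existing weight tensor in place, with no function that accepts an arbitrary tensor of kernel values.

import torch.nn as nn

conv = nn.Conv2d(1, 1, kernel_size=3, padding=1, bias=False)
# nn.init offers distribution- or scheme-based fills, e.g.:
nn.init.xavier_uniform_(conv.weight)
nn.init.kaiming_normal_(conv.weight, nonlinearity='relu')
nn.init.constant_(conv.weight, 0.5)  # every element gets the same scalar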
Upvotes: 4
Views: 5837
Reputation: 31
I think an easier solution is to:
deconv = nn.ConvTranspose2d(
    in_channels=channel_dim, out_channels=channel_dim,
    kernel_size=kernel_size, stride=stride,
    bias=False, padding=1, output_padding=1
)
deconv.weight.data.copy_(
    get_upsampling_weight(channel_dim, channel_dim, kernel_size)
)
In other words, use copy_.
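get_upsampling_weight is not defined in the answer; the name matches the bilinear-upsampling helper commonly used in FCN-style code, so a minimal sketch under that assumption (and assuming in_channels == out_channels, as in the call above) could be:

import numpy as np
import torch

def get_upsampling_weight(in_channels, out_channels, kernel_size):
    # Build a 2D bilinear-interpolation kernel and place it on the channel
    # diagonal, the usual initialization for ConvTranspose2d upsampling layers.
    factor = (kernel_size + 1) // 2
    center = factor - 1 if kernel_size % 2 == 1 else factor - 0.5
    og = np.ogrid[:kernel_size, :kernel_size]
    filt = (1 - abs(og[0] - center) / factor) * (1 - abs(og[1] - center) / factor)
    weight = np.zeros((in_channels, out_channels, kernel_size, kernel_size), dtype=np.float64)
    weight[range(in_channels), range(out_channels), :, :] = filt
    return torch.from_numpy(weight).float()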
Upvotes: 1
Reputation: 6476
Thanks to ptrblck, I was able to solve it.
I can define a new convolution layer as conv and, as per the example, set the identity kernel using -
import torch as ch
import torch.nn as nn

# 3x3 identity kernel, shaped (out_channels, in_channels, H, W) = (1, 1, 3, 3)
weights = ch.Tensor([[0, 0, 0], [0, 1, 0], [0, 0, 0]]).unsqueeze(0).unsqueeze(0)
weights.requires_grad = True
conv = nn.Conv2d(1, 1, kernel_size=3, stride=1, padding=1, bias=False)
with ch.no_grad():
    conv.weight = nn.Parameter(weights)
I can then continue to use conv as my regular nn.Conv2d layer.
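As a quick sanity check (not part of the original answer, using the same imports and alias as above), passing a random single-channel image through the layer returns it unchanged:

x = ch.randn(1, 1, 5, 5)        # batch of one 1-channel 5x5 image
out = conv(x)
print(ch.allclose(out, x))      # True: the identity kernel reproduces the input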
Upvotes: 1