I am using this repo as my reference.
After training for 78 epochs, I saved the model using the checkpoint helper in utils.py. However, when I run the following code:
CHECKPOINT_GEN = "output/vanilla/checkpoints/gen-78.pth.tar"
gen = Generator(in_channels=3, features=64).to("cpu")
gen.apply(weights_init)
opt_gen = optim.Adam(gen.parameters(), lr=config.LEARNING_RATE, betas=(0.5, 0.999))
scheduler_gen = optim.lr_scheduler.StepLR(opt_gen, step_size=100, gamma=0.1)
gen_checkpoint = torch.load(CHECKPOINT_GEN, map_location="cpu")

# keys stored in the saved checkpoint
for k, v in gen_checkpoint["state_dict"].items():
    print(k)

# keys expected by the freshly constructed model
for k, v in gen.state_dict().items():
    print(k)
The first loop prints keys like these:

initial_down.0.weight
initial_down.0.bias
down1.conv.0.weight
down1.conv.1.weight
down1.conv.1.bias
...
The second loop, however, only prints the keys whose names contain "0" (i.e. layers like initial_down.0.weight, initial_down.0.bias, and down1.conv.0.weight, but none of the ".1" entries such as down1.conv.1.weight or down1.conv.1.bias). Because of this key mismatch, I cannot load the checkpoint into the generator for evaluation. How can I fix this issue?
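For reference, here is a minimal sketch of how I am comparing the two key sets; it uses toy dicts standing in for the real state dicts (in the actual code these would be gen_checkpoint["state_dict"] and gen.state_dict()):

```python
# Toy stand-ins for the checkpoint's state dict and the fresh model's
# state dict, mimicking the key names I actually see.
ckpt_sd = {
    "initial_down.0.weight": None,
    "initial_down.0.bias": None,
    "down1.conv.0.weight": None,
    "down1.conv.1.weight": None,
    "down1.conv.1.bias": None,
}
model_sd = {
    "initial_down.0.weight": None,
    "initial_down.0.bias": None,
    "down1.conv.0.weight": None,
}

# Keys present in the checkpoint but missing from the fresh model:
missing = sorted(set(ckpt_sd) - set(model_sd))
print(missing)  # ['down1.conv.1.bias', 'down1.conv.1.weight']
```

So every ".1" parameter (the norm-layer weights and biases) is present in the checkpoint but absent from the model I construct.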