ntd

Reputation: 2372

Unable to Normalize Tensor in PyTorch

I am trying to normalize the tensor output by my network, but I am getting an error when doing so. The code is as follows:

import torch
from torchvision import transforms

device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")
model_load_path = r'path\to\saved\model\file'
model.load_state_dict(torch.load(model_load_path))
model.eval()
output = model(input).to(device).view(-1, 1, 150, 150)
inv_normalize = transforms.Compose(
    [
        transforms.Normalize(mean=[-0.5/0.5], std=[1/0.5])
    ]
)
print(output.size())  # The size printed is torch.Size([1, 1, 150, 150])
output = inv_normalize(output)

I am getting an error in the following line:

output = inv_normalize(output)

The error reads:

TypeError: tensor is not a torch image.

My output is a single image with a single channel, and height and width of 150.

Any help will be appreciated! Thanks!

Upvotes: 0

Views: 5383

Answers (1)

Querenker

Reputation: 2365

transforms.Normalize only works on a tensor image. If your input were a PIL Image or a NumPy array rather than a tensor, you would first have to convert it with transforms.ToTensor:

inv_normalize = transforms.Compose(
    [
        transforms.ToTensor(),
        transforms.Normalize(mean=[-0.5/0.5], std=[1/0.5])
    ]
)

In your case, however, the output already is a tensor; the problem is its shape. transforms.Normalize expects exactly three dimensions (channels, height, width), and the extra batch dimension in your output is what triggers the "tensor is not a torch image" error. Just remove it in your view call:

output = model(input).to(device).view(1, 150, 150)
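For reference, here is a minimal, self-contained sketch of the fix (a random tensor stands in for the model output, since the model itself isn't shown) that also checks that your inverse normalization really undoes Normalize(mean=[0.5], std=[0.5]):

import torch
from torchvision import transforms

# Stand-in for the model output: a single-channel 150x150 image in [0, 1].
original = torch.rand(1, 150, 150)

# Forward normalization, presumably applied to the training data:
# x -> (x - 0.5) / 0.5, which maps [0, 1] to [-1, 1].
normalize = transforms.Normalize(mean=[0.5], std=[0.5])
normalized = normalize(original.clone())  # clone: older torchvision versions normalize in place

# Inverse: Normalize computes (y - mean) / std, so with mean=-m/s and
# std=1/s it yields (y + m/s) * s = y*s + m, which recovers the original x.
inv_normalize = transforms.Normalize(mean=[-0.5 / 0.5], std=[1 / 0.5])

restored = inv_normalize(normalized.clone())  # 3D input, so no TypeError
print(restored.shape)                                  # torch.Size([1, 150, 150])
print(torch.allclose(restored, original, atol=1e-6))   # True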

Upvotes: 2
