ENHorse

Reputation: 49

Looking for an equivalent of the TensorFlow normalization layer in PyTorch

I was using 'tf.keras.layers.experimental.preprocessing.Normalization'. This layer is convenient because it stores the normalization statistics as weights, so you can save them with the model and reuse the layer to normalize any new input data. However, I couldn't find an equivalent normalization layer in PyTorch. Is there a layer that serves the same role?
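
For reference, the TensorFlow usage I mean is roughly this (a minimal sketch; adapt() computes the statistics and stores them as layer weights, and the exact module path varies by TF version):

import numpy as np
import tensorflow as tf

# The layer learns mean/variance from reference data via adapt() and
# keeps them as non-trainable weights, so they are saved with the model.
norm = tf.keras.layers.experimental.preprocessing.Normalization()
data = np.random.rand(100, 3).astype("float32")
norm.adapt(data)         # compute and store mean/variance
normalized = norm(data)  # applies (x - mean) / sqrt(variance)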

Upvotes: 0

Views: 1489

Answers (2)

Cynichniy Bandera

Reputation: 6103

Perhaps you spent about 1 sec looking for it :-) In PyTorch, this is done through transforms.

For example:

from torchvision import transforms

# Normalize with the standard ImageNet channel statistics
normalize = transforms.Normalize(
    mean=[0.485, 0.456, 0.406],
    std=[0.229, 0.224, 0.225]
)
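
In practice you would usually chain it after ToTensor() in a preprocessing pipeline; a minimal sketch:

from torchvision import transforms

# Convert a PIL image to a float tensor in [0, 1], then normalize
# each channel with the given per-channel statistics.
preprocess = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize(
        mean=[0.485, 0.456, 0.406],
        std=[0.229, 0.224, 0.225]
    ),
])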

There are literally thousands of examples of this on the internet.

Upvotes: 0

Ivan

Reputation: 40648

There is no built-in layer that achieves this in PyTorch. However, you can measure the mean and standard deviation yourself (reducing over every axis except the ones you want statistics for), then use torchvision.transforms.Normalize with those statistics.

For instance, to measure the mean and std over the channels:

>>> import torch
>>> x = torch.rand(16, 3, 10, 10)
>>> mean, std = x.mean((0, 2, 3)), x.std((0, 2, 3))  # one statistic per channel
>>> mean, std
(tensor([0.4941, 0.4963, 0.5010]), tensor([0.2899, 0.2882, 0.2895]))

Then initialize a transform:

>>> import torchvision
>>> t = torchvision.transforms.Normalize(mean, std)

You can use this transform on a new dataset to normalize it with the initial dataset's statistics:

>>> z_normalized = t(z)
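
If you want behavior closer to the Keras layer, where the statistics are stored and saved together with the model weights, one option is to keep them as buffers on an nn.Module. A minimal sketch (the Normalization class and its adapt method below are my own, not a PyTorch built-in):

import torch
import torch.nn as nn

class Normalization(nn.Module):
    """Hypothetical PyTorch analogue of the Keras Normalization layer."""
    def __init__(self, num_channels):
        super().__init__()
        # Buffers are saved and loaded with state_dict(), like weights,
        # but are not touched by the optimizer.
        self.register_buffer("mean", torch.zeros(num_channels))
        self.register_buffer("std", torch.ones(num_channels))

    def adapt(self, x):
        # Measure per-channel statistics from a reference batch (B, C, H, W).
        self.mean = x.mean((0, 2, 3))
        self.std = x.std((0, 2, 3))

    def forward(self, x):
        # Broadcast the (C,) statistics over (B, C, H, W).
        return (x - self.mean[:, None, None]) / self.std[:, None, None]

norm = Normalization(3)
norm.adapt(x)                             # store statistics from reference data
torch.save(norm.state_dict(), "norm.pt")  # mean/std persist as buffers
z_normalized = norm(z)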

Upvotes: 3
