Reputation: 1833
I have a numpy array data
of shape [128, 64, 64, 64], and I wonder what's the best way to normalize each of the 128 slices into the range [0.0, 1.0]. I understand I could use np.max(data[0,...]), np.max(data[1,...]), ..., np.max(data[127,...]) to compute the max value of each slice, but I wonder if I could do this more efficiently.
Essentially something like this:
data_min = np.min(data[:,...])
data_max = np.max(data[:,...])
norm_data = (data[:,...] - data_min)/(data_max - data_min)
The result should still have shape [128, 64, 64, 64], but I haven't figured out which particular min/max functions and options to use to obtain that result.
Please advise. Thanks!
Upvotes: 1
Views: 682
Reputation: 221754
Get the min and max values while keeping their dimensions, so that they broadcast against the input data
later on when we apply the normalization formula, like so -
mins = data.min(axis=(1,2,3), keepdims=True)  # per-slice mins, shape (128, 1, 1, 1)
maxs = data.max(axis=(1,2,3), keepdims=True)  # per-slice maxs, shape (128, 1, 1, 1)
norm_data = (data - mins)/(maxs - mins)       # broadcasts back to (128, 64, 64, 64)
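As a quick sanity check, here is a self-contained sketch of the above on a smaller random array (shape reduced from [128, 64, 64, 64] purely to keep it fast); each slice of the result should span exactly [0.0, 1.0]:

```python
import numpy as np

# Small random volume standing in for the real data.
rng = np.random.default_rng(0)
data = rng.normal(size=(8, 4, 4, 4))

# Per-slice min/max; keepdims=True keeps them as shape (8, 1, 1, 1)
# so they broadcast against data in the normalization formula.
mins = data.min(axis=(1, 2, 3), keepdims=True)
maxs = data.max(axis=(1, 2, 3), keepdims=True)
norm_data = (data - mins) / (maxs - mins)

print(norm_data.shape)                # (8, 4, 4, 4) - same shape as input
print(norm_data.min(axis=(1, 2, 3)))  # all 0.0
print(norm_data.max(axis=(1, 2, 3)))  # all 1.0
```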
Upvotes: 2
Reputation: 521
You can use np.vectorize
to apply a function over all elements of an array:
data_min = data.min()
data_max = data.max()

def norm(element):
    return (element - data_min) / (data_max - data_min)

ndnorm = np.vectorize(norm)
norm_data = ndnorm(data)
Note that this uses the global min/max over the whole array, not the per-slice values.
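A runnable sketch of this approach on a small toy array (shape shrunk for speed), with the caveat that np.vectorize calls the function once per element in Python, so the equivalent plain broadcast expression is typically much faster:

```python
import numpy as np

# Toy array standing in for the real [128, 64, 64, 64] data.
rng = np.random.default_rng(1)
data = rng.normal(size=(4, 3, 3, 3))

# Global (whole-array) min/max, captured by the closure below.
data_min = data.min()
data_max = data.max()

def norm(element):
    return (element - data_min) / (data_max - data_min)

ndnorm = np.vectorize(norm)
norm_data = ndnorm(data)

# Same result as the direct broadcast expression, element for element.
print(np.allclose(norm_data, (data - data_min) / (data_max - data_min)))  # True
print(norm_data.shape)  # (4, 3, 3, 3)
```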
Upvotes: 0