Reputation: 176
I know versions of this question appear frequently here, but I was not able to find a solution that works for me. For some background on the problem: I have an RGB image that is divided into chunks of NxN pixels, and I want to compute the average of each color channel separately for each chunk. I know numpy is best used by leveraging vectorized operations, but the level of higher-dimensional slicing and indexing required here is beyond me. Essentially I need the following functionality:
for row in tiles:
    for col in row:
        rsum = 0
        gsum = 0
        bsum = 0
        for n in col:
            for vec in n:
                rsum += vec[0]
                gsum += vec[1]
                bsum += vec[2]
        col[..., 0] = rsum / n.shape[0]**2
        col[..., 1] = gsum / n.shape[0]**2
        col[..., 2] = bsum / n.shape[0]**2
Where the shape of my ndarray is:
tiles.shape = (138, 84, 100, 100, 4)
A 138x84 matrix of 100x100 matrices, where each element is a length-4 vector. Is there a way to do this without any loops? Should I reshape my ndarray? Any guidance is much appreciated.
Upvotes: 0
Views: 30
Reputation: 117681
Simply pass the axes you want to average over to np.mean:
avg = np.mean(tiles, axis=(2, 3))
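For example, here is a small stand-in array with the same layout as yours (the sizes are made up for illustration, a 2x3 grid of 4x4 tiles instead of 138x84 and 100x100):

```python
import numpy as np

# Stand-in for the tiles array: a 2x3 grid of 4x4-pixel tiles,
# where each pixel is a length-4 vector (e.g. RGBA).
tiles = np.random.rand(2, 3, 4, 4, 4)

# Average over the two per-tile pixel axes (axes 2 and 3),
# leaving the grid axes and the channel axis untouched.
avg = np.mean(tiles, axis=(2, 3))

print(avg.shape)  # (2, 3, 4): one length-4 mean vector per tile
```

Note that this produces a new array of per-tile means (shape `(138, 84, 4)` in your case) rather than writing the averages back into every pixel of each tile, as your loop does. If you need the latter, you can broadcast the result back, e.g. `tiles[...] = avg[:, :, None, None, :]`.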
Upvotes: 1