Jim421616

Reputation: 1536

Subtract the average of first and last value of each row from all values in the row

I have a numpy array that looks like this:

77.132  2.075   63.365  74.880
49.851  22.480  19.806  76.053
16.911  8.834   68.536  95.339
0.395   51.219  81.262  61.253
72.176  29.188  91.777  71.458
54.254  14.217  37.334  67.413
44.183  43.401  61.777  51.314
65.040  60.104  80.522  52.165
90.865  31.924  9.046   30.070
11.398  82.868  4.690   62.629

and what I'm trying to do is subtract the average of the first and last values of each row from every value in that row.
I've tried it using for loops but I can't get it working:

import numpy as np

#   Create random arrays to simulate images
np.random.seed(10)
image = 100 * np.random.rand(10, 4)

no_disk_list = []

#for row in image:
#    left, right =   row[0], row[-1]
#    average = (left + right) / 2.0
#    for i in row:
#        no_average = row[i] - average
#        print(average)
#        no_disk_list.append(no_average)

subtracted = np.ones_like(image)
height, width = image.shape
for row in image:
    left, right =   image[0], image[-1]
    average = (left + right) / 2.0
    for element in row:
        subtracted[row, element] = image[row, element] - average

Both of the nested loops give an error:

  File "C:/Users/Jeremy/Dropbox/Astro480/NEOWISE/subtract_disk.py", line 17, in <module>
    no_disk_value = row[i] - disk_value

IndexError: only integers, slices (`:`), ellipsis (`...`), numpy.newaxis (`None`) and integer or boolean arrays are valid indices

for the first loop and

  File "C:/Users/Jeremy/Dropbox/Astro480/NEOWISE/subtract_pixels.py", line 23, in <module>
    print(image[row, element])

IndexError: arrays used as indices must be of integer (or boolean) type

for the second. The questions here, here, and here are of limited use in my situation. Besides, I know that vectorization would be a better way to go, since the image I'll eventually be using has 1.3 million pixels. How can I get the loops working, or even better, vectorize the calculation?

Upvotes: 1

Views: 200

Answers (1)

MSeifert

Reputation: 152725

If I understand the question correctly this will work:

subtracted = np.ones_like(image)
height, width = image.shape
for row_no, row in enumerate(image):   # keep the row number using enumerate
    left, right = row[0], row[-1]      # you need the first and last value of the ROW!
    average = (left + right) / 2.0
    # Also use enumerate in the inner loop
    for col_no, element in enumerate(row):
        subtracted[row_no, col_no] = element - average

You can even use broadcasting ("vectorization") to shorten this considerably:

subtracted = image - (image[:, [0]] + image[:, [-1]]) / 2

Here image[:, [0]] is the first column and image[:, [-1]] is the last column, each kept as a 2D array of shape (rows, 1). Adding them and dividing by 2 gives a 2D array containing the average of each row's first and last values. The final step subtracts this from the image, which works because the (rows, 1) column of averages broadcasts across every column of the image.
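The list index `[0]` (rather than the plain integer `0`) is what keeps the extra dimension, and that dimension is what makes the broadcast work. A small sketch illustrating the shape difference (using a random image of the same size as in the question):

```python
import numpy as np

image = 100 * np.random.rand(10, 4)

# A plain integer index drops the column axis -> shape (10,).
# (10, 4) minus (10,) would fail to broadcast, since 4 != 10.
print(image[:, 0].shape)    # (10,)

# A list index keeps the column axis -> shape (10, 1),
# which broadcasts across all 4 columns of each row.
print(image[:, [0]].shape)  # (10, 1)

subtracted = image - (image[:, [0]] + image[:, [-1]]) / 2
print(subtracted.shape)     # (10, 4)
```

Alternatively, `image[:, 0:1]` (a slice) keeps the dimension the same way.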

Step-by-step:

>>> arr = np.arange(20).reshape(4, 5)
>>> arr
array([[ 0,  1,  2,  3,  4],
       [ 5,  6,  7,  8,  9],
       [10, 11, 12, 13, 14],
       [15, 16, 17, 18, 19]])
>>> arr[:, [0]]  # first column
array([[ 0],
       [ 5],
       [10],
       [15]])
>>> arr[:, [-1]]  # last column
array([[ 4],
       [ 9],
       [14],
       [19]])
>>> (arr[:, [0]] + arr[:, [-1]]) / 2   # average
array([[  2.],
       [  7.],
       [ 12.],
       [ 17.]])
>>> arr - (arr[:, [0]] + arr[:, [-1]]) / 2  # subtracted
array([[-2., -1.,  0.,  1.,  2.],
       [-2., -1.,  0.,  1.,  2.],
       [-2., -1.,  0.,  1.,  2.],
       [-2., -1.,  0.,  1.,  2.]])

Upvotes: 1
