Reputation: 1523
Let's say I have some DataArray:
import numpy as np
import xarray as xr

da = xr.DataArray(
    data=np.random.random((25, 25)),
    dims=["x", "y"],
    coords=dict(
        x=np.arange(25),
        y=np.arange(25),
    ),
)
I want to downsample this array by averaging over 5x5 chunks. I can do this with the coarsen method:
da_coarse = da.coarsen(x=5, y=5).mean()
As I understand it, this basically splits the DataArray into 5x5 "chunks" and averages each chunk down to one value. However, what I want is a weighted average of each 5x5 group, so that the center points are weighted more heavily in the final mean than the edge points.
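For example (using the da defined above), the coarsened result has one value per 5x5 block:
print(da_coarse.shape)  # (5, 5)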
I can create a Gaussian kernel with weights like this:
import scipy.stats as st

def gkern(kernlen=21, sig=3):
    """Returns a normalized 2D Gaussian kernel."""
    x = np.linspace(-(kernlen / 2) / sig, (kernlen / 2) / sig, kernlen + 1)
    kern1d = np.diff(st.norm.cdf(x))   # probability mass in each of kernlen bins
    kern2d = np.outer(kern1d, kern1d)  # outer product gives the 2D kernel
    return kern2d / kern2d.sum()       # normalize so the weights sum to 1

window = gkern(5)
Where window is now a 5x5 array with the desired weights for each point. However, I am unsure how to apply this window/kernel when doing the averaging in the coarsen step.
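For reference, a quick check of the kernel produced above:
print(window.shape)                   # (5, 5)
print(round(float(window.sum()), 6))  # 1.0 -- the weights are normalized
print(window[2, 2] > window[0, 0])    # True -- center weighted more than corners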
What is the best way to do this?
Upvotes: 1
Views: 773
Reputation: 2097
One way to do this is through DataArrayCoarsen.construct, which makes it easier to operate on the individual windows:
weights = xr.DataArray(gkern(5), dims=["x_window", "y_window"])

# Reshape into 5x5 windows along new "window" dimensions
windowed_da = da.coarsen(x=5, y=5).construct(
    x=("x_coarse", "x_window"),
    y=("y_coarse", "y_window"),
)

# Weighted average over each window (the kernel already sums to 1)
coarsened = (weights * windowed_da).sum(["x_window", "y_window"]) / weights.sum()
windowed_da is the original DataArray, but reshaped into individual windows of the size specified in the coarsen step.
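As a rough sanity check, you can inspect the window dimensions and confirm that uniform weights reproduce the plain coarsen mean (this snippet just illustrates the idea, reusing the objects defined above):
print(dict(windowed_da.sizes))
# x_coarse, x_window, y_coarse, y_window are each of size 5

# With uniform weights, the weighted sum reduces to the ordinary mean
uniform = xr.ones_like(weights)
plain = (uniform * windowed_da).sum(["x_window", "y_window"]) / uniform.sum()
print(np.allclose(plain, da.coarsen(x=5, y=5).mean()))  # True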
Upvotes: 1