Reputation: 15886
Interpolation on the CPU
I would like to be able to do interpolation on the CPU.
Consider the following example:
X1 | X2
-------
X3 | X4
X1, X2, X3 and X4 are all pixels (in Vector4 format), laid out at the corners of a unit square: X1 at {X: 0, Y: 0}, X2 at {X: 1, Y: 0}, X3 at {X: 0, Y: 1} and X4 at {X: 1, Y: 1}.
Now, I want to be able to do interpolation between the pixels given a Vector2.
So let's say I want the color of the coordinates {X: 0.348, Y: 0.129}.
How would I interpolate properly between the pixels?
Why I want to do this
I know this sounds crazy, but it's what I want to do. I am trying to simulate a GPU shader algorithm on the CPU. It's for a Perlin Noise generator. I've already got the 2D terrain generated through a pixel shader that way in the game, and I want to be able to (at a given X, Y and Z coordinate) check if there's a wall present there or not, using the exact same algorithm on the CPU.
If you want to know more about what I am trying to do, see this question: https://gamedev.stackexchange.com/questions/15667/perlin-noise-copying-the-algorithm-on-the-cpu
Edit
I posted this in the wrong place. It should have been on gamedev.stackexchange.com. I hope you can answer it anyway.
Upvotes: 4
Views: 686
Reputation: 27215
What you are looking for is bilinear interpolation (Wikipedia).
The XNA Lerp functions do linear interpolation - that is: along one axis. Bilinear interpolation works on two axes. Fortunately this is simply a matter of linearly interpolating on one axis (twice: once for each pair of inputs on that axis), and then linearly interpolating between the two results.
So you'd do something like this:
Vector2 position = new Vector2(0.348f, 0.129f); // <- example data from question

// Interpolate along X on each row, then along Y between the two rows:
Vector4 top    = Vector4.Lerp(X1, X2, position.X); // top edge (X1..X2)
Vector4 bottom = Vector4.Lerp(X3, X4, position.X); // bottom edge (X3..X4)
Vector4 result = Vector4.Lerp(top, bottom, position.Y);
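If it helps to see the math outside of XNA, here is a minimal language-agnostic sketch in Python of the same idea (scalar values standing in for one channel of a Vector4; `lerp` and `bilerp` are hypothetical helper names, not library functions):

```python
def lerp(a, b, t):
    """Linear interpolation: returns a when t == 0, b when t == 1."""
    return a + (b - a) * t

def bilerp(x1, x2, x3, x4, u, v):
    """Bilinear interpolation over the unit square:
    x1--x2 is the top row, x3--x4 the bottom row."""
    top = lerp(x1, x2, u)        # interpolate along X on the top edge
    bottom = lerp(x3, x4, u)     # interpolate along X on the bottom edge
    return lerp(top, bottom, v)  # interpolate the two results along Y

# Example with the coordinates from the question:
print(bilerp(0.0, 1.0, 0.0, 1.0, 0.348, 0.129))  # -> 0.348
```

For an actual Vector4 color you would apply this per channel, which is exactly what `Vector4.Lerp` does component-wise.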
Upvotes: 6