Reputation: 311
Let's say I have a grid that is 4x4 tiles, and each tile is 5x5 pixels. How would I make a 1D array of this grid and set the coordinates of the tiles properly (the coordinates I am specifying are different from the rows and columns of a 2D array)? For example, after the 1D array is created, the tile at index 4 has the coordinates (0, 5) in pixels.
Upvotes: 2
Views: 4343
Reputation: 131
I am an amateur at programming, so forgive me if I have got it wrong.
I think there is an error in the answer above by QBrute. It only works because it considers a square array, i.e. 4 x 4. If a rectangular array is involved (e.g. 6 x 4), it will fail.
I think the line
oneDIndex = (rowSize * rowIndex) + columnIndex
should be
oneDIndex = (columnSize * rowIndex) + columnIndex
(that is, the multiplier has to be the number of columns, i.e. the length of a row, rather than the number of rows).
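A small sketch of what I mean (the variable names are my own), using a 6 x 4 array:
int rows = 6, columns = 4;
int[][] twoDArray = new int[rows][columns];
int[] oneDArray = new int[rows * columns];
for (int rowIndex = 0; rowIndex < rows; rowIndex++) {
    for (int columnIndex = 0; columnIndex < columns; columnIndex++) {
        // The multiplier is the length of a row (the number of columns);
        // using the number of rows would make indices overlap once the array is not square.
        int oneDIndex = (columns * rowIndex) + columnIndex;
        oneDArray[oneDIndex] = twoDArray[rowIndex][columnIndex];
    }
}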
Upvotes: 1
Reputation: 4536
Assume you have a grid with dimensions w and h, where each tile is a square with size n. The first thing you have to consider is what coordinates the upper left corner (or lower left corner, depending on the orientation of your coordinate system) of each tile has. (I will use the upper left corner as (0,0).)
Since the first tile starts at (0,0) and each tile is n pixels wide and n pixels high, the next tiles in the same row will have the coordinates (n,0), (2*n,0), ..., ((w-1)*n, 0). This applies to all the following rows as well. The next row has a y-offset of n pixels, so its coordinates are of the form (0,n), (n,n), (2*n,n) and so forth, until you reach the very last tile with starting coordinates ((w-1)*n, (h-1)*n).
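The same in code (a minimal sketch with my own example values):
int n = 5;            // tile size in pixels
int col = 1, row = 2; // example tile at grid position (1,2)
int pixelX = col * n; // x of the tile's upper left corner == 5
int pixelY = row * n; // y of the tile's upper left corner == 10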
Next, you need to know how to transform a 2D-Array into a 1D-Array. For this, consider how the indices of the 2D and 1D-Array are related:
2D-Array:
0: [0,1,2,3]
1: [0,1,2,3]
2: [0,1,2,3]
3: [0,1,2,3]
In a 1D-Array, the same would look like this:
1D-Array:
[(0,0),(1,0),(2,0),(3,0),(0,1),(1,1),(2,1),(3,1),(0,2),(1,2),(2,2),(3,2),(0,3),(1,3),(2,3),(3,3)]
The first row of the 2D-Array fits into indices 0-3, the next row into indices 4-7, the third row into indices 8-11 and the fourth row into indices 12-15. Each row starts at 4 (the size of a row) times the current rowIndex, and within a row each element is offset by the current columnIndex.
So you can apply the following formula to calculate the corresponding 1D-index (Note: This works only for rectangular 2D-Arrays, where each row has the same amount of elements):
oneDIndex = (rowSize * rowIndex) + columnIndex;
Test it out for (1,1) of the 2D-Array:
oneDIndex = (1 * 4) + 1 // == 5
And as you can see, index 5 of the 1D-Array does contain the value (1,1).
Which means you can translate a 2D-Array into a 1D-Array with the following code:
for (int y = 0; y < twoDArray.length; y++) {
    for (int x = 0; x < twoDArray[y].length; x++) {
        // rowSize * rowIndex + columnIndex; assumes all rows have the same length
        int oneDIndex = (twoDArray[y].length * y) + x;
        oneDArray[oneDIndex] = twoDArray[y][x];
    }
}
All that's left is to combine these two ideas. For this, you need to map the upper left corners of each grid tile to its corresponding 2D-Index and then use the above code snippet to insert that into the 1D-Array:
The mapping can be done with the following code:
gridCoordinate.x = tileCoordinate.x / tileSize; // integer division
gridCoordinate.y = tileCoordinate.y / tileSize;
Considering your specific problem with grid size 4*4 and tile size 5*5, take the tile at coordinates (5,5). Inserting this into the above code snippet gives the grid coordinates (1,1). This is correct, since you can go 5 pixels right and 5 pixels down and you'll be inside the tile at that grid index.
Putting it all together:
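Here is a minimal sketch of the whole process (I use java.awt.Point for convenience and my own variable names; the question doesn't prescribe a coordinate type):
import java.awt.Point;

int w = 4, h = 4; // grid dimensions in tiles
int n = 5;        // tile size in pixels
Point[] tiles = new Point[w * h];
for (int row = 0; row < h; row++) {
    for (int col = 0; col < w; col++) {
        int oneDIndex = (w * row) + col;                // rowSize * rowIndex + columnIndex
        tiles[oneDIndex] = new Point(col * n, row * n); // upper left corner in pixels
    }
}
// tiles[4] now holds (0,5), matching the example from the question.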
In reverse, if you have the 1D-Index and you want the corresponding 2D-Index, you can do the following:
twoDYIndex = oneDIndex / gridSize; // gridSize = number of tiles per row
twoDXIndex = oneDIndex % gridSize;
Let's say you want the element at index 4 in your 1D-Array (as stated in the question):
twoDYIndex = 4 / 4 // == 1
twoDXIndex = 4 % 4 // == 0
So in your 2D-Array this will be element (0,1) in your grid, which is at pixel coordinates (0,5), as desired.
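And as a quick sketch of the reverse direction (again with my own variable names), going from a 1D index back to pixel coordinates:
int gridSize = 4; // tiles per row
int tileSize = 5; // pixels per tile
int oneDIndex = 4;
int twoDYIndex = oneDIndex / gridSize; // == 1
int twoDXIndex = oneDIndex % gridSize; // == 0
int pixelX = twoDXIndex * tileSize;    // == 0
int pixelY = twoDYIndex * tileSize;    // == 5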
Upvotes: 2