Reputation: 2148
I'm building a "navigation type" app for Android.
For the navigation part I'm building an Activity where the user can move and zoom the map (which is a bitmap) using touch events, and the map also rotates around the center of the screen using the compass.
I'm using a Matrix to scale, translate and rotate the image, and then I draw it to the canvas.
Here is the code called on loading of the view, to center the image in the screen:
image = new Matrix();
image.setScale(zoom, zoom);

// Center of the bitmap, in bitmap coordinates
image_center = new PointF(bmp.getWidth() / 2, bmp.getHeight() / 2);
float centerScaledWidth = image_center.x * zoom;
float centerScaledHeight = image_center.y * zoom;

// Shift the scaled bitmap so its center lands on the screen center
image.postTranslate(
        screen_center.x - centerScaledWidth,
        screen_center.y - centerScaledHeight);
The rotation of the image is done using the postRotate method.
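Roughly, that part looks like this (a sketch; compass_angle and previous_angle are just illustrative names for the values coming from the sensor listener):

// Rotate the map by the change in compass bearing, around the screen center
image.postRotate(compass_angle - previous_angle, screen_center.x, screen_center.y);
previous_angle = compass_angle;
invalidate(); // redraw with the updated matrix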
Then in the onDraw() method I only call
canvas.drawBitmap(bmp, image, drawPaint);
The problem is that, when the user touches the screen, I want to get the corresponding point on the image, but apparently I can't get the correct position. I tried to invert the image matrix and map the touched points through it, but it isn't working.
Does anybody know how to translate the point coordinates?
EDIT
I'm using this code for the translation. dx and dy are translation values obtained from the onTouch listener. *new_center* is an array of float values of the form {x0, y0, x1, y1, ...}.
Matrix translated = new Matrix();
Matrix inverted = new Matrix();
translated.set(image);
translated.postTranslate(dx, dy);
translated.invert(inverted);
inverted.mapPoints(new_center);
translated.mapPoints(new_center);
Log.i("new_center", new_center[0]+" "+new_center[1]);
Actually I tried using *new_center = {0, 0}*:
Applying only the translated matrix, I get, as expected, the distance between the (0,0) point of the bmp and the (0,0) point of the screen, but it doesn't seem to take the rotation into account.
Applying the inverted matrix to the points, I get these results while moving the image in every possible way:
12-26 13:26:08.481: I/new_center(11537): 1.9073486E-6 -1.4901161E-7
12-26 13:26:08.581: I/new_center(11537): 0.0 -3.874302E-7
12-26 13:26:08.631: I/new_center(11537): 1.9073486E-6 1.2516975E-6
12-26 13:26:08.781: I/new_center(11537): -1.9073486E-6 -5.364418E-7
12-26 13:26:08.951: I/new_center(11537): 0.0 2.682209E-7
12-26 13:26:09.093: I/new_center(11537): 0.0 7.003546E-7
Instead, I was expecting the coordinates translated onto the image.
Is my line of thought correct?
Upvotes: 2
Views: 12195
Reputation: 2148
Ok, I got it.
First I separated the rotation from the translation and zooming of the image.
Because I created a custom ImageView, this was simple: I apply the rotation to the canvas of the ImageView, and the other transformations to the matrix of the image.
I keep track of the canvas's matrix through a global Matrix variable.
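In onDraw() this ends up roughly like the sketch below (compass_angle is assumed to hold the current bearing, and the Matrix field named canvas is the global variable I mentioned, not the Canvas object):

private Matrix canvas = new Matrix(); // mirrors the rotation applied to the real canvas
private float compass_angle;          // current bearing, updated by the sensor listener

@Override
protected void onDraw(Canvas screen) {
    screen.save();
    // Rotate the whole canvas around the screen center
    screen.rotate(-compass_angle, screen_center.x, screen_center.y);
    // Keep the same rotation in the global Matrix so touch points can be rotated back
    canvas.setRotate(-compass_angle, screen_center.x, screen_center.y);
    // Zoom and translation stay in the image matrix
    screen.drawBitmap(bmp, image, drawPaint);
    screen.restore();
}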
Some code:
To apply the correct movement for each onTouch event, first I "rotate back" the points passed by onTouch (the start and stop points) using the inverse of the canvas matrix.
Then I calculate the differences in x and y and apply them to the image matrix.
float[] movement = {start.x, start.y, stop.x, stop.y};

// "canvas" is the global Matrix that mirrors the canvas rotation
Matrix c_t = new Matrix();
canvas.invert(c_t);
c_t.mapPoints(movement); // rotate the touch points back into the unrotated frame

float dx = movement[2] - movement[0];
float dy = movement[3] - movement[1];
image.postTranslate(dx, dy);
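For reference, start and stop come from a touch listener along these lines (just a sketch; moveImage() is a hypothetical helper wrapping the snippet above):

@Override
public boolean onTouchEvent(MotionEvent event) {
    switch (event.getAction()) {
        case MotionEvent.ACTION_DOWN:
            start = new PointF(event.getX(), event.getY());
            return true;
        case MotionEvent.ACTION_MOVE:
            PointF stop = new PointF(event.getX(), event.getY());
            moveImage(start, stop); // runs the translation snippet above
            start = stop;           // the next delta is measured from here
            invalidate();
            return true;
    }
    return super.onTouchEvent(event);
}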
If instead you want to check that the image movement doesn't exceed the image size, put this code before the image.postTranslate(dx, dy):
float[] new_center = {screen_center.x, screen_center.y};

Matrix copy = new Matrix();
copy.set(image);
copy.postTranslate(dx, dy);

// Map the screen center back into bitmap coordinates
Matrix translated = new Matrix();
copy.invert(translated);
translated.mapPoints(new_center);

// Accept the movement only if the screen center still falls inside the bitmap
if ((new_center[0] > 0) && (new_center[0] < bmp.getWidth()) &&
        (new_center[1] > 0) && (new_center[1] < bmp.getHeight())) {
    // you can remove the image.postTranslate and copy the "copy" matrix instead
    image.set(copy);
    ...
It's important to note that:
A) The rotation center of the image is the center of the screen, so its coordinates don't change during the canvas rotation.
B) You can use the coordinates of the center of the screen to get the rotation center of the image.
With this method you can also convert every touch event to image coordinates.
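For example, to get the bitmap coordinates of a touch you can chain the two inverses (a sketch; touchX and touchY stand for the raw event coordinates):

float[] point = {touchX, touchY};

// 1. Undo the canvas rotation ("canvas" is the global Matrix from above)
Matrix invCanvas = new Matrix();
canvas.invert(invCanvas);
invCanvas.mapPoints(point);

// 2. Undo the zoom and translation held in the image matrix
Matrix invImage = new Matrix();
image.invert(invImage);
invImage.mapPoints(point);

// point[0], point[1] are now coordinates on the bitmap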
Upvotes: 3