Reputation: 43
My goals are quite simple. I want to be able to do multiple transformations of an image in my UWP application, primarily using touch-screen gestures. More precisely, I want to handle each manipulation like this:
Pan (Translate) - One-finger gesture to pan the image around the canvas.
Scale - Pinch to zoom, using the center of the gesture as the zoom center.
Rotate - Standard two-finger gesture, using the center of the gesture as the rotation center.
I have created a very simple POC, which can be found here.
The POC is quite simple and looks like this:
<Page
    x:Class="GesturesManipulations.MainPage"
    xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
    xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
    xmlns:local="using:GesturesManipulations"
    xmlns:d="http://schemas.microsoft.com/expression/blend/2008"
    xmlns:mc="http://schemas.openxmlformats.org/markup-compatibility/2006"
    mc:Ignorable="d">

    <Canvas x:Name="container" Background="Silver">
        <Image
            x:Name="image"
            Stretch="Uniform"
            Canvas.Left="100" Canvas.Top="200"
            Source="Assets/windows10-uwp.jpg"
            ManipulationMode="TranslateX,TranslateY,Scale,Rotate"
            ManipulationDelta="OnManipulationDelta">
            <Image.RenderTransform>
                <TransformGroup>
                    <ScaleTransform x:Name="scaleTransform" />
                    <RotateTransform x:Name="rotateTransform" />
                    <TranslateTransform x:Name="translateTransform" />
                </TransformGroup>
            </Image.RenderTransform>
        </Image>
    </Canvas>
</Page>
Code-behind:
private void OnManipulationDelta(object sender, ManipulationDeltaRoutedEventArgs e)
{
    FrameworkElement origin = sender as FrameworkElement;
    FrameworkElement parent = origin.Parent as FrameworkElement;

    // e.Position is relative to the manipulated element; map it into
    // the parent container's coordinate space to get the gesture center.
    var localCoords = e.Position;
    var relativeTransform = origin.TransformToVisual(parent);
    Point parentContainerCoords = relativeTransform.TransformPoint(localCoords);
    var center = parentContainerCoords;

    // Translate/panning
    translateTransform.X += e.Delta.Translation.X;
    translateTransform.Y += e.Delta.Translation.Y;

    // Rotate around the gesture center
    rotateTransform.CenterX = center.X;
    rotateTransform.CenterY = center.Y;
    rotateTransform.Angle += e.Delta.Rotation;

    // Scale around the gesture center
    scaleTransform.CenterX = center.X;
    scaleTransform.CenterY = center.Y;
    scaleTransform.ScaleX *= e.Delta.Scale;
    scaleTransform.ScaleY *= e.Delta.Scale;
}
Overall the functionality seems to work. But it also seems like the transform center has some problems. When starting a new gesture, the image very often jumps or offsets, and manipulations sometimes use the wrong center.
I have tried a lot of things over the last few days, but I can't seem to get hold of the real problem. Maybe I am just approaching this too simply. Do I have to do something in some of the other manipulation events? I hope someone can help me out.
Thanks in advance.
Upvotes: 3
Views: 1764
Reputation: 330
After struggling a lot with that center rotation problem, I finally found the solution.
According to a Microsoft blog entry, the way to implement this is the following:
void ManipulateMe_ManipulationDelta(object sender, ManipulationDeltaRoutedEventArgs e)
{
    previousTransform.Matrix = transforms.Value;

    // Get center point for rotation
    Point center = previousTransform.TransformPoint(new Point(e.Position.X, e.Position.Y));
    deltaTransform.CenterX = center.X;
    deltaTransform.CenterY = center.Y;

    // Look at the Delta property of the ManipulationDeltaRoutedEventArgs to retrieve
    // the rotation, scale, X, and Y changes
    deltaTransform.Rotation = e.Delta.Rotation;
    deltaTransform.TranslateX = e.Delta.Translation.X;
    deltaTransform.TranslateY = e.Delta.Translation.Y;
}
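Note that this snippet refers to transforms, previousTransform, and deltaTransform, which are not defined above. In the Microsoft sample they come from the element's RenderTransform: a TransformGroup holding a MatrixTransform that accumulates all previous deltas and a CompositeTransform that carries only the current delta. A minimal sketch of that markup (element name and ManipulationMode are illustrative, not from the original post):

```xaml
<Rectangle Name="manipulateMe"
           ManipulationMode="All"
           ManipulationDelta="ManipulateMe_ManipulationDelta">
    <Rectangle.RenderTransform>
        <TransformGroup x:Name="transforms">
            <!-- Accumulated result of all previous manipulation deltas -->
            <MatrixTransform x:Name="previousTransform" />
            <!-- Only the most recent delta; folded into previousTransform
                 on the next ManipulationDelta via transforms.Value -->
            <CompositeTransform x:Name="deltaTransform" />
        </TransformGroup>
    </Rectangle.RenderTransform>
</Rectangle>
```

Reading transforms.Value at the start of each delta collapses both transforms into a single matrix, which is why the handler can reset deltaTransform to just the latest change.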
But the problem is that the e.Position value is NOT relative to the element on which the ManipulationDelta event is raised, but to the element that was touched! So if you have multiple elements inside your element, for example a grid, then e.Position is relative to the child element that was touched. So we first need to transform e.Position into the container element's coordinate space and then transform that to the screen position.
So the correct code should be:
void ManipulateMe_ManipulationDelta(object sender, ManipulationDeltaRoutedEventArgs e)
{
    previousTransform.Matrix = transforms.Value;

    // Get center point for rotation
    var posX = e.Position.X;
    var posY = e.Position.Y;

    // First get the touch position relative to this element
    if (e.Container != null)
    {
        // Transform the touch point from the touched element's
        // coordinate space into this element's coordinate space
        var p = e.Container.TransformToVisual(this).TransformPoint(new Point(posX, posY));
        posX = p.X;
        posY = p.Y;
    }

    Point center = previousTransform.TransformPoint(new Point(posX, posY));
    deltaTransform.CenterX = center.X;
    deltaTransform.CenterY = center.Y;

    // Look at the Delta property of the ManipulationDeltaRoutedEventArgs to retrieve
    // the rotation, scale, X, and Y changes
    deltaTransform.Rotation = e.Delta.Rotation;
    deltaTransform.TranslateX = e.Delta.Translation.X;
    deltaTransform.TranslateY = e.Delta.Translation.Y;
}
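The handler above only applies rotation and translation. Since the question also asks for pinch-to-zoom, the scale delta can be fed into the same CompositeTransform, which exposes ScaleX/ScaleY properties. A hedged sketch of the extra lines (not part of the original answer; Scale must also be enabled in ManipulationMode):

```csharp
// Inside ManipulateMe_ManipulationDelta, after setting CenterX/CenterY:
deltaTransform.Rotation = e.Delta.Rotation;
deltaTransform.ScaleX = e.Delta.Scale;   // pinch factor for this delta only,
deltaTransform.ScaleY = e.Delta.Scale;   // accumulated via previousTransform
deltaTransform.TranslateX = e.Delta.Translation.X;
deltaTransform.TranslateY = e.Delta.Translation.Y;
```

Because previousTransform absorbs the accumulated matrix each time, the scale is assigned (not multiplied) per delta, consistent with how rotation and translation are handled above.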
Upvotes: 0