WPF Touch Basics

In my previous posts, I briefly described the new WPF Touch feature. I was planning to get a more detailed post out but just couldn't find the time. Thankfully, Drake Campbell, one of our devs at Microsoft, sent me the following post on the basics of WPF Touch. It's a nice read, so let's get started.

WPF Touch Basics

Things to know before we start

Understanding how routed events work will provide useful background for this tutorial: https://msdn.microsoft.com/magazine/cc785480.

What is Touch? Essentially, it is your finger acting as the mouse. You can think of Touch as any action involving a single finger. Inertia effects, such as dragging and throwing something, are not part of Touch; those are covered by Manipulations. The Touch events include TouchDown, TouchUp, TouchMove, TouchEnter, and TouchLeave, and there are related properties for checking whether a control has touch focus.


The goal of this tutorial is to create a small sample that illustrates the use of the Touch events by using them to move a rectangle around.

Let's get started

Create a new WPF Application in Visual Studio 2010 Beta 2 or later and name it TouchSample.

Add a new User Control called TouchableThing to the project. In TouchableThing.xaml, replace the Grid with a Canvas; we use a Canvas because its default layout is more desirable here. Then add a Rectangle to the Canvas. See Fig 1.

Fig 1.

<UserControl x:Class="TouchSample.TouchableThing"
             xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
             xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
             xmlns:mc="http://schemas.openxmlformats.org/markup-compatibility/2006"
             xmlns:d="http://schemas.microsoft.com/expression/blend/2008"
             mc:Ignorable="d"
             d:DesignHeight="300" d:DesignWidth="300">
    <Canvas>
        <Rectangle Name="BasicRect"
                   Width="200" Height="200"
                   Fill="Orange" Stroke="Orange" StrokeThickness="1"
                   IsManipulationEnabled="True" />
    </Canvas>
</UserControl>

We give the Rectangle a name so we can access it in the code-behind, and we make it large enough to hit-test with a finger. Notice the last property set on the Rectangle: IsManipulationEnabled. Setting this property to true enables the Rectangle, or any control with the property set to true, to bubble Touch events. Because the Touch events bubble, they do not have to be handled by the control that has the property set to true. The bubbling of the events is not as important when using Touch as it is when using Manipulations.

Now we have to add event listeners for some of the Touch events. I could have done this from XAML, but I chose to do it in the code-behind file; the decision was arbitrary. In TouchableThing.xaml.cs, find the constructor and add event listeners for the TouchDown and TouchMove events. Using tab-completion in the editor will auto-generate names and stubs for you. See Fig 2.

Fig 2.

public partial class TouchableThing : UserControl
{
    private Point lastTouchDownPoint;

    public TouchableThing()
    {
        InitializeComponent();

        this.TouchDown += new EventHandler<TouchEventArgs>(TouchableThing_TouchDown);
        this.TouchMove += new EventHandler<TouchEventArgs>(TouchableThing_TouchMove);
    }

    private void TouchableThing_TouchDown(object sender, TouchEventArgs e)
    {
    }

    private void TouchableThing_TouchMove(object sender, TouchEventArgs e)
    {
    }
}

To make life easier, I added a field for storing the point where the user's finger was last detected touching down. It is also interesting to note that all of the Touch events use the same type of event args, TouchEventArgs.
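As an aside, the same subscriptions could have been made in markup instead of the constructor. A sketch, assuming the handler names generated in Fig 2 (attribute placement on the UserControl element; the rest of the markup is elided):

```xml
<!-- Alternative to wiring the handlers in the constructor: attach them
     directly on the UserControl element in TouchableThing.xaml. -->
<UserControl x:Class="TouchSample.TouchableThing"
             xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
             xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
             TouchDown="TouchableThing_TouchDown"
             TouchMove="TouchableThing_TouchMove">
    <!-- Canvas and Rectangle as in Fig 1 -->
</UserControl>
```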

Now for the fun part. We have to decide how we want to manage the coordinate space between the parent and the Rectangle. We want to move the Rectangle by dragging it with a finger. Setting the location of an element directly is not considered best practice in WPF, so we will use a TranslateTransform instead. We could also use a Matrix, but I think the TranslateTransform is easier; either one is fine. Since we avoid simply moving the object to a new location, we will have to do some point translations. If you have positioned the Rectangle somewhere other than the default location on the Canvas, you will also have to account for that in the value of this.lastTouchDownPoint. Since the default position is (0,0), this is not a problem here. See Fig 3.


Fig 3.


private void TouchableThing_TouchDown(object sender, TouchEventArgs e)
{
    this.lastTouchDownPoint = e.GetTouchPoint(this.BasicRect).Position;
}




So what we have done here is get the touch point with respect to the Rectangle. The GetTouchPoint method performs the needed translations into the coordinate space of the target element. This gives us a point in a coordinate space where the top-left corner of the Rectangle is (0,0); essentially, it is the position of the finger inside the Rectangle. We will take advantage of this in the TouchMove event handler. See Fig 4.
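To make the coordinate translation concrete, here is a small console sketch (illustration only: the positions are invented, and in the real sample GetTouchPoint does this work for you) of what translating into the Rectangle's coordinate space amounts to for an untransformed element:

```csharp
using System;

class CoordinateSketch
{
    static void Main()
    {
        // Hypothetical numbers: the Rectangle's top-left corner sits at
        // (50, 50) in the UserControl's coordinate space...
        double rectLeft = 50, rectTop = 50;

        // ...and the finger lands at (80, 120), also in control coordinates.
        double fingerX = 80, fingerY = 120;

        // Relative to the Rectangle, the same touch is just the difference:
        // the finger is 30px right of and 70px below the top-left corner.
        double inRectX = fingerX - rectLeft;
        double inRectY = fingerY - rectTop;

        Console.WriteLine(inRectX + "," + inRectY); // 30,70
    }
}
```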


Fig 4.

private void TouchableThing_TouchMove(object sender, TouchEventArgs e)
{
    Point currentPoint = e.GetTouchPoint(this).Position;

    double diffX = currentPoint.X - this.lastTouchDownPoint.X;
    double diffY = currentPoint.Y - this.lastTouchDownPoint.Y;

    if (Math.Abs(diffX) > 0 || Math.Abs(diffY) > 0)
    {
        TranslateTransform transform = new TranslateTransform(diffX, diffY);
        this.BasicRect.RenderTransform = transform;
    }
}

This event is similar to the MouseMove event in that it is raised repeatedly until the TouchUp event is raised. The first thing we do is get the current location of the Touch event with respect to the control handling the event. This may not seem intuitive, since we captured the touch point with respect to the Rectangle in the TouchDown handler. The reason it works is that we are simply tracking the change in movement: we want to put the Rectangle exactly where the user's finger is, but we have to account for where the finger went down inside the Rectangle, and that is what the subtraction takes care of. The if statement just checks whether there was any change in the x or y direction before doing any work. This is not the most performant code, but it illustrates how to use the TouchEventArgs at a basic level.
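The arithmetic in the handler is easy to check with concrete numbers (all values here are invented for illustration): if the finger went down 30px/70px inside the Rectangle and has since moved to (200, 150) in control coordinates, the translation offsets come out as follows:

```csharp
using System;

class MoveMathSketch
{
    static void Main()
    {
        // Captured in the TouchDown handler: the finger went down 30px
        // right of and 70px below the Rectangle's top-left corner.
        double lastX = 30, lastY = 70;

        // Reported by the TouchMove event: the finger is now at
        // (200, 150) in the UserControl's coordinate space.
        double currentX = 200, currentY = 150;

        // The translation moves the Rectangle's top-left to (170, 80),
        // which keeps the finger 30px/70px inside the Rectangle.
        double diffX = currentX - lastX;
        double diffY = currentY - lastY;

        Console.WriteLine(diffX + "," + diffY); // 170,80
    }
}
```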


Almost done. All we have to do is add the TouchableThing to MainWindow.xaml. See Fig 5.


Fig 5.

<Window x:Class="TouchSample.MainWindow"
        xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
        xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
        xmlns:custom="clr-namespace:TouchSample"
        Title="MainWindow" Height="350" Width="525">
    <Grid>
        <custom:TouchableThing />
    </Grid>
</Window>

Press F5 and enjoy. Stay tuned for the Manipulation walkthrough, where we will take the same Rectangle and allow it to be scaled, rotated, translated, and given inertia using the Manipulation events.

