NoesisGUI

Touch and Manipulation Tutorial

GitHub Tutorial Data

NoesisGUI supports user interfaces that receive multiple touches simultaneously. A touch is a type of user input generated by putting fingers on a touch-sensitive screen. The touch contacts, and their movement, are interpreted as touch gestures and manipulations to support various user interactions.

You can manage touch events in NoesisGUI at four different levels:

  • Native controls, such as ScrollViewer, that implement touch interactions out of the box.
  • Touch events such as TouchDown and TouchMove provide low-level details for each touch contact, including finger motion and the ability to distinguish press and release events.
  • Manipulation events, such as ManipulationStarted, indicate an ongoing interaction. They start firing when the user touches an element and continue until the user lifts their finger(s), or the manipulation is canceled.
  • Gesture events, such as Tapped and DoubleTapped, trigger after an interaction is complete.

Interactions

Native Controls

ScrollViewer defines the PanningMode attached property that enables you to specify whether touch panning is enabled horizontally, vertically, both, or neither. The PanningDeceleration property specifies how quickly the scrolling slows down when the user lifts the finger from the touchscreen. The PanningRatio attached property specifies the ratio of scrolling offset to translate manipulation offset.

PanningMode can be either set directly on a ScrollViewer or used as an attached property. When a control contains a ScrollViewer in its ControlTemplate, PanningMode is used as an attached property to specify the behavior of the ScrollViewer in the ControlTemplate. When you use a ScrollViewer outside of a ControlTemplate, PanningMode is directly set on the ScrollViewer.

The following example creates a ScrollViewer and adds several buttons to it. The example sets PanningMode to Both so that the user can scroll the ScrollViewer horizontally and vertically by using fingers.

<Grid
  xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
  xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml">

    <ScrollViewer HorizontalScrollBarVisibility="Auto" Height="250" Width="125" PanningMode="Both">
      <StackPanel>
        <Button Width="150" Height="80">Push me if you dare</Button>
        <Button Width="150" Height="80">Push me if you dare</Button>
        <Button Width="150" Height="80">Push me if you dare</Button>
        <Button Width="150" Height="80">Push me if you dare</Button>
        <Button Width="150" Height="80">Push me if you dare</Button>
        <Button Width="150" Height="80">Push me if you dare</Button>
        <Button Width="150" Height="80">Push me if you dare</Button>
        <Button Width="150" Height="80">Push me if you dare</Button>
        <Button Width="150" Height="80">Push me if you dare</Button>
        <Button Width="150" Height="80">Push me if you dare</Button>
        <Button Width="150" Height="80">Push me if you dare</Button>
        <Button Width="150" Height="80">Push me if you dare</Button>
      </StackPanel>
    </ScrollViewer>

</Grid>

The next example creates a TextBox and uses PanningMode as an attached property. It sets PanningMode to VerticalOnly.

<Grid
  xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
  xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml">

  <TextBox Height="300" Width="300" Name="textBox1" TextWrapping="Wrap" IsReadOnly="True"
    ScrollViewer.PanningMode="VerticalOnly" ScrollViewer.VerticalScrollBarVisibility="Auto"
    ScrollViewer.HorizontalScrollBarVisibility="Auto">

      Hello world 1 Hello world 2 Hello world 3 Hello world 4 Hello world 5
      Hello world 6 Hello world 7 Hello world 8 Hello world 9 Hello world 10
      Hello world 11 Hello world 12 Hello world 13 Hello world 14 Hello world 15
      Hello world 16 Hello world 17 Hello world 18 Hello world 19 Hello world 20
      Hello world 21 Hello world 22 Hello world 23 Hello world 24 Hello world 25
      Hello world 26 Hello world 27 Hello world 28 Hello world 29 Hello world 30
      Hello world 31 Hello world 32 Hello world 33 Hello world 34 Hello world 35
      Hello world 36 Hello world 37 Hello world 38 Hello world 39 Hello world 40
      Hello world 41 Hello world 42 Hello world 43 Hello world 44 Hello world 45
      Hello world 46 Hello world 47 Hello world 48 Hello world 49 Hello world 50
      Hello world 51 Hello world 52 Hello world 53 Hello world 54 Hello world 55
      Hello world 56 Hello world 57 Hello world 58 Hello world 59 Hello world 60
      Hello world 61 Hello world 62 Hello world 63 Hello world 64 Hello world 65
      Hello world 66 Hello world 67 Hello world 68 Hello world 69 Hello world 70
      Hello world 71 Hello world 72 Hello world 73 Hello world 74 Hello world 75
      Hello world 76 Hello world 77 Hello world 78 Hello world 79 Hello world 80
      Hello world 81 Hello world 82 Hello world 83 Hello world 84 Hello world 85
      Hello world 86 Hello world 87 Hello world 88 Hello world 89 Hello world 90
      Hello world 91 Hello world 92 Hello world 93 Hello world 94 Hello world 95
      Hello world 96 Hello world 97 Hello world 98 Hello world 99 Hello world 100
  </TextBox>

</Grid>

Touch Events

UIElement defines the following events that you can subscribe to so your application can respond to touch:

  • TouchDown: occurs when a finger touches the screen while the finger is over an element.
  • TouchMove: occurs when a finger moves on the screen while the finger is over an element.
  • TouchUp: occurs when a finger is raised off of the screen while the finger is over an element.
  • TouchEnter: occurs when a touch moves from outside to inside the bounds of an element.
  • TouchLeave: occurs when a touch moves from inside to outside the bounds of an element.
  • PreviewTouchDown: preview event for TouchDown.
  • PreviewTouchMove: preview event for TouchMove.
  • PreviewTouchUp: preview event for TouchUp.
  • GotTouchCapture: occurs when a touch is captured to an element.
  • LostTouchCapture: occurs when this element loses a touch capture.

NOTE

In our Application Framework the command-line option '--emulate_touch' can be used to emulate touch input from mouse events.

Note that the coordinates of each touch are given in screen space. The method Visual::PointFromScreen must be used to convert them to local coordinates.

Like keyboard and mouse events, touch events are routed events. The events whose name begins with Preview are tunneling events and the events that begin with Touch are bubbling events. The sequence is as follows:

  • The TouchEnter event occurs one time when the user puts a finger on the element.
  • The TouchDown event occurs one time.
  • The TouchMove event occurs multiple times as the user moves the finger within the element.
  • The TouchUp event occurs one time when the user lifts the finger from the element.
  • The TouchLeave event occurs one time.

When more than one finger is used, the events occur for each finger.
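Because each touch contact raises its own events, a handler that deals with several simultaneous fingers usually keys its state on a per-touch identifier. The sketch below illustrates that bookkeeping with a plain map; the class and method names are hypothetical, not part of the Noesis API:

```cpp
#include <map>

// Illustrative per-finger bookkeeping, as a TouchDown/TouchMove/TouchUp
// handler might do it. Each touch contact carries an id, so simultaneous
// fingers can be tracked independently.
struct TouchPoint { float x, y; };

class TouchTracker
{
public:
    // TouchDown: remember the new contact under its id
    void Down(int id, float x, float y) { _active[id] = {x, y}; }

    // TouchMove: update the position of an already-known contact
    void Move(int id, float x, float y)
    {
        auto it = _active.find(id);
        if (it != _active.end()) it->second = {x, y};
    }

    // TouchUp: the contact is gone
    void Up(int id) { _active.erase(id); }

    int ActiveCount() const { return (int)_active.size(); }

private:
    std::map<int, TouchPoint> _active;
};
```

A gesture recognizer built on top of this would, for example, treat two active contacts as a candidate pinch and a single contact as a candidate pan.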

NOTE

In NoesisGUI, if a touch event is not handled, it is promoted to a mouse event.

Manipulation Events

For cases where an application enables users to manipulate an object, the UIElement class defines manipulation events. Unlike touch events that simply report the position of touch, manipulation events report how the input can be interpreted. There are three types of manipulations: translation, expansion, and rotation. The following list describes how to invoke the three types of manipulations:

  • Put a finger on an object and move the finger across the touchscreen to invoke a translation manipulation. This moves the object.
  • Put two fingers on an object and move the fingers closer together or farther apart from one another to invoke an expansion manipulation. This resizes the object.
  • Put two fingers on an object and rotate the fingers around each other to invoke a rotation manipulation. This rotates the object.

UIElement defines the following manipulation routed events:

  • ManipulationStarting: occurs when the manipulation processor is first created.
  • ManipulationStarted: occurs when the manipulation processor detects touch movement.
  • ManipulationDelta: occurs when the input device changes position during a manipulation.
  • ManipulationInertiaStarting: occurs when the input device loses contact with the UIElement object during a manipulation.
  • ManipulationCompleted: occurs when a manipulation and inertia on the UIElement object is complete.

By default, these manipulation events are not generated. To receive manipulation events on a UIElement, set the IsManipulationEnabled dependency property to true, either in XAML or in code.

NOTE

In comparison with WPF, Manipulation events are sent in parallel to Touch events. Even if a control is already handling touch events in the control logic, that doesn't prevent Manipulation events from firing.

ManipulationStarting

The ManipulationStarting event occurs when the user places a finger on the object. Among other things, this event allows you to set the ManipulationContainer property. In the subsequent events, the position of the manipulation will be relative to this ManipulationContainer. You can also set the Mode property to indicate the types of manipulations that are possible (Translate, Rotate or Scale).

ManipulationStarted

The ManipulationStarted event occurs after ManipulationStarting, once a minimum threshold distance is detected. By default this distance is 10 pixels; it can be changed using IView::SetManipulationDistanceThreshold. This event reports the origin of the manipulation.
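Conceptually, the threshold test compares the distance between the initial touch-down position and the current finger position against the configured value. A minimal sketch of that check (the names here are illustrative, not Noesis API):

```cpp
#include <cmath>

struct Point { float x, y; };

// Hypothetical sketch of the decision that fires ManipulationStarted:
// has the finger moved far enough from where it first touched down?
// The 10px default mirrors IView::SetManipulationDistanceThreshold.
bool PassesDistanceThreshold(const Point& down, const Point& current,
                             float threshold = 10.0f)
{
    float dx = current.x - down.x;
    float dy = current.y - down.y;
    // Euclidean distance from the touch-down point
    return std::sqrt(dx * dx + dy * dy) >= threshold;
}
```

A finger that jitters by less than the threshold therefore never starts a manipulation, which keeps simple taps from being misread as drags.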

ManipulationDelta

The ManipulationDelta event occurs multiple times as a user's fingers move on a touchscreen. The DeltaManipulation property of the ManipulationDeltaEventArgs class reports whether the manipulation is interpreted as movement, expansion, or rotation. This is where you perform most of the work of manipulating an object. Note that ManipulationDelta occurs both before and after the ManipulationInertiaStarting event. The IsInertial property of ManipulationDeltaEventArgs reports whether the ManipulationDelta event occurs during inertia, so you can check that property and perform different actions depending on its value.
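A ManipulationDelta handler typically composes the reported rotation, scale, and translation onto the element's render transform, rotating and scaling around the manipulation origin. The math behind that composition can be sketched with a minimal, self-contained 2D affine transform. This struct is a stand-in for the Noesis MatrixTransform/Transform2f types; the RotateAt/ScaleAt/Translate names mirror that API, but the implementation below is purely illustrative:

```cpp
#include <cmath>

// Minimal 2D affine transform: p' = M * p + t, with M row-major.
struct Affine
{
    float m00 = 1, m01 = 0, m10 = 0, m11 = 1, tx = 0, ty = 0;

    // Apply a translation after the current transform
    void Translate(float x, float y) { tx += x; ty += y; }

    // Scale around center (cx, cy): points at the center stay fixed
    void ScaleAt(float sx, float sy, float cx, float cy)
    {
        m00 *= sx; m01 *= sx;
        m10 *= sy; m11 *= sy;
        tx = sx * (tx - cx) + cx;
        ty = sy * (ty - cy) + cy;
    }

    // Rotate by 'radians' around center (cx, cy)
    void RotateAt(float radians, float cx, float cy)
    {
        float c = std::cos(radians), s = std::sin(radians);
        float n00 = c * m00 - s * m10, n01 = c * m01 - s * m11;
        float n10 = s * m00 + c * m10, n11 = s * m01 + c * m11;
        float ntx = c * (tx - cx) - s * (ty - cy) + cx;
        float nty = s * (tx - cx) + c * (ty - cy) + cy;
        m00 = n00; m01 = n01; m10 = n10; m11 = n11;
        tx = ntx; ty = nty;
    }

    void TransformPoint(float x, float y, float& ox, float& oy) const
    {
        ox = m00 * x + m01 * y + tx;
        oy = m10 * x + m11 * y + ty;
    }
};
```

The RotateAt, ScaleAt, Translate sequence matches the order used by the handlers in the Example section at the end of this tutorial.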

ManipulationInertiaStarting

The ManipulationInertiaStarting event occurs when the user's fingers lose contact with the object. This event enables you to specify the deceleration of the manipulations during inertia. This is so your object can emulate different physical spaces or attributes if you choose. For example, suppose your application has two objects that represent items in the physical world, and one is heavier than the other. You can make the heavier object decelerate faster than the lighter object.
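The deceleration values used in the Example section, such as 100.0f / (1000.0f * 1000.0f), are expressed in device-independent pixels per millisecond squared (i.e. 100 px/s² written in px/ms²). Under constant deceleration, simple kinematics predicts how long and how far a fling will coast, which is a useful back-of-the-envelope check when tuning these values. The helper below is an illustrative sketch, not Noesis API:

```cpp
// For a fling released at velocity v (px/ms) under constant deceleration
// d (px/ms^2): time to stop is v/d, distance covered is v^2 / (2*d).
struct Inertia { float durationMs; float distancePx; };

Inertia PredictInertia(float velocityPxPerMs, float decelerationPxPerMs2)
{
    Inertia r;
    r.durationMs = velocityPxPerMs / decelerationPxPerMs2;
    r.distancePx = velocityPxPerMs * velocityPxPerMs
                   / (2.0f * decelerationPxPerMs2);
    return r;
}
```

For example, a fling released at 1 px/ms against a deceleration of 100.0f / (1000.0f * 1000.0f) coasts for about 10 seconds and travels about 5000 pixels, so the "heavier" object from the paragraph above simply gets a larger deceleration value.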

ManipulationCompleted

The ManipulationCompleted event occurs when the manipulation and any inertia ends. That is, after all the ManipulationDelta events occur, this event signals that the manipulation is complete.

NOTE

You can cancel the manipulation by setting the Cancel property on the event arguments of any manipulation event. When set to true, no further manipulation events are raised.

Gesture events

Static gesture events are triggered after an interaction is complete. Gesture events include Tapped, DoubleTapped, RightTapped, and Holding. All of them are routed events and they can be disabled on specific elements by setting IsTapEnabled, IsDoubleTapEnabled, IsRightTapEnabled, and IsHoldingEnabled to false (true is the default).

These events can still be handled on a parent element even when the corresponding IsXXXEnabled property is false on that parent, provided the event bubbles up from a child source element where IsXXXEnabled is true.

NOTE

Tapped, DoubleTapped, RightTapped and Holding are taken from the UWP equivalents. Not being part of WPF, these events are implemented as extensions.

Tapped

Tapped fires when a finger briefly touches the screen and is then lifted. The general idea is that a Tap interaction on an element invokes the element's primary action in your app.

DoubleTapped

A DoubleTap interaction is simply two Tap interactions that occur in quick succession. If a user interaction also fires DoubleTapped, Tapped will fire first to represent the first tap, but the second tap won't fire an additional Tapped.

RightTapped

RightTapped results from a press that remains in one place for a certain amount of time. A Holding event from the same element always precedes it, but RightTapped won't fire until the touch point is released. If the press is too short and Tapped fires instead of Holding, or if the Hold action ends with HoldingState as Canceled, RightTapped won't fire.

Holding

Tapped, DoubleTapped, and RightTapped events occur only after the touch point is removed. But the initial Holding event occurs while the touch point is still in contact. This event occurs if the touch point remains in approximately the same position for a period of time. The exact timing of what Noesis interprets as a holding action is adjustable by calling IView::SetHoldingTimeThreshold.

Holding events generally occur in pairs. When the action is first interpreted as a Hold action based on no movement for a period of time, Holding fires, with HoldingState value of Started in the HoldingRoutedEventArgs event data. When the Hold action ends, another Holding event fires, this time with HoldingState of either Completed or Canceled.

  • The Hold action ends with HoldingState as Completed if the user doesn't move the finger during the Hold state and then releases the finger that initiated the action. For this case, RightTapped fires just after the second Holding event.
  • The Hold action ends with HoldingState as Canceled if the user does move the finger that initiated the action. If the Hold action ends with HoldingState as Canceled, RightTapped won't fire.

Tapped and Holding are mutually exclusive. If the action passes the time threshold to be considered a Hold action, it's not considered to be a Tap action also.
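The Tap/Hold decision described above can be summarized as a small classifier: the press duration against the holding time threshold decides Tap versus Hold, and movement during the hold decides Completed versus Canceled. This is a simplified sketch (it ignores the tap movement tolerance), and the names are illustrative stand-ins, with 'holdThresholdMs' playing the role of the value configured through IView::SetHoldingTimeThreshold:

```cpp
enum class Gesture { Tap, HoldingCompleted, HoldingCanceled };

// Simplified classification of a single press, per the rules above:
// - released before the hold threshold        -> Tapped
// - held still past the threshold, released   -> Holding Completed (then RightTapped)
// - held past the threshold but finger moved  -> Holding Canceled (no RightTapped)
Gesture Classify(float pressDurationMs, bool movedDuringPress,
                 float holdThresholdMs = 500.0f)
{
    if (pressDurationMs < holdThresholdMs)
        return Gesture::Tap;
    return movedDuringPress ? Gesture::HoldingCanceled
                            : Gesture::HoldingCompleted;
}
```

Note that exactly one branch is taken per press, which is the mutual exclusivity the paragraph above describes: a press never produces both a Tap and a Hold.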

Example

C++

<Grid
  xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
  xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
  x:Class="MyTouch">
  <Canvas x:Name="root">
    <Rectangle Fill="Red" Width="200" Height="200" RenderTransform="1 0 0 1 50 50"
      IsManipulationEnabled="True"/>
    <Rectangle Fill="Blue" Width="200" Height="200" RenderTransform="1 0 0 1 200 300"
      IsManipulationEnabled="True"/>
  </Canvas>
</Grid>
class MyTouch: public Grid
{
public:
    void OnManipulationStarting(const ManipulationStartingEventArgs& e)
    {
        e.mode = ManipulationModes_All;
        e.manipulationContainer = (Visual*)FindName("root");
        e.handled = true;
    }

    void OnManipulationInertiaStarting(const ManipulationInertiaStartingEventArgs& e)
    {
        e.translationBehavior.desiredDeceleration = 100.0f / (1000.0f * 1000.0f);
        e.rotationBehavior.desiredDeceleration = 360.0f / (1000.0f * 1000.0f);
        e.expansionBehavior.desiredDeceleration = 300.0f / (1000.0f * 1000.0f);
        e.handled = true;
    }

    void OnManipulationDelta(const ManipulationDeltaEventArgs& e)
    {
        UIElement* rectangle = (UIElement*)e.source;
        MatrixTransform* tr = (MatrixTransform*)rectangle->GetRenderTransform();
        Transform2f mtx = tr->GetMatrix();

        mtx.RotateAt(e.deltaManipulation.rotation * DegToRad_f, e.manipulationOrigin.x,
            e.manipulationOrigin.y);
        mtx.ScaleAt(e.deltaManipulation.scale, e.deltaManipulation.scale,
            e.manipulationOrigin.x, e.manipulationOrigin.y);
        mtx.Translate(e.deltaManipulation.translation.x, e.deltaManipulation.translation.y);

        tr->SetMatrix(mtx);
        e.handled = true;
    }

private:
    NS_IMPLEMENT_INLINE_REFLECTION_(MyTouch, Grid)
};

Unity

<Grid
  xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
  xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml">
  <Canvas x:Name="root">
    <Rectangle Fill="Red" Width="200" Height="200" RenderTransform="1 0 0 1 50 50"
      IsManipulationEnabled="True"/>
    <Rectangle Fill="Blue" Width="200" Height="200" RenderTransform="1 0 0 1 200 300"
      IsManipulationEnabled="True"/>
  </Canvas>
</Grid>
public class Touch: MonoBehaviour
{
    Grid _root;

    void Start()
    {
        _root = (Grid)GetComponent<NoesisView>().Content;
        _root.ManipulationStarting += this.ManipulationStarting;
        _root.ManipulationInertiaStarting += this.ManipulationInertiaStarting;
        _root.ManipulationDelta += this.ManipulationDelta;
    }

    void ManipulationStarting(object sender, ManipulationStartingEventArgs e)
    {
        e.Mode = Noesis.ManipulationModes.All;
        e.ManipulationContainer = (UIElement)_root.FindName("root");
        e.Handled = true;
    }

    void ManipulationInertiaStarting(object sender, ManipulationInertiaStartingEventArgs e)
    {
        e.TranslationBehavior.DesiredDeceleration = 100.0f / (1000.0f * 1000.0f);
        e.RotationBehavior.DesiredDeceleration = 360.0f / (1000.0f * 1000.0f);
        e.ExpansionBehavior.DesiredDeceleration = 300.0f / (1000.0f * 1000.0f);
        e.Handled = true;
    }

    void ManipulationDelta(object sender, ManipulationDeltaEventArgs e)
    {
        var rectangle = (Rectangle)e.Source;
        var transform = (MatrixTransform)rectangle.RenderTransform;
        var matrix = transform.Matrix;

        float rotation = e.DeltaManipulation.Rotation * Mathf.Deg2Rad;
        float originX = e.ManipulationOrigin.X;
        float originY = e.ManipulationOrigin.Y;
        float scale = e.DeltaManipulation.Scale;
        float translationX = e.DeltaManipulation.Translation.X;
        float translationY = e.DeltaManipulation.Translation.Y;

        matrix.RotateAt(rotation, originX, originY);
        matrix.ScaleAt(scale, scale, originX, originY);
        matrix.Translate(translationX, translationY);

        transform.Matrix = matrix;
        e.Handled = true;
    }
}
© 2017 Noesis Technologies