Touch and Gesture (Compact 2013)

3/26/2014

This developer guide provides an overview of Windows Embedded Compact 2013 touch and gesture features, including touch driver architecture, gesture architecture, and how to program an application to respond to gestures.

Important terms used in this guide include:

  • Touch. Touching a touch panel with one or more fingers (or a stylus). The user may subsequently maintain contact with the panel while moving the contact point or points.
  • Gesture. A sequence of touch, movement, and touch-release events. Some gestures match predefined gesture patterns and can trigger actions in software applications. For instance, a one-finger pan gesture is a pattern in which the user touches the screen with one finger and maintains contact while moving the contact point. An application might move or select an object in response to a pan gesture. If the movement is rapid, however, the gesture may fit a different pattern, such as the flick gesture.
  • Gesture recognizer. Software that monitors and examines touch data for gestures that match certain gesture patterns, and reports them to the Graphics, Windowing, and Events Subsystem (GWES). GWES then sends messages to applications to notify them of gesture events, as shown in the sketch after this list. Compact 2013 provides a built-in gesture recognizer, and an OEM can create custom recognizers. For information about the Compact 2013 implementation of gesture, see Gesture Architecture.
  • Gesture handler. Software that monitors gesture events for one or more applications and responds to them. Compact 2013 provides a default gesture handler that monitors for flick gestures and causes applications to scroll.
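
Applications receive gesture notifications as window messages. The following fragment is a minimal sketch of a window procedure that inspects those messages, assuming the WM_GESTURE message, GESTUREINFO structure, and GetGestureInfo function behave as in earlier Windows Embedded Compact releases; the exact gesture IDs and structure members available depend on the OS design, so treat this as illustrative rather than definitive.

#include <windows.h>

LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    switch (msg)
    {
    case WM_GESTURE:
    {
        // lParam carries a handle to the gesture information; GetGestureInfo
        // fills in the gesture ID and the screen location of the gesture.
        GESTUREINFO gi = { 0 };
        gi.cbSize = sizeof(gi);

        if (GetGestureInfo((HGESTUREINFO)lParam, &gi))
        {
            switch (gi.dwID)
            {
            case GID_PAN:
                // One-finger pan: gi.ptsLocation tracks the moving contact
                // point; an application might drag or select content here.
                break;

            case GID_SCROLL:
                // Flick: the default gesture handler turns this into
                // scrolling, but an application can respond to it directly.
                break;

            default:
                break;
            }
        }

        // Forward the message so that default processing (including the
        // default gesture handler and gesture-handle cleanup) still occurs
        // for gestures the application does not consume.
        return DefWindowProc(hwnd, msg, wParam, lParam);
    }

    case WM_DESTROY:
        PostQuitMessage(0);
        return 0;

    default:
        return DefWindowProc(hwnd, msg, wParam, lParam);
    }
}

The sketch forwards every WM_GESTURE message to DefWindowProc so that gestures the application ignores still receive default handling, such as scrolling in response to a flick.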

For more information about touch and gesture, see Compact 7 Multi-Touch Driver and Gesture features on Microsoft Showcase.

In This Section

  • Gesture Architecture
    Describes the software architecture of the gesture subsystem, including the core, the recognizers, and the default handler.

See Also

Concepts

Application Development