Touch Samples

This article introduces two ways your apps can use touch on Windows 8*. Windows 8 Desktop apps get touch input from the backward-compatible Windows 7 WM_GESTURE messages. Windows Store apps, on the other hand, use touch events from the WinRT interface. Review these samples to help decide which kind of touch input is right for your app.

Introducing the Touch Samples

Touch provides a fun and natural way for your customers to interact with your app. Touch interfaces are an integral part of tablets with Intel® Atom™ processors and have become more common on Ultrabook™ devices. Let’s look at some of the ways you can add touch support to your Windows 8 app today.

Here are two simple apps that will help you gain familiarity with touch. The Touch Desktop sample shows the easiest way to use touch in a Windows 8 Desktop app (one alternative is discussed below). It uses the Windows 7 Touch API and WM_GESTURE. The Touch sample for Windows Store apps uses the WinRT interface to implement touch support in a new Windows 8 UI app.

Which approach should you use? If you're interested in simple touch support for Windows 8 Desktop, or you want backward compatibility with Windows 7, then check out the Touch Desktop sample. If you wish to support the new Windows 8 UI, take a look at the Touch sample for Windows Store apps. Both are built in C++. The Touch Desktop sample uses Visual Studio* 2010. The Windows Store app sample uses Visual Studio 2012, currently the only development environment you can use to create Windows Store apps.

The samples are similar. Each app draws a basic rectangle on the screen and lets you use touch gestures to manipulate that rectangle.

Figure 1: Sample app with touch

The code that updates the rectangle based on touch events is similar, whether the sample got its input via the Windows 7 Touch API or the WinRT interface. There are enough differences in how the touch events are received, however, that we built these as two complete standalone samples to avoid confusion.

Touch Desktop sample

This app uses the Windows 7 Touch APIs. Windows 8 also supports this interface for Windows 8 Desktop apps, so this is a smart option for apps that need touch support on both Windows 7 and Windows 8.

Getting started

To start this sample app, open the solution. Build and run the sample. You’ll see a window containing just a simple rectangle that responds to touch events. Try it out, and see what each kind of touch does to the rectangle. Try tapping it, drag it around, see what happens when you try to rotate it, and so on. The rectangle changes color, border, location, etc. based on your gestures. To see how it does this, go back to the solution and look at the code in SDP_Desktop_Touch.cpp.


During your app’s initialization, it needs to enable all the gestures it will use. The common ones are enabled by default. For example, here’s code to ensure that your app can use rotate gestures with a call to SetGestureConfig:

GESTURECONFIG config = { 0 };
config.dwWant = GC_ROTATE;
config.dwID = GID_ROTATE;
config.dwBlock = 0;

BOOL result = SetGestureConfig(hWnd, 0, 1, &config, sizeof(GESTURECONFIG));

This call is optional if your app uses the default gestures.

Receive gestures

After gestures have been configured, the app is ready to receive gestures. In WndProc, catch all gesture messages and use a single handler for them all:

SDP_rectangle my_rect;
GestureHandler my_ghandler(&my_rect);

	case WM_GESTURE:
		//Handle gesture messages sent to the app
		return my_ghandler.WndProc(hWnd, wParam, lParam);

The set of WM_GESTURE messages includes ZOOM, PAN, ROTATE, TWOFINGERTAP, and PRESSANDTAP. All other touch events will not be recognized as gestures and will arrive as mouse click messages (e.g., WM_LBUTTONDOWN). We've omitted them in this article for simplicity, but you should add mouse click support so your app receives the full set of touch events.

An alternative: When might WM_GESTURE not meet my needs?

WM_GESTURE gives you simple, high-level access to five gestures identified by the system. There may be some latency while the system decides whether a touch event should be delivered to your app as WM_GESTURE or as mouse messages. Your app might need to recognize non-standard gestures, or it may need low-latency access to touch events. If so, you may have better results using WM_TOUCH messages instead. These messages are lower level and give you direct access to each touch event: they supply screen coordinates and a touch ID for each touch point on the screen. With these messages, your app can interpret its own gestures.

Your app may receive only one of the two message types, however. To disable WM_GESTURE and start receiving WM_TOUCH messages, call:

	RegisterTouchWindow(hWnd, 0);
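As a sketch of what "interpreting your own gestures" from raw touch points can look like, here is a platform-independent model. The TouchPoint struct and classify function are illustrative inventions (not part of the Windows API); in a real app the coordinates would come from the TOUCHINPUT records delivered with WM_TOUCH:

```cpp
#include <cmath>

// Illustrative model: a touch point in screen coordinates, as your app
// might copy it out of the records that arrive with WM_TOUCH.
struct TouchPoint { double x, y; };

static double distance(TouchPoint a, TouchPoint b) {
    return std::hypot(a.x - b.x, a.y - b.y);
}

enum TwoFingerGesture { PINCH, SPREAD, NONE };

// Classify a two-finger move by comparing the finger separation at the
// start (a0, b0) and end (a1, b1) of the move.
TwoFingerGesture classify(TouchPoint a0, TouchPoint b0,
                          TouchPoint a1, TouchPoint b1,
                          double threshold = 5.0) {
    double delta = distance(a1, b1) - distance(a0, b0);
    if (delta < -threshold) return PINCH;   // fingers moved together
    if (delta >  threshold) return SPREAD;  // fingers moved apart
    return NONE;
}
```

The same pattern extends to any custom gesture: track each touch ID across messages and compare positions over time.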

Handle gestures

Now that your app is receiving gestures, let’s look at how they’re handled. Open GestureHandler.cpp. All gestures are passed to GestureHandler::WndProc. To handle a gesture here, first call GetGestureInfo:


GESTUREINFO gi;
ZeroMemory(&gi, sizeof(GESTUREINFO));
gi.cbSize = sizeof(GESTUREINFO);

BOOL result = GetGestureInfo((HGESTUREINFO)lParam, &gi);

This call fills in the GESTUREINFO struct. It describes which gesture was made and has more details about the gesture.

Gesture details

Use the gesture ID dwID to identify each gesture, so you can handle them separately. For every gesture, the app will first receive a message with a dwID of GID_BEGIN, followed by a sequence of messages of that gesture’s ID. For example, a pan gesture will arrive in your app as a sequence of messages with a dwID of GID_PAN. When the gesture is complete, your app will receive a message containing a dwID of GID_END. For this simple app, it isn’t necessary to do much with the GID_BEGIN and GID_END messages, but your app may need to do something with these cases. Here, we simply set a flag so you can see they have been handled.
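The message lifecycle described above can be sketched as a small state machine. This is a platform-independent model with stand-in constants for illustration, not the actual Win32 GID_* values:

```cpp
// Stand-ins for the Win32 gesture IDs, for illustration only.
enum GestureId { ID_BEGIN, ID_PAN, ID_END };

// Mirrors the GID_BEGIN ... GID_PAN ... GID_END sequence: a gesture
// becomes active on BEGIN, accumulates updates, and closes on END.
struct GestureTracker {
    bool active  = false;
    int  updates = 0;

    void onMessage(GestureId id) {
        switch (id) {
        case ID_BEGIN: active = true; updates = 0; break;
        case ID_PAN:   if (active) ++updates;      break;
        case ID_END:   active = false;             break;
        }
    }
};
```

In the sample, the GID_BEGIN and GID_END cases do little more than set a flag, as the model shows; a real app might use them to snapshot and commit state.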

Now that you know a gesture has started, you need to know more about it in order to take the next step. There are two general types of gesture messages:

  • The GID_ZOOM, GID_PAN, and GID_ROTATE gestures are made up of sequences of individual messages. They are gestures that usually manipulate an object’s screen coordinates.
  • The GID_TWOFINGERTAP and GID_PRESSANDTAP gestures are discrete messages that arrive once and do not require tracking across the gesture.

Since GID_ZOOM, GID_PAN, and GID_ROTATE usually change the object’s coordinates, you need to keep track of their starting point. As the gesture continues, you can update the object on screen with any changes since the start. When the gesture is done, all changes can be finalized and drawn on the screen one last time.
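The bookkeeping this requires can be sketched as follows. PanTracker is an illustrative model, not a class from the sample:

```cpp
struct Point { int x, y; };

// Remember where the gesture started, apply the difference on each
// update, then store the new location as the next reference point.
struct PanTracker {
    Point last{0, 0};
    int movedX = 0, movedY = 0;

    void begin(Point p)  { last = p; }
    void update(Point p) {
        movedX += p.x - last.x;
        movedY += p.y - last.y;
        last = p;
    }
};
```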

Within a sequence of messages of a single gesture type, dwFlags is set to mark the first and last messages of the sequence. We can check dwFlags to see whether this is the first or last by testing for:

gi.dwFlags & GF_BEGIN


gi.dwFlags & GF_END

Once we know the gesture and its position, we can learn more about it from the ullArguments and ptsLocation fields. For example, when a zoom gesture starts, we capture the starting data and location of the zoom:

	//Capture the first location of the two fingers
	if (gi.dwFlags & GF_BEGIN)
	{
		m_arguments = gi.ullArguments;
		m_first.x = gi.ptsLocation.x;
		m_first.y = gi.ptsLocation.y;
	}

This code captures the starting location of a zoom, even though it’s not used; add your own code to track zoom in your app.

Every time there’s another message in the same zoom gesture, update the screen to show the zoom.

	//To see the object update as you perform the gesture,
	//calculate the new size and redraw it here.
	int height = int (gi.ullArguments - m_arguments);
	int length = int (gi.ullArguments - m_arguments);
	m_rect->changeSize(length, height);

	invalidate(hWnd, FALSE);    //sample's helper that wraps InvalidateRect

	m_arguments = gi.ullArguments;

The ullArguments field indicates how much the zoom has scaled the object, so we resize the rectangle and cause it to be redrawn.
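A minimal model of that computation, assuming (as the sample does) that ullArguments carries the current distance between the fingers:

```cpp
#include <cstdint>

// The change in size is the difference between the current zoom
// argument and the one stored from the previous message. Casting
// through int64 keeps the sign correct when the fingers move closer.
int zoomDelta(std::uint64_t current, std::uint64_t previous) {
    return static_cast<int>(static_cast<std::int64_t>(current - previous));
}
```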

The final zoom message, identified by checking gi.dwFlags & GF_END, has the same scaling code as all the ones before it.

For pan gestures, use the start location and current location of the pan to find how far to move the rectangle:

	if (gi.dwFlags & GF_BEGIN)
	{
		m_arguments = gi.ullArguments;
		m_first.x = gi.ptsLocation.x;
		m_first.y = gi.ptsLocation.y;
	}
	else
	{
		int x = int (gi.ptsLocation.x - m_first.x);
		int y = int (gi.ptsLocation.y - m_first.y);

		m_rect->move(x, y);
		invalidate(hWnd, FALSE);

		m_first.x = gi.ptsLocation.x;
		m_first.y = gi.ptsLocation.y;
	}

Wrapping up

Once the supported gesture messages have been handled, clean up by releasing the gesture information handle:

	CloseGestureInfoHandle((HGESTUREINFO)lParam);
All other message types need to be forwarded on to the default message handler:

DefWindowProc(hWnd, WM_GESTURE, wParam, lParam);

To see full details of how the messages are handled, read the rest of the event handler code.

Windows Store Touch sample

This sample app uses the new Windows 8 UI, and receives touch events from the WinRT interface.

Getting started

To start the sample app, open the solution. Build and run the sample. You’ll see a window containing just a simple rectangle that responds to touch events. Try it out and see what each kind of touch does to the rectangle. Try tapping it, drag it around, see what happens when you try to rotate it, and so on. The rectangle changes color, border, location, etc., like the previous sample did. To see how the sample does this, return to the solution and look at MainPage.xaml in design view.

There's no special initialization code, just a series of event handlers for the significant touch events. The main window doesn't use any touch handlers; only the rectangle does. To see them, click the red rectangle in the designer and look at the list of event handlers for that object.

Figure 2: Touch sample for Windows Store apps in design view

Touch events

You’ll see a few specific events connected to the rectangle object:

  • DoubleTapped
  • ManipulationDelta
  • PointerExited
  • PointerPressed
  • PointerReleased

These don't map exactly to the messages we saw with the Windows 7 WM_GESTURE interface. The events this Windows 8 UI app gets from the WinRT interface give richer control over some touch events than the Windows 7 interface did.

The app receives the PointerPressed event when the screen is touched (if it’s a single tap, it will instead receive a Tapped event). This may be followed by either the PointerExited or PointerReleased event. If the touch is released while your finger is over the rectangle, the app receives PointerReleased. If the touch moves outside the screen coordinates of the rectangle before it is released, the app will receive PointerExited instead.
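That sequence can be modeled with a tiny state machine (illustrative names, not WinRT types):

```cpp
// Stand-ins for the WinRT pointer events, for illustration only.
enum PointerEvent { PRESSED, RELEASED, EXITED };

// A press is followed by either a release over the object
// (the interaction completed) or an exit (it was abandoned).
struct PointerState {
    bool down = false;
    bool completedOnObject = false;

    void on(PointerEvent e) {
        switch (e) {
        case PRESSED:  down = true; completedOnObject = false; break;
        case RELEASED: if (down) completedOnObject = true;
                       down = false;                           break;
        case EXITED:   down = false;                           break;
        }
    }
};
```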

Simple events

The handlers for the simple pointer events don’t do much, but they’re there to give you a reference. For this app, the rectangle color is set in PointerPressed, and the border color is set to match in PointerReleased. Nothing happens in PointerExited. Each pointer event handler receives arguments in a PointerRoutedEventArgs structure. It can use this to find the pointer location. Your app will probably need to examine the PointerRoutedEventArgs structure for more specifics.

The DoubleTapped event is used by this app to double the size of the rectangle. Again, the event receives an argument, in this case DoubleTappedRoutedEventArgs, that you can use for more details about the event.

More complex events

The more complex events, such as Expansion, Rotation, Scale, and Translation, all arrive with the ManipulationDelta event. This event is triggered when the object is manipulated in some way. The event receives a ManipulationDeltaRoutedEventArgs structure in its arguments. We can look at this structure to see what kind of manipulation occurred. The Delta element will have at least one of its Expansion, Rotation, Scale, or Translation elements set; any or all of these may appear in a single event.

Let’s look at the Rotation and Expansion cases. With Rotation, note how much it has rotated:

	if (e->Delta.Rotation != 0)
	{
		brush = safe_cast<SolidColorBrush^>(rect->Fill);
		currentAngle += e->Delta.Rotation;
		if (currentAngle > 360)
			currentAngle -= 360;
		if (currentAngle < 0)
			currentAngle += 360;

		SDP_Rect_Change_Color(currentAngle, brush);
		drag = 0;
	}

Your app may use it for something else, but this app uses the rotation angle from Delta.Rotation to calculate a color for the rectangle. There’s similar code for handling Expansion (adding to the rectangle’s width and height based on the amount in the Expansion value).
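SDP_Rect_Change_Color itself isn't shown in this article; one plausible sketch (purely hypothetical, not the sample's actual implementation) maps the angle onto an RGB color by treating it as a hue:

```cpp
#include <cmath>

struct Rgb { int r, g, b; };

// Hypothetical sketch: treat the accumulated angle (0-360 degrees) as a
// hue and blend between the primary colors around the color wheel.
Rgb colorFromAngle(double angle) {
    double h = angle / 60.0;    // sector 0..5 of the color wheel
    int x = static_cast<int>(
        255 * (1.0 - std::fabs(std::fmod(h, 2.0) - 1.0)));
    switch (static_cast<int>(h) % 6) {
    case 0:  return {255, x, 0};
    case 1:  return {x, 255, 0};
    case 2:  return {0, 255, x};
    case 3:  return {0, x, 255};
    case 4:  return {x, 0, 255};
    default: return {255, 0, x};
    }
}
```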

Each of the different Delta values can occur together; for example, Rotation and Translation can arrive in a single gesture. In this sample, we found that it was sometimes difficult to correctly identify Translation alongside other changes. There was often a small amount of Translation accompanying other changes, which gave an undesirable movement to the rectangle. Because of this, the sample handles Rotation and Expansion individually and ignores any Translation that arrives at the same time. Only Translation deltas that arrive by themselves are used:

		trans_drag->X += e->Delta.Translation.X;
		trans_drag->Y += e->Delta.Translation.Y;

This moves the rectangle directly in the X and Y direction of the Translation.
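The filtering decision the sample makes can be modeled like this (DeltaModel is an illustrative struct mirroring the fields the sample inspects, not the WinRT type):

```cpp
// Mirrors the fields the sample inspects on e->Delta (illustrative).
struct DeltaModel {
    double Rotation, Expansion, TranslationX, TranslationY;
};

// Apply a translation only when it arrives by itself; any accompanying
// rotation or expansion means we ignore it to avoid drift.
bool shouldTranslate(const DeltaModel& d) {
    return d.Rotation == 0 && d.Expansion == 0 &&
           (d.TranslationX != 0 || d.TranslationY != 0);
}
```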

Take a look at how the event handlers use their inputs and look at the set of other events your app might need to handle.


In these samples, we’ve looked at the WM_GESTURE messages for Windows 7 and WinRT for the new Windows 8 UI. These interfaces let you handle touch events in your Windows 8 apps, and we’ve studied how the sample code responds to typical touch interaction. We hope the samples have been useful for you. Now, you should go add touch support to your apps!



About the Author

Paul Lindberg is a Senior Software Engineer in Developer Relations at Intel. He helps game developers all over the world to ship kick-ass games for the PC.



Paul Lindberg (Intel):

When your app uses the WM_GESTURE messages, a swipe will arrive as a sequence of GID_PAN messages. Look at the way the sample code handles GID_PAN and translation.

Vishwas D.:

The explanation was really helpful, thanks! But there is no example of how to use swipe events, which I suppose are also part of the gesture events. Can someone please help with finding examples of swipe events?

Paul Lindberg (Intel):

I'm delighted that you like it! Thanks for the update on VS2012 Express.

Michael S.:

Thanks. This is the cleanest, most straightforward explanation of a simple touch app that I've ever seen.
Also, I had only one minor bump getting it to work in VS2012 Express: it defaulted to Remote Machine, so I had to change it to Simulator.
