Getting User Input

Exploring Windows Phone Touchscreen Input

Programming a game’s input system really does require a lot of design consideration ahead of time because all we really can use is the touchscreen! Oh, there is an accelerometer that can be used for input, but it is a rare and often niche game that uses the accelerometer to read the phone’s orientation (the angle and position at which it is being held). The touchscreen, for all practical purposes, is treated like mouse input without a visible mouse cursor. Windows programmers will have a slightly harder time adjusting than someone who has been working with the Xbox 360 or another console, which already requires an adjustment in one’s assumptions about user input. Windows games are a cakewalk, with 100-plus keyboard keys, the mouse, and an optional controller! That’s a lot of input! On the Windows Phone, though, all we have is the touchscreen. So we need to make the most of it, being mindful that a user’s finger is rather large compared to the precise input of a mouse cursor.

For the purpose of detecting input for a game, the most obvious method might be to just detect the touch location and convert that to an average coordinate—like mouse input. But the Windows Phone touchscreen is capable of multitouch, not just single-touch input. Although the screen is very small compared to a tablet or PC screen, it is still capable of detecting input from up to four fingers at once. To develop a multitouch game, you will need the actual Windows Phone hardware—either a real phone or an unlocked development model (with no phone service).

Multitouch is a significant feature for the new phone! For our purposes here, however, we will just be concerned with single “tap” input from one finger, simulated with mouse motion in the emulator. We can do a single tap with a mouse click, or a drag operation by touching the screen and moving the finger across the screen. Again, this will have to be done with your mouse for development on the emulator (unless your Windows system uses a touchscreen!).

You will not be able to test multitouch in the emulator without a multitouch screen on your Windows development system or an actual Windows Phone device.

Simulating Touch Input

The key to touch input on a WP7 device in XNA is a class called TouchPanel. The XNA services for mouse, keyboard, and Xbox 360 controller input are not available in a WP7 project. So, for most XNA programmers, TouchPanel will be a new experience. Not to worry; it’s similar to the mouse code if we don’t tap into the multitouch capabilities.
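As a hedged sketch of that similarity (the mouse code is shown only for comparison and will not compile in a WP7 project), polling the touch panel each frame maps closely onto the familiar mouse-polling pattern:

[code]
// Windows/XNA mouse polling (NOT available in a WP7 project; for comparison):
// MouseState mouse = Mouse.GetState();
// if (mouse.LeftButton == ButtonState.Pressed) { /* use mouse.X, mouse.Y */ }

// WP7 equivalent: poll the touch panel each frame instead.
TouchCollection touches = TouchPanel.GetState();
foreach (TouchLocation touch in touches)
{
    // touch.Position is a Vector2, filling the role of mouse.X/mouse.Y
    Vector2 point = touch.Position;
}
[/code]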

The TouchPanel class gives us one very interesting piece of information to examine: MaximumTouchCount, the number of touch inputs that the device can handle at a time.
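In XNA 4.0 this value is reported through the TouchPanelCapabilities structure returned by TouchPanel.GetCapabilities(). A minimal sketch of reading it:

[code]
// Query the device's touch capabilities.
TouchPanelCapabilities caps = TouchPanel.GetCapabilities();
if (caps.IsConnected)
{
    // Number of simultaneous touches the hardware supports
    // (the WP7 spec calls for at least four).
    int maxTouches = caps.MaximumTouchCount;
}
[/code]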

The second class that we need to use for touch input is called TouchCollection. As the name implies, this is a collection that is filled when we poll the TouchPanel while the game is running. Each TouchLocation in the collection has a State property that plays a role similar to MouseState, with the enumerated values Pressed, Moved, and Released (plus an Invalid value).
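A minimal sketch of testing those state values inside Update() follows; the comments describe the usual meaning of each state:

[code]
TouchCollection touches = TouchPanel.GetState();
foreach (TouchLocation touch in touches)
{
    if (touch.State == TouchLocationState.Pressed)
    {
        // the finger made contact this frame
    }
    else if (touch.State == TouchLocationState.Moved)
    {
        // the finger is down and possibly dragging
    }
    else if (touch.State == TouchLocationState.Released)
    {
        // the finger just lifted off the screen
    }
}
[/code]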

The Touch Demo Project, Step by Step

Let’s create a sample project to try out some of this code. I’ll skip the usual new project instructions at this point since the steps should be familiar by now. Just create a new project and add a font so that we can print something on the screen. I’ve used Moire Bold 24 as the font in this example.

  1. Add some needed variables for working with text output. (The code for the touchscreen input system will be added shortly.)
    [code]
    public class Game1 : Microsoft.Xna.Framework.Game
    {
        GraphicsDeviceManager graphics;
        SpriteBatch spriteBatch;
        SpriteFont MoireBold24;
        Vector2 position;
        Vector2 size;
        string text = "Touch Screen Demo";
    [/code]
  2. Initialize the font and variables used in the program.
    [code]
    protected override void LoadContent()
    {
        spriteBatch = new SpriteBatch(GraphicsDevice);
        MoireBold24 = Content.Load<SpriteFont>("MoireBold24");
        size = MoireBold24.MeasureString(text);
        Viewport screen = GraphicsDevice.Viewport;
        position = new Vector2((screen.Width - size.X) / 2,
            (screen.Height - size.Y) / 2);
    }
    [/code]
  3. Write the update code to get touch input.
    [code]
    protected override void Update(GameTime gameTime)
    {
        if (GamePad.GetState(PlayerIndex.One).Buttons.Back ==
                ButtonState.Pressed)
            this.Exit();

        //get state of touch inputs
        TouchCollection touchInput = TouchPanel.GetState();

        //look at all touch points (usually 1)
        foreach (TouchLocation touch in touchInput)
        {
            position = new Vector2(touch.Position.X - size.X / 2,
                touch.Position.Y - size.Y / 2);
        }

        base.Update(gameTime);
    }
    [/code]
  4. Print the message on the screen at the touch coordinates.
    [code]
    protected override void Draw(GameTime gameTime)
    {
        GraphicsDevice.Clear(Color.CornflowerBlue);
        spriteBatch.Begin();
        spriteBatch.DrawString(MoireBold24, text, position, Color.White);
        spriteBatch.End();
        base.Draw(gameTime);
    }
    [/code]

Figure 4.1 shows the output of the program running in the WP7 emulator. Although the static figure doesn’t show movement, the message “Touch Screen Demo” moves on the screen with touch input! On a real Windows Phone device, this would work with finger input on the screen.

The Windows Phone hardware specification calls for at least four simultaneous touchscreen inputs!

FIGURE 4.1 The text message moves based on touch input.

How to Simulate More Inputs

What if you really do have a great design for a multitouch game but have no way to test it (that is, you do not have a WP7 device or a touchscreen for your Windows PC)? One alternative is to develop and test your game with simulated inputs for each finger, and then test each input separately in the emulator. This actually works quite well! Suppose your game requires one input to move a ship left or right on the screen, and another input to fire a weapon. The easy approach is to just leave the ship’s movement alone (presumably at the center of the screen without movement) to test the firing input. When that is working satisfactorily, then you might test left-to-right movement input with random or regular shooting intervals. This is how most developers will approach the problem.
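That testing strategy can be sketched in code. Here, MoveShip() and FireWeapon() are hypothetical stand-ins for your game’s real handlers, and the half-second interval is just an illustrative choice:

[code]
// Inside Update(): isolate one input channel at a time in the emulator.
bool testFiringOnly = true;   // flip to exercise movement instead
fireTimer += (float)gameTime.ElapsedGameTime.TotalSeconds;

if (testFiringOnly)
{
    // ship stays put; fire on a regular interval to stand in for finger two
    if (fireTimer > 0.5f)
    {
        FireWeapon();
        fireTimer = 0f;
    }
}
else
{
    // route the single emulator touch to movement instead
    foreach (TouchLocation touch in TouchPanel.GetState())
        MoveShip(touch.Position.X);
}
[/code]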

Using Gestures on the Touchscreen

A related capability of the TouchPanel class is a gesture interface. This is potentially a really great feature for WP7 games, so don’t pass it up! A gesture is essentially nonverbal communication with your hands. A gesture on a touchscreen might be to flick an object across the screen rather than dragging to the exact location where you want it to go. Instead of manually moving something one direction or another, one might just flick it in the general direction. So, it’s up to the game to interpret the gestures based on the game’s user interface design.

Since gesture input can be used in place of touch input, you might want to just use gesture input instead. If all you need for your game is single-touch input, a tap gesture would work in place of the mouselike touch code seen previously. The main difference between the two methods is that gesture input does not support multitouch.

XNA supports gestures with the same TouchPanel class used for touch input. Here are the contents of the GestureType enumeration:

  • None
  • Tap
  • DoubleTap
  • Hold
  • HorizontalDrag
  • VerticalDrag
  • FreeDrag
  • Pinch
  • Flick
  • DragComplete
  • PinchComplete

Gesture-based input does not support multitouch. Only a single gesture (with one finger) can be used at a time.

To use gesture input, we have to enable it, because gesture input is not automatic. There would be obvious input problems if all gestures were enabled by default: a game that relies on a lot of “drag”-style input to move and interact would behave erratically when drags were misread as flick gestures. Even the simplest of arcade-style or puzzle games will involve some dragging of the finger across the screen for input, and this could be misinterpreted as a flick if the intended movement is too fast. To enable both tap and flick gestures, add this line to the constructor, Game1(), or LoadContent():

[code]
TouchPanel.EnabledGestures = GestureType.Tap | GestureType.Flick;
[/code]

A simple example can be added to the Update() method of the Touch Demo to see how it works. Note that the first if condition is required. Trying to read a gesture when none is available will cause an exception!

[code]
if (TouchPanel.IsGestureAvailable)
{
    GestureSample gesture = TouchPanel.ReadGesture();
    if (gesture.GestureType == GestureType.Tap)
    {
        position = new Vector2(gesture.Position.X - size.X / 2,
            gesture.Position.Y - size.Y / 2);
    }
}
[/code]

Running this code, you might be surprised to find that the tap gesture is more like a mouse click-and-release event. Click-dragging does not produce the gesture! You can see for yourself by commenting out the touch input code in the Touch Demo program, and running just the gesture code instead. If you’re really interested in gesture input, a real WP7 device is a must-have! The emulator just doesn’t satisfy. If you are doing WP7 development for profit, a dev device or an actual Windows Phone device (with carrier subscription) is a good investment.
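Since the demo enables GestureType.Flick alongside Tap, the gesture code can be extended to consume flicks as well. In this hedged sketch, gesture.Delta carries the flick velocity, and the 0.05f scale factor is an arbitrary value chosen only for illustration:

[code]
if (TouchPanel.IsGestureAvailable)
{
    GestureSample gesture = TouchPanel.ReadGesture();
    if (gesture.GestureType == GestureType.Tap)
    {
        position = new Vector2(gesture.Position.X - size.X / 2,
            gesture.Position.Y - size.Y / 2);
    }
    else if (gesture.GestureType == GestureType.Flick)
    {
        // Delta is the flick velocity; nudge the text in that direction.
        position += gesture.Delta * 0.05f;
    }
}
[/code]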

Touch input can be quite satisfying to a game designer after working with the traditional keyboard and mouse, because it offers the potential for very creative forms of input in a game. Although regular multitouch input will be the norm for most Windows Phone games, the ability to use gestures also might prove to be fun.