
Porting Game User Interfaces to Windows 8 Touch Devices


Download Article


Porting Game User Interfaces to Windows 8 Touch Devices.pdf [1.53 MB]

© Copyright 2013 RIVER

Purpose of the document


The purpose of this document is to guide Windows game developers in their transition to Windows 8 on touch-enabled devices, focusing solely on the user interface (UI). With this approach, the document first reminds the reader of a few fundamentals of desktop user interfaces before comparing desktop and touch user interfaces.

A list of Windows 8-specific guidelines is also provided.

1. Basics


1.1 Basic reminders about point-and-click interfaces

The point-and-click paradigm for user interfaces is based on the simple fact that the user can point at a desired target with a pointing device (a mouse, for instance) and interact with it. Available interactions are then primary (left-click) or contextual (right-click). This paradigm can be visualized, for instance, in the design of the mouse pointer (the index-pointing hand). Historically, the transition from mainly keyboard-controlled UIs to mouse-controlled ones created the need for graphical elements that give users control in a more direct and intuitive way. That is where the desktop metaphor and the use of icons, mouse pointers and buttons come from. They all correlate to the real world, which makes them intuitive for users.

1.2 Theory about touch

Likewise, the transition from mouse to touch creates the need for a new UI paradigm, one that puts control directly in the users' fingers. Being able to interact with content directly with your fingers makes you psychologically closer to that content. Therefore the expectations of touch users in terms of interactions are different, and more tangible, than those associated with the prosthetics that are the mouse and the mouse pointer. Consequently, the central point of this transition is reducing the distance between the user and the content of the UI.

Your main goal as a developer for a touch interface should be to provide effortless and direct manipulation of the content. Porting a game from desktop to touch therefore requires adopting a solid strategy regarding content, controls and interaction with the UI.

2. Desktop and touch UI


2.1 Layout and navigation

When it comes to layout and navigation, games range from complex nested and tiered menus, containing the myriad options a game can offer, to dramatically simple designs that require no menus or options at all. The fluid UI of Windows 8 [1] offers designers and developers the possibility of creating a seamless gaming experience.

In designing the UI, designers should choose whether to adopt a hierarchical or a flat approach. The former is an advantage for games whose experience extends beyond the actual gameplay; an example is FPS [2] games, in which, besides the actual gameplay, players spend a considerable amount of time configuring and preparing their characters. The latter offers a smoother solution for games that are primarily about the gameplay experience itself.

Hierarchical Navigation

Most Windows Store games in Windows 8 will use a hierarchical system of navigation. This pattern is common and will be familiar to people, and it is made even better by the Hub navigation pattern. The essence of this pattern is the differentiation of content into three page styles at different levels of detail.

a) Hub pages are the user’s entry point to the app. Content is displayed in order to provide a general overview to the users. Different categories are highlighted, representing each App Section with its content or functionality. The App Hub can show top stories, breaking news, content recommended for the user, and featured elements for all the different categories in one easily pannable surface. Each category group can bubble engaging content up to the hub. Tapping a group’s header enables the user to drill in to a particular section and see more content.

b) Section pages are the second level of an app. Here designers should present the content of each section introduced on the hub in more detail. Each of the elements on this page will have a dedicated Detail page.

c) Detail pages are the third level of an app. Here the designer showcases specific functionality and details of the app. The layout varies from app to app, depending on the number of elements to be shown.

Flat Navigation

This pattern is often seen in games, browsers, or document creation apps, where the user moves between pages, tabs, or modes that all sit on the same hierarchical level. This pattern performs best when the app/game has a limited number of pages to navigate through.

a) The Top App Bar is great for switching between multiple views. Examples include tabs, documents, and messaging or game sessions. This bar, explained also in chapter 2.2, can be triggered by users by swiping down from the top edge of the device.

b) Unlike the hierarchical system, flat navigation does not offer a backwards navigation button. However, users can navigate by using the Top App Bar or by directly swiping the screen horizontally. Additional content or interactions within the app bar can be achieved by the use of a plus button, as extensively explained in Microsoft’s documentation [3].

2.2 Commands and Actions

Designing for Windows 8 touch interfaces requires a focus on simplification and reduction of information clutter. To reach this goal, designers should identify the main user task of each screen and move secondary actions and elements off the canvas and into the chrome. Windows 8 provides several surfaces on which you can place commands and controls: some within the canvas, others in the chrome (App Bar, charms).

Use of the Canvas

Users should be able to complete the core scenarios just by using the canvas, not the chrome. Whenever possible, let users directly manipulate the content on the app’s canvas rather than adding commands that act on the content. For example, in Zynga’s CityVille, one of the main user actions is shopping, so Zynga lets players shop for items directly on the game canvas instead of in a separate shop section.

Use of the Charms

The charms and app contracts are simple and powerful ways to enable common app commands.

It’s important to avoid duplicating app contract functionality (e.g. share) on the app’s canvas or in the App Bar.

  • Search: Let users quickly search through the app’s content from anywhere in the system, including from other apps, and vice versa.
  • Share: Let users share content (like game achievements for instance) from the app with other people or apps, and receive shared content.
  • Devices: Let users enjoy audio, video, or images streamed from the app to other devices in their home network.
  • Settings: Consolidate all of the app settings under one roof and let users configure the app with a common mechanism they are already familiar with.
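As an illustration of how the charms tie into app code, the sketch below registers a handler for the Share charm. This is a minimal, hedged example: it assumes a Windows Store JavaScript/TypeScript app in which the WinRT Windows namespace is available as a global, and the achievement title and text are purely illustrative.

```ts
// Minimal sketch: registering for the Share charm via the DataTransferManager.
declare const Windows: any; // WinRT projection, provided by the Windows Store app host

const dataTransferManager =
    Windows.ApplicationModel.DataTransfer.DataTransferManager.getForCurrentView();

// Called whenever the user opens the Share charm while the game is in the foreground.
dataTransferManager.addEventListener("datarequested", (e: any) => {
    const request = e.request;
    request.data.properties.title = "New achievement unlocked";       // illustrative
    request.data.properties.description = "Share a game achievement"; // illustrative
    request.data.setText("I just cleared level 12 with a gold medal!");
});
```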

Use of the App Bar

The App Bar is used to display on-demand commands that are relevant to the current screen. As an established element of the Windows 8 experience, the App Bar can also be helpful for menus and contextual actions when designing touch games. As an example, an RPG can benefit from the App Bar as a place to collect all the possible user actions related to the current screen.

Use of the Context Menus

Use context menus for clipboard actions (like cut, copy, and paste), or for commands that apply to content that cannot be selected (like an image on a web page). The system provides apps with default context menus for text and hyperlinks. For text, the default context menu shows the clipboard commands. For hyperlinks, the default menu shows commands to copy and to open the link.

Target Size

When porting your app or game, it is suggested to differentiate target sizes that are merely touch-enabled from those that are touch-optimized. According to the Windows 8 guidelines, a touch-enabled target is at minimum 23px by 23px, corresponding to a physical size of 5mm, while touch-optimized targets should preferably be 40px by 40px, corresponding to a physical size of 10mm. The overall consideration is that large targets remain usable with a mouse, while small targets are arguably unusable with touch.

Spacing

Together with a correct size strategy, correct spacing should be adopted. According to the Windows 8 guidelines, a minimum of 2mm between targets allows correct use of the UI. This choice is also important for input fields, radio buttons and multiple selection in general.

a) Buttons. The overall definition of what a button is and how it can be interacted with is very different on desktop and on touch. On a desktop interface, buttons usually have a hover state and a selected state. Hover states such as visual changes, sounds or tooltips provide additional information to the user about the purpose of the button and its consequences. Touch-activated buttons don’t have hover states; therefore, designers should define a solution to provide feedback to the users throughout the application/game.

Some examples are sound hints, haptic feedback or subtle animations. As a good design practice, designers should let users disable this feedback easily.

As stated in the Windows 8 guidelines, the UI should reduce the use of buttons and enable gestures for users to interact with the content of the app/game. Chapter 3.3 covers the topic of gestures in more detail.
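To compensate for the missing hover state, a pressed state can be driven by pointer events. The following is a minimal sketch under assumed class names and an optional, user-disableable click sound:

```ts
// Minimal sketch: visual (and optional audio) pressed-state feedback for a touch button.
// The "action-button" class, "is-pressed" class and click sound are illustrative.
const button = document.querySelector<HTMLButtonElement>(".action-button")!;
const clickSound = new Audio("assets/tap.mp3"); // optional; let users disable it in settings

button.addEventListener("pointerdown", () => {
    button.classList.add("is-pressed");                   // immediate visual feedback
    clickSound.play().catch(() => { /* sound may be muted or blocked */ });
});

for (const type of ["pointerup", "pointercancel", "pointerleave"]) {
    button.addEventListener(type, () => button.classList.remove("is-pressed"));
}
```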

b) Dropdowns. On desktop, dropdowns typically replace any stepper/toggle with more than three options in most places. This kind of dropdown works best with keyboard/mouse input; alternative solutions should be used for touch or a controller. In Windows 8, a particular use of dropdowns that goes beyond the traditional one concerns headers.

Header dropdown menus often enable users to jump laterally among categories. For example, consider a user who is reading a sports article and wants to go to the entertainment section of your news app. The user can do that easily by using the drop-down header.

c) Sliders. In most common applications, sliders are used for large or “imprecise” values, such as volume, brightness or sensitivity. In terms of usability, sliders work better with touch than with a keyboard or controller, because the dragging motion they imply maps naturally to a finger. In games, sliders are often adopted for character setup or feature changes. Despite their high user-engagement value, it is always recommended to provide numerical feedback on the changed value.

As an example, Skyrim offers sliders for character creation, but gives no feedback on progress or the selected value; in this scenario it becomes hard to keep track of setups and reproduce them later.

Windows 8 does not give specific guidelines on the matter, besides correct touch-area sizing.
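A minimal sketch of such numerical feedback, assuming an HTML range input and illustrative element IDs:

```ts
// Minimal sketch: a slider that always shows its current numeric value,
// so players can note and reproduce a setup. IDs are illustrative.
const slider = document.querySelector<HTMLInputElement>("#sensitivity")!;        // <input type="range">
const readout = document.querySelector<HTMLOutputElement>("#sensitivity-value")!; // <output>

function showValue(): void {
    readout.textContent = slider.value; // numerical feedback on the changed value
}

slider.addEventListener("input", showValue); // fires continuously while dragging
showValue();                                 // initialize on load
```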

d) Scrollbars. Scrollbars are very often avoided in Windows 8 and are not considered meaningful feedback for the user. Since touch devices offer the possibility of scrolling directly with the fingers, direct feedback “on the fingertip” is already provided. The Windows UX guidelines suggest, instead, providing animated interactions (e.g. elastic scroll) in order to give feedback to users. The same logic can be applied to all Windows 8 elements with dynamic content, such as Tiles.

e) Tab Bars. Tab bars can be radically different on different platforms. Tabs are essentially just buttons when using mouse/keyboard navigation, but with touch the tabs can be changed with swipe gestures. Apps in Windows 8 tend to be very wide, so the user scrolls to the right to see different pages, which fulfils the same role as tabs on the desktop. There is, however, a need for a quick link to a specific section, which can be provided with a dropdown as mentioned earlier in this chapter.

f) Lists. Lists are one of the most common UI elements for displaying large amounts of content, and they are normally the element that needs the most work and polish. Apart from mouse hovers, lists can be mostly the same across all input methods. One thing to keep in mind is that the touch way of scrolling is reversed from mouse/keyboard scrolling.

g) Text. Although it is understandable that each game will have its own predefined fonts, designers are advised to keep the Windows 8 guidelines in mind.

The three main fonts adopted by Windows 8 are:

  • Segoe UI (the primary Windows typeface) for UI elements such as buttons and date pickers. Segoe UI supports Latin, Cyrillic, Greek, Arabic, Hebrew, Armenian, and Georgian alphabets.
  • Calibri for text that the user both reads and writes such as email and chat. Calibri supports Latin, Greek and Cyrillic alphabets.
  • Cambria for larger blocks of text such as for a magazine or RSS reader. Cambria supports Latin, Greek and Cyrillic alphabets.

When choosing a font, tracking (global letter-spacing) in the UI is important to the overall readability of the text, particularly when it appears against dark or complex backgrounds. The Windows 8 guidelines recommend using a proportional unit for tracking (i.e. the em), equal to the type size in points. For example, the width of the em for 11pt type is 11 points.

h) Transient UI (tooltips, flyouts, context menus and message dialogs). Common on desktop, and very often triggered by a right click, context menus are possible on Windows 8 but not recommended. The first choice of placement for contextual commands should be the app bar, but in some cases tap-and-hold context menus will be necessary, for example for text selection.

Because of the absence of a hover trigger on touch devices, tooltips should generally be avoided. Implementing this behaviour is not impossible, but the high probability of unintentional taps can lead to a very frustrating experience for the user. Popups in Windows 8 can be either message dialogs or flyouts. Message dialogs should be used only to display urgent information that disrupts the experience, such as errors or questions.

2.3 Orientation and Views

Postures

When porting and designing games for touch, it is important to consider the touch areas and the way users interact with them. Several studies indicate that 80-90% of the population is right-handed. It is also interesting to observe the Intel study on the use of the Ultrabook [4].

The two most common postures are similar to those used with tablets: holding the device with two hands, or using just the index finger to touch. In both cases, the mapping shows that calls to action along the perimeter are the easiest for users to reach correctly. When it comes to tablets, there are two main scenarios to keep in mind. In the first, both hands are used to interact; this is typical while typing and playing.

In the second, the device is held in one hand and one finger is used to interact; this is more typical when navigating menus.

In both cases a designer should place the most important and crucial features where the user can reach them with ease. In the case of constant use with two hands, main actions should be located in the lower part of the screen, towards the sides. In the case of more frequent use with one hand, the main actions can also be placed in the top part of the screen, more centred. When configuring controls for games, good design practice recommends making the controls customizable by users.

2.4 Feedback and notifications

Feedback

Giving the player proper feedback is important, but harder with touch screen controls than with physical ones. Vibration is one way to signal to the user that their input was registered, but not all devices support vibration. Sound is another way to indicate success or failure, but because the target devices are often portable, the user might be in a public area where sound is hard to hear or would disturb the people around, so the audio could be muted. Visual cues work, but they need to appear around the user’s fingers/hand to be seen.
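The sketch below combines these channels: it vibrates where the platform exposes the Vibration API and otherwise relies on a short visual cue placed near the contact point. Durations, the ripple class and its CSS are assumptions.

```ts
// Minimal sketch: confirm that an input was registered, using vibration where
// available and a visual cue near the finger otherwise. The "tap-ripple" class
// (absolutely positioned, animated in CSS) and the durations are illustrative.
function confirmInput(e: PointerEvent): void {
    if ("vibrate" in navigator) {
        navigator.vibrate(20); // short pulse; ignored on devices without a vibration motor
    }
    // Visual cue positioned around the contact point so the hand does not hide it.
    const ripple = document.createElement("div");
    ripple.className = "tap-ripple";
    ripple.style.left = `${e.clientX}px`;
    ripple.style.top = `${e.clientY - 40}px`; // offset above the fingertip
    document.body.appendChild(ripple);
    setTimeout(() => ripple.remove(), 300);
}

document.addEventListener("pointerdown", confirmInput);
```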

3. Checklist for Testing


The purpose of this section is to give developers and designers a straightforward, practical checklist to consider when reviewing the ported application. The checklist offers heuristics categorized by topic, each with a practical example.

3.1 Accessibility

Navigation

Are menus broad (many items on a menu) rather than deep (many menu levels)?

Consider how the menu “spreads” over the screen and the implications that gestures create (e.g. tap to select one item, multiple items, etc.). It might be necessary to collapse menu items (see chapter 3.4 for more).

Navigation

If the system has multiple menu levels, is there a mechanism that allows users to go back to previous menus?

Consider whether gestures (e.g. swipe) are a viable backward navigation mode.

Navigation

Is the content browsable by gestures?

Semantic Zoom (technique to browse large set of data) and panning make navigation fast and fluid. Instead of putting content in multiple tabs or pages, use large canvases that support panning and Semantic Zoom.

Layout

Are action elements placed on the correct side of the screen?

Most people hold a slate with their left hand and touch it with their right. In general, elements placed on the right side are easier to touch, and putting them on the right prevents occlusion of the main area of the screen.

Although this is a generalized heuristic, it is worth considering when a developer wants to encourage one behavior over another.

Layout

Are interactive elements placed along the bottom corners?

Because slates are most often held along the side, the bottom corners and sides are ideal locations for interactive elements.

Layout

Is content placed in the upper half of the screen?

Content in the top half of the screen is easier to see than content in the bottom half, which is often blocked by the hands or ignored.

Gestures

Is the application facilitating straight line movements?

Fingertip movements are inherently imprecise as a straight-line motion with one or more fingers is difficult due to the curvature of hand joints and the number of joints involved in the motion.

Gestures

Are all the main interactive elements easy to access?

Some areas on the touch surface of a display device can be difficult to reach due to finger posture and the user's grip on the device.

Posture

Is the application mainly used with one hand holding, one hand interacting with light to medium interaction?

Right or bottom edges offer quick interaction.

Lower right corner might be occluded by hand and wrist.

Limited reaching makes touching more accurate.

Reading, browsing, email, and light typing.

Posture

Is the application mainly used with two hands holding, thumbs interacting with light to medium interaction?

Lower left and right corners offer quick interaction.

Anchored thumbs increase touching accuracy.

Anything in the middle of the screen is difficult to reach.

Touching middle of screen requires changing posture.

Reading, browsing, light typing, gaming.

Posture

Is the application mainly used with the device resting on table or legs, two hands interacting with light to heavy interaction?

Bottom of the screen offers quick interaction.

Lower corners might be occluded by hands and wrists.

Reduced need for reaching makes touching more accurate (e.g. when reading, browsing, email, heavy typing)

Posture

Is the application mainly used while the device rests on table or stand, with or without interaction?

Bottom of screen offers quick interaction.

Touching top of the screen occludes content.

Touching top of screen might knock a docked device off balance.

Interaction at a distance reduces readability and accuracy.

Increase target size to improve readability and precision (e.g. when watching a movie, listening to music).

Commands

Do you have specific contextual actions on a page?

Use the app bar to display commands to users on-demand. The app bar shows commands relevant to the user's context, usually the current page, or the current selection.

The app bar is not visible by default. It appears when a user swipes a finger from the top or bottom edge of the screen. The app bar can also appear programmatically on object selection or on right click.

The App Bar is transient, going away after the user taps a command, taps the app canvas, or repeats the swipe gesture. If needed, you can keep the App Bar visible to ease multi-select scenarios.
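The behaviour described above can be driven from code in a WinJS-based app. This is a minimal, hedged sketch: it assumes WinJS is available, an App Bar declared in markup with id "appBar", and a custom "selectionchanged" event fired by the game's own selection logic.

```ts
// Minimal sketch: showing the App Bar programmatically on selection and keeping it
// visible (sticky) for multi-select scenarios.
declare const WinJS: any; // provided by the WinJS library in a Windows Store app

WinJS.UI.processAll().then(() => {
    const appBar = (document.getElementById("appBar") as any).winControl;

    // "selectionchanged" is an illustrative custom event dispatched by the game
    // whenever the player selects an object on the canvas.
    document.addEventListener("selectionchanged", () => {
        appBar.sticky = true; // keep the bar visible to ease multi-select scenarios
        appBar.show();
    });
});
```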

Commands

Do you have clipboard or content actions on a page?

You can use context menus for clipboard actions (like cut, copy, and paste), or for commands that apply to content that cannot be selected (like an image on a web page).

Commands

Are persistent commands placed on the right?

Start by placing default commands on the right side of the app bar. If there are only a few commands, the app bar may end up with commands only on the right.

For example for the Browse commands, the view command set and the filter/sort set are persistent.

Commands

Are edges used correctly?

If there is a larger number of commands, separate distinct command sets on the left or the right to balance out the app bar and to make commands more ergonomically accessible.

For example, you can move the view command set to the left and keep the filter/sort set on the right. Moreover, when a set is active, the related commands appear to the right of that set.

Commands

Are disabled commands shown/hidden?

Commands that are not relevant in certain circumstances should be hidden. When they do appear, they should not disrupt the ordering of persistent commands.

For example, when map view is active the map view commands appear to the right of the view command set.

Commands

Is standard placement for standard commands adopted?

Some commands are common and appear in many apps. To create consistency and instill confidence, these standard placements should simply be followed.

Commands

Is the “New” button given the right positioning?

If your app calls for a "New" command, where any new type of entity is created (add, create, compose), place that command against the right edge of the bar. This gives every "New" command, regardless of the specific app or context, consistent placement and makes it easily accessible with thumbs.

Touch

Is the application using hover states?

Touch uses a two-state model: the touch surface of a display device is either touched (on) or not (off). There is no hover state that can trigger additional visual feedback.

Touch

Are tooltips used instead of hover states?

Show tooltips when finger contact is maintained on an object. This is useful for describing object functionality (drag the fingertip off the object to avoid invoking it).

For small objects, offset tooltips so they are not covered by the fingertip contact area. This is helpful for targeting.
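A minimal sketch of this press-and-hold tooltip with an offset above the contact area; the hold threshold, offset and class/attribute names are illustrative:

```ts
// Minimal sketch: show a tooltip after finger contact is held on an element,
// offset so the fingertip does not cover it.
const HOLD_MS = 400;
let holdTimer: number | undefined;
let tooltip: HTMLElement | null = null;

function showTooltip(target: HTMLElement, x: number, y: number): void {
    tooltip = document.createElement("div");
    tooltip.className = "touch-tooltip";
    tooltip.textContent = target.getAttribute("data-tooltip") ?? "";
    tooltip.style.left = `${x}px`;
    tooltip.style.top = `${y - 60}px`; // offset above the contact area
    document.body.appendChild(tooltip);
}

document.addEventListener("pointerdown", (e) => {
    const target = (e.target as HTMLElement).closest("[data-tooltip]") as HTMLElement | null;
    if (!target) return;
    holdTimer = window.setTimeout(() => showTooltip(target, e.clientX, e.clientY), HOLD_MS);
});

for (const type of ["pointerup", "pointercancel"]) {
    document.addEventListener(type, () => {
        window.clearTimeout(holdTimer);
        tooltip?.remove();
        tooltip = null;
    });
}
```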

Touch

Is the application designed for multi-touch interactions?

Support multi-touch: multiple input points (fingertips) on a touch surface. Manipulations should not be distinguished by the number of fingers used. Interactions should instead support compound manipulations, for example pinching to zoom while dragging the fingers to pan.
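Such a compound manipulation can be implemented directly on top of pointer events. A minimal sketch, assuming an illustrative "#canvas" element whose CSS transform is driven by the gesture:

```ts
// Minimal sketch: pinch to zoom while dragging to pan, via pointer events.
const surface = document.querySelector<HTMLElement>("#canvas")!;
const pointers = new Map<number, { x: number; y: number }>();
let scale = 1, panX = 0, panY = 0;

function metrics() {
    const pts = [...pointers.values()];
    const cx = pts.reduce((s, p) => s + p.x, 0) / pts.length;
    const cy = pts.reduce((s, p) => s + p.y, 0) / pts.length;
    const spread = pts.length < 2 ? 1 : Math.hypot(pts[0].x - pts[1].x, pts[0].y - pts[1].y);
    return { cx, cy, spread };
}

surface.addEventListener("pointerdown", (e) => {
    surface.setPointerCapture(e.pointerId);
    pointers.set(e.pointerId, { x: e.clientX, y: e.clientY });
});

surface.addEventListener("pointermove", (e) => {
    if (!pointers.has(e.pointerId)) return;
    const before = metrics();
    pointers.set(e.pointerId, { x: e.clientX, y: e.clientY });
    const after = metrics();
    panX += after.cx - before.cx;           // pan follows the centroid of the fingers
    panY += after.cy - before.cy;
    scale *= after.spread / before.spread;  // zoom follows the distance between two fingers
    surface.style.transform = `translate(${panX}px, ${panY}px) scale(${scale})`;
});

for (const type of ["pointerup", "pointercancel"]) {
    surface.addEventListener(type, (e) => pointers.delete((e as PointerEvent).pointerId));
}
```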

Occlusion

Can UI elements be covered by fingers?

Make UI elements big enough so that they cannot be completely covered by a fingertip contact area.

Position menus and pop-ups above the contact area whenever possible.

Text/Image

Are you facilitating precise selection?

Where precision is required (for example, text selection), provide selection handles that are offset to improve accuracy.

3.2 Errors and reversibility

Feedback

Is the system offering feedback for touch actions?

Increase user confidence by providing immediate visual feedback whenever the screen is touched.

Interactive elements should react by changing color, changing size, or by moving. Items that are not interactive should show system touch visuals only when the screen is touched.

Reversibility

Are all the actions reversible?

If you pick up a book, you can put it back down where you found it. Touch interactions should behave in a similar way — they should be reversible. Provide visual feedback to indicate what will happen when the user lifts their finger. This will make your app safe to explore using touch.

Warnings

Is sound used to signal an error?

Warnings

Do error messages indicate what action the user needs to take to correct the error?

Warnings

Does the system prevent users from making errors whenever possible?

Warnings

Does the system warn users if they are about to make a potentially serious error?

3.3 Touch Language

Gesture

Is the primary action accessible by tapping?

Tapping on an element invokes its primary action, for instance launching an application or executing a command.

Gesture

Is slide used to pan?

Slide is used primarily for panning interactions but can also be used for moving, drawing, or writing. Slide can also be used to target small, densely packed elements by scrubbing (sliding the finger over related objects such as radio buttons).

Gesture

Is swipe used to select, command, and move?

Sliding the finger a short distance, perpendicular to the panning direction, selects objects in a list or grid (ListView and Grid Layout controls). Display the App Bar with relevant commands when objects are selected.
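A minimal sketch of distinguishing this cross-slide selection from a pan, assuming a vertically panning list with illustrative class names and threshold:

```ts
// Minimal sketch: a short horizontal swipe on an item in a vertically panning
// list toggles its selection; longer vertical movement is left to panning.
const SELECT_THRESHOLD = 30; // px of horizontal travel that counts as a cross-slide
let start: { x: number; y: number; item: HTMLElement } | null = null;

document.addEventListener("pointerdown", (e) => {
    const item = (e.target as HTMLElement).closest(".list-item") as HTMLElement | null;
    if (item) start = { x: e.clientX, y: e.clientY, item };
});

document.addEventListener("pointerup", (e) => {
    if (!start) return;
    const dx = Math.abs(e.clientX - start.x);
    const dy = Math.abs(e.clientY - start.y);
    if (dx > SELECT_THRESHOLD && dx > dy) {
        start.item.classList.toggle("selected"); // select/deselect the item
        // ...then show the App Bar with commands relevant to the selection.
    }
    start = null;
});
```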

Gesture

Is zooming allowed?

While the pinch and stretch gestures are commonly used for resizing, they also enable jumping to the beginning, end, or anywhere within the content with Semantic Zoom. A Semantic Zoom control provides a zoomed out view for showing groups of items and quick ways to dive back into them.

Gesture

Is rotation allowed?

Rotating with two or more fingers causes an object to rotate. Rotate the device itself to rotate the entire screen.

Gesture

Does swiping from the edge show app commands?

In Windows 8, app commands are revealed by swiping from the bottom or top edge of the screen. Use the App Bar to display app commands.

Gesture

Does swiping from the edge show system commands?

In Windows 8, swiping from the right edge of the screen reveals the charms that expose system commands.

Swiping from the left edge cycles through currently running apps.

Sliding from the top edge toward the bottom edge of the screen closes the current app.

Sliding from the top edge down and to the left or right edge snaps the current app to that side of the screen.

General

Are interactions untimed?

Interactions that require compound gestures such as double tap or press and hold need to be performed within a certain amount of time. Avoid timed interactions like these because they are often triggered accidentally and are difficult to time correctly.

UI elements

Is the content able to follow user’s finger?

Elements that can be moved or dragged by a user, such as a canvas or a slider, should follow the user's finger when moving. Buttons and other elements that do not move should return to their default state when the user slides or lifts their finger off the element.

UI elements

Is content always correctly visualized and not covered by fingers?

Especially in the case of moving targets, as above, it is important to keep the content visible at all times and avoid it disappearing “under” the fingers. As an example, a value on a slider that changes when the slider is moved must always be visible to the user. Another example, common in games, is drag-and-drop features: developers should make sure the dragged content can be seen correctly in all cases. The same logic may apply to dropdowns and radio buttons.

3.4 Technicalities

Target size

Is the minimum touch target size taken into account?

7x7 mm (40px) is a good minimum size if touching the wrong target can be corrected in one or two gestures or within five seconds. Padding between targets is just as important as target size. Besides buttons, this logic applies to sliders, dropdowns, scroll bars and all UI elements offering control.

Target size

Do crucial actions have bigger target sizes?

Close, delete, and other actions with severe consequences can’t afford accidental taps. Use 9x9 mm (50px) targets if touching the wrong target requires more than two gestures, five seconds, or a major context change to correct.

Target size

Extreme case

If you find yourself cramming things to fit, it’s okay to use 5x5 mm (30px) targets as long as touching the wrong target can be corrected with one gesture. Using 2 mm of padding between targets is extremely important in this case.
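Because the pixel values above assume a particular screen density, it can help to compute them from the millimetre guidelines at run time. A minimal sketch; the DPI value is an assumption and should come from the platform's display properties rather than being hard-coded:

```ts
// Minimal sketch: convert the millimetre guidelines into pixels for the actual
// screen density. The assumed DPI is illustrative.
function mmToPx(mm: number, dpi: number): number {
    return Math.round((mm / 25.4) * dpi); // 1 inch = 25.4 mm
}

const assumedDpi = 135;              // assumption: typical "standard" density
console.log(mmToPx(7, assumedDpi));  // ≈ 37 px: minimum touch target
console.log(mmToPx(9, assumedDpi));  // ≈ 48 px: targets with severe consequences
console.log(mmToPx(2, assumedDpi));  // ≈ 11 px: minimum padding between targets
```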

Mouse

When a mouse is connected, is the correct UI presented to users?

When a mouse is detected (through move or hover events), show mouse-specific UI. If the mouse doesn't move for a certain amount of time, or if the user initiates a touch interaction, make the mouse UI gradually fade away. This keeps the UI clean and uncluttered.
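A minimal sketch of this behaviour using pointer events to tell mouse from touch; the timeout and class names are illustrative:

```ts
// Minimal sketch: reveal mouse-specific UI on mouse movement, fade it out after
// inactivity or when a touch interaction starts.
const FADE_AFTER_MS = 2000;
const mouseUi = document.querySelector<HTMLElement>(".mouse-ui")!;
let fadeTimer: number | undefined;

document.addEventListener("pointermove", (e) => {
    if (e.pointerType !== "mouse") return;
    mouseUi.classList.add("visible");        // e.g. scrollbars, hover affordances
    window.clearTimeout(fadeTimer);
    fadeTimer = window.setTimeout(() => mouseUi.classList.remove("visible"), FADE_AFTER_MS);
});

document.addEventListener("pointerdown", (e) => {
    if (e.pointerType === "touch") {
        mouseUi.classList.remove("visible"); // touch interaction hides the mouse UI
    }
});
```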

Mouse

Are interactive objects presented as such with mouse events?

Provide visual feedback (or hover effects) for UI elements to indicate interactivity during mouseover events.

Resolution

Does menu content resize with screen size?

The menu can resize automatically, showing more content as space increases; it can scale up proportionally together with its content; or, in the case of dense content at small resolutions, the content area can be made scrollable.

Resolution

Are the “standard” case and the “worst” case kept in consideration?

It is advisable to start from the most common use case and then verify it against the worst-case scenario, especially when it comes to fitting content and showing the correct calls to action.

Resolution

Is the text size correct for both the “standard” case and the “worst” case?

In cases where a small resolution does not offer enough space, it is advisable to collapse or hide some of the text in favour of correct visualization and clear communication.

Scrolling

Are selected states disabled when scrolling?

It might be advisable to make selected states inactive when scrolling in order to avoid accidental selection.

Orientation

Is orientation lockable when both portrait and landscape modes are available?

In games that make heavy use of the gyroscope (e.g. driving games), users should be able to lock the orientation in order to avoid unexpected changes between landscape and portrait.
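One way to expose such a lock from a settings toggle is sketched below, assuming a runtime that implements the Screen Orientation API (screen.orientation.lock/unlock); in a Windows Store app the equivalent mechanism is the display's auto-rotation preferences.

```ts
// Minimal sketch: lock or release the current orientation from a settings toggle.
async function setOrientationLock(locked: boolean): Promise<void> {
    const orientation = screen.orientation as any; // lock() may be missing from older typings
    try {
        if (locked) {
            await orientation.lock(orientation.type); // keep whatever orientation is active now
        } else {
            orientation.unlock();
        }
    } catch {
        // The runtime may refuse (e.g. not full screen); fail gracefully.
    }
}
```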

Content

When the on-screen keyboard is triggered, are input fields accessible?

The screen should be repositioned in order to put focus on the selected input field. In particular, always keeping an eye on the worst-case scenario (e.g. dense content) will give better results.
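A minimal sketch of this repositioning, scrolling the focused field into view once the on-screen keyboard has had a moment to appear (the delay is an assumption):

```ts
// Minimal sketch: keep the focused input visible above the on-screen keyboard.
// "focusin" bubbles, unlike "focus", so one listener on the document is enough.
document.addEventListener("focusin", (e) => {
    const field = e.target as HTMLElement;
    if (field.matches("input, textarea, [contenteditable]")) {
        // Give the keyboard animation a moment, then reposition the content.
        setTimeout(() => field.scrollIntoView({ block: "center", behavior: "smooth" }), 300);
    }
});
```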

Content

Is the app supporting snapped and filled views?

Remember that snapping is simply resizing your app. Snapped and fill views are only available on displays with a horizontal resolution of 1366 relative pixels or greater. Because users can snap every app, you should design your app for the snapped view state. If you don't, the system resizes your app anyway and might crop your content or add scrollbars.
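A hedged sketch of reacting to the snapped view: the "-ms-view-state" media feature is specific to Windows 8 JavaScript apps (an assumption here), and the 320 px width check is a generic fallback; the layout switch itself is illustrative.

```ts
// Minimal sketch: toggle a "snapped" layout when the app is snapped or very narrow.
function applyViewState(): void {
    const snapped =
        window.matchMedia("(-ms-view-state: snapped)").matches ||
        window.innerWidth <= 320;             // snapped view is 320 relative pixels wide
    document.body.classList.toggle("snapped", snapped);
    // e.g. switch from a grid layout to a single vertical list when snapped
}

window.addEventListener("resize", applyViewState);
applyViewState();
```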

3.5 Design for all

Content

Is your app supporting blind or visually impaired users?

Blind or visually impaired users rely on screen readers to help them create and maintain a mental model of your app's UI. Hearing information about the UI, including the names of UI elements, helps users understand the UI content and invoke available functionality.

To support screen reading, your app needs to provide sufficient and correct information about its UI elements, including the name, role, description, state, position, and so on.
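For an HTML/JavaScript game UI, much of this information can be supplied with ARIA attributes. A minimal sketch; the element and labels are illustrative:

```ts
// Minimal sketch: expose name, role and description to screen readers.
const newGameTile = document.querySelector<HTMLElement>("#new-game-tile")!;
newGameTile.setAttribute("role", "button");
newGameTile.setAttribute("aria-label", "Start a new game");
newGameTile.setAttribute("aria-describedby", "new-game-description"); // richer help text elsewhere
newGameTile.tabIndex = 0; // reachable by keyboard, which screen reader users rely on
```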

Content

Is the content of your app supporting visually impaired users?

Visually impaired users need text to be displayed with a high contrast ratio. They also need a UI that looks good in high-contrast mode and scales properly after selecting Make everything on your screen bigger in the Ease of Access control panel. Where color is used to convey information, users with color blindness need color alternatives like text, shapes, and icons.

Input

Is the keyboard accessible?

The keyboard is integral to using a screen reader, and it is also important for users who prefer the keyboard as a more efficient way to interact with an app. An accessible app lets users access all interactive UI elements by keyboard only, enabling users to:

Navigate the app by using the Tab and arrow keys.

Activate UI elements by using the Spacebar and Enter keys.

Access commands and controls by using keyboard shortcuts.
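For example, a custom control that is not a native button element can be made keyboard-operable with a small amount of code; a minimal sketch with an illustrative selector:

```ts
// Minimal sketch: make a custom control usable with the keyboard only.
const control = document.querySelector<HTMLElement>(".custom-control")!;
control.tabIndex = 0; // include the control in the Tab order

control.addEventListener("keydown", (e) => {
    if (e.key === "Enter" || e.key === " ") {
        e.preventDefault(); // stop Space from scrolling the page
        control.click();    // same code path as tap / left-click
    }
});
```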

The On-Screen Keyboard is available for systems that don't include a physical keyboard, or for users whose mobility impairments prevent them from using traditional physical input devices.

Input

Is the on-screen keyboard showing the correct visual feedback?

The on-screen keyboard should always give feedback on pressed/selected states. In addition, use haptic feedback, if implemented, in order to reach the best results.

Input

Is the UI offering a right and left handed switch?

For some applications, supporting both right- and left-handed use should be a requirement. Developers should offer the possibility of switching between the two modes through a toggle or button.
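A minimal sketch of such a handedness switch: a CSS class mirrors the main action controls (for example with flex-direction: row-reverse), and the preference is remembered. The container, class and toggle IDs are illustrative.

```ts
// Minimal sketch: mirror the HUD controls for left-handed players and persist the choice.
function setLeftHandedMode(enabled: boolean): void {
    document.querySelector<HTMLElement>("#hud-controls")!
        .classList.toggle("left-handed", enabled);          // CSS mirrors the layout
    localStorage.setItem("leftHanded", String(enabled));    // remember the preference
}

const toggle = document.querySelector<HTMLInputElement>("#left-handed-toggle")!;
toggle.addEventListener("change", () => setLeftHandedMode(toggle.checked));
setLeftHandedMode(localStorage.getItem("leftHanded") === "true");
```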

3.6 Conversion Table

The following table offers translation standards for all the input types. It should be followed as an example for general interactions, while specific ad-hoc gestures should be crafted according to their own specifications.

Interaction | Touch | Mouse | Keyboard (hardware)
Select (list or grid) | Swipe opposite the panning direction | Right-click | Spacebar
Show app bar | Swipe from top or bottom edge | Right-click | Windows Logo Key+Z, menu key
Context menu | Tap on selected text, press and hold | Right-click | Menu
Launch/activate | Tap | Left-click | Enter
Scrolling short distance | Slide | Scroll-bar, arrow keys, left-click and slide | Arrow keys
Scrolling long distance | Slide (including inertia) | Scroll-bar, mouse wheel, left-click and slide | Page up, Page down
Rearrange (drag) | Slide opposite the scrolling direction past a distance threshold | Left-click and slide | Ctrl-C, Ctrl-V
Zoom | Pinch, stretch | Mouse wheel, Ctrl+mouse wheel, UI command | Ctrl+Plus(+)/Minus(-)
Rotate | Turn | Ctrl+Shift+mouse wheel, UI command | Ctrl+Plus(+)/Minus(-)
Insert cursor/select text | Tap, tap on gripper | Left-click+slide, double-click | Arrow keys, Shift+arrow keys, Ctrl+arrow keys, and so on
More information | Press and hold | Hover (with time threshold) | Move focus rectangle (with time threshold)
Interaction feedback | Touch visualizations | Cursor movement, cursor changes | Focus rectangles
Move focus | N/A | N/A | Arrow keys, Tab

4. Best Practices


This chapter aims to provide examples of games that have been very well ported from a traditional point-and-click UI to a touch gaming experience. Although the games presented may differ from the reader’s own genre, their solutions cover crucial parts of the porting process. The examples span two areas: Extended User Experience, and Gestures and Gameplay. The former focuses on keeping a solid, consistent experience across the different platforms; the latter presents cases in which clever and relevant UI solutions have been adopted.

4.1 Extended User Experience

A key value to consider when porting games to touch devices is to provide a unified and extended user experience. Depending on the game, key elements can serve as a means of unification (e.g. UI, color coding, menus, storytelling). In some cases the gameplay is the same on every platform; in other cases, the gameplay focuses on key features. Some examples covering these different scenarios are presented. FIFA 13 is a good example of an identical experience on every device. The UI matches perfectly on all platforms, both in aesthetics and functionality; the developers did a good job of adapting sizing and screen positioning for smaller screens, granting a unified feel.

Even the gameplay offers exactly the same features, simply translated to touch gestures; the way a player is controlled on iOS very much resembles the PlayStation gamepad: X/Y movements are handled by a digital stick in the left-hand corner of the screen, and the main actions sit in the right-hand corner. It is interesting to observe how touch and small devices forced developers to sharpen the focus on selected “core” functionalities, rather than trying to translate and fit everything. As an example, the skill move, a two-step action on PlayStation (i.e. L2+stick), has been simplified into a single panning action (i.e. moving the special-move stick around while keeping it pressed). In addition, all the special variants of passing and shooting currently available on PlayStation have been removed and consolidated into a single form.

Many other points could be made in this gesture-translation analysis; however, the main point to take from this example is the attention EA’s developers put into finding the core functionalities and focusing on translating only those to touch.

The same concept of a unified experience can also be analysed from another perspective. Game developer CCP offers a unified yet different experience on every device. On PC, the massively multiplayer EVE Online offers a complex set of controls and communication tools necessary to control, master and conquer different parts of a universe (i.e. the EVE universe) by sending troops and aerial attacks.

On PlayStation, players are part of the troops that have been sent to conquer the different planets of that universe. The game thus becomes an FPS (i.e. Dust 514), but one still connected to, and able to interact live with, the players of EVE Online on a desktop device.

The touch experience (i.e. Neocom App) has been focused only on the core of the two previous games: the navigation (i.e. Neocom Menu) from EVE, and the character setup from Dust 514. The result is a companion app, in which players can tweak and prepare their characters for the battle before, during and after the gameplay.

In this case, the experience is spread through the platforms but it focuses on a specific functionality which each platform can serve best (i.e. PC=complexity PS3=immersive gameplay TOUCH=portability and control). In particular, the touch application is able to perfectly resemble the UI of both DUST and EVE, with a particular attention to sizing and interaction. In this case developers have been really good at adapting the deep and complex menu to a simplified and easy-to-navigate touch version.

4.2 Gestures and Gameplay

As already mentioned, the key to successfully porting to touch devices is to focus the actions and simplify complex combined gestures. However, from a purely experiential point of view, there are other good examples of details that make the transition to touch controls smoother.

Driving games, Real Racing and Need for Speed above all, are very valid examples of how sensors (e.g. the accelerometer) can be used and tuned to increase the quality of the gameplay. In particular, since tilting the device equals steering a car, designers noticed that rotating an iPad often rotated the screen and the game visuals as well; this resulted in awkward positions while playing and frequent loss of control. Real Racing solved this issue by balancing the device tilt against the tilt of the visuals. The resulting improvement made the game highly playable compared to competitors and soon became a standard.

When dealing with complex, interaction-rich controls, game developers very often opt for a semi-automatic version of them, giving more attention to the gameplay and the user’s involvement than to control accuracy. Some might argue that the game no longer feels like the original, but practice has demonstrated that players using touch devices are not looking specifically for accuracy, but for involvement and immersive game experiences.

Bastion, an RPG and winner of several awards, uses semi-automated shooting controls (i.e. assisted aiming) in its iPad version, together with a simple tap-and-drag control for moving the character around the environment, with both touch-driven and automatic panning of the scene. The result is an award-winning game that clearly resembles the original, despite offering a different kind of engagement.

Another interesting point that needs to be kept in consideration is the type of controls (e.g. buttons, digital sticks, etc.) to be presented on the touch UI. Depending on the complexity of the game, two main scenarios typically occur. The first one, typical of games with simpler or very rigid controls (e.g. driving games, old-school games, sports in general), makes use of a digital representation of the gamepad.

In the Tony Hawk touch example, a big part of the screen is covered by controls. While this is the easiest solution when porting a game to touch, the most visible drawback is that the gameplay is often covered by the players’ hands, resulting in a rather disappointing experience. Moreover, user testing showed that digital buttons are hard to use correctly, as they give none of the feedback that physical buttons do. It is good for developers to keep in mind that some companies are creating physical add-ons that snap onto phones and iPads to simulate physical buttons; however, this topic is not covered in this report, partly because of the limited spread of those devices in the market.

The second scenario, instead, is typical of more complex games with rich storytelling. In this case, the buttons on touch devices represent a specific action that on a gamepad might only be achievable by using multiple buttons.

In the Grand Theft Auto example, players are given controls based on contextual actions. When in “walking” mode, players are given specific action controls that differ from those shown in “driving” mode. The advantage of this solution is the possibility of covering several types of gameplay without ending up with a cluttered, and hence unusable, UI. Moreover, the game gives a hierarchical organization to the placement of the buttons: more frequent or main actions get a bigger size, and hence better reachability, than minor ones (e.g. accelerate and brake are bigger and easier to reach than the horn).

In addition to all this, they have added the flexibility for users to customise the UI. This lets players decide exactly where each button icon and each HUD element, such as the mini-map, appears on the screen: the user simply touches and drags each item to wherever it should be and double-taps an item to resize it.

5. Recommended Documentation


Microsoft UX Guidelines
http://msdn.microsoft.com/en-us/library/windows/apps/hh779072.aspx

Microsoft Touch Interaction Design
http://msdn.microsoft.com/en-us/library/windows/apps/hh465415.aspx

Windows Store App Certification Requirements
http://msdn.microsoft.com/en-us/library/windows/apps/hh694083.aspx

Getting started with Windows Store apps
http://msdn.microsoft.com/library/windows/apps/br211386

Apple’s Desktop to Touch transition Guidelines
http://developer.apple.com/library/ios/#DOCUMENTATION/UserExperience/Conceptual/MobileHIG/TranslateApp/TranslateApp.html

Apple’s App Design Strategies
http://developer.apple.com/library/ios/#DOCUMENTATION/UserExperience/Conceptual/MobileHIG/AppDesign/AppDesign.html

Touch UI: iPad to Windows 8
http://msdn.microsoft.com/en-us/library/windows/apps/hh868262

MS Touch Guidelines
http://msdn.microsoft.com/en-us/library/windows/apps/hh779072.aspx

Android Guidelines
http://developer.android.com/design/index.html

Portrait vs Landscape
http://blogs.msdn.com/b/b8/archive/2011/10/20/optimizing-for-bothlandscape-and-portrait.aspx

Windows 8 – Designing Great Games
http://msdn.microsoft.com/en-us/library/windows/apps/hh868271.aspx

Baldur’s Gate: Enhanced Edition (Release for PC & iPad)
https://itunes.apple.com/us/app/baldurs-gate-enhanced-edition/id515114051?mt=8

Grand Theft Auto 3 & Vice City (Released for Consoles, PC and recently iPad/iPhone)
https://itunes.apple.com/nz/app/grand-theft-auto-vice-city/id578448682?mt=8

Mobile Interaction Design (2006), Matt Jones, Gary Marsden
http://www.amazon.com/Mobile-Interaction-Design-Matt-Jones/dp/0470090898/ref=sr_1_1?ie=UTF8&qid=1362584655&sr=8-1&keywords=mobile+interaction+design

Web Form Design: Filling in the Blanks (2008), Luke Wroblewski
http://www.amazon.com/Web-Form-Design-Filling-Blanks/dp/1933820241/ref=sr_1_10?s=books&ie=UTF8&qid=1362584687&sr=1-10&keywords=mobile+interaction+design

Swipe This!: The Guide to Great Touchscreen Game Design (2012), Scott Rogers
http://www.amazon.com/Swipe-This-Guide-Touchscreen-Design/dp/1119966965/ref=sr_1_1?s=books&ie=UTF8&qid=1362584940&sr=1-1&keywords=game+touch+design

Bastion’s Amir Rao - Full Keynote Speech - D.I.C.E. SUMMIT 2013
http://www.youtube.com/watch?v=qylr_oGfmCQ


[1] Guidelines for layouts (Windows Store apps)
[2] First Person Shooter
[3] Navigation design for Windows Store apps
[4] Re-imagining Apps for Ultrabook™: Full Series with Luke Wroblewski

