Friday, November 11, 2011

Paper Reading #28

Experimental analysis of touch-screen gesture designs in mobile environments


Authors: Andrew Bragdon, Brown University, Providence, Rhode Island, USA
Eugene Nelson, Brown University, Providence, Rhode Island, USA
Yang Li, Google Research, Mountain View, California, USA
Ken Hinckley, Microsoft Research, Redmond, Washington, USA


Proceedings

CHI '11: Proceedings of the 2011 Annual Conference on Human Factors in Computing Systems


 

Summary

  • Hypothesis - The researchers hypothesize that bezel-initiated gestures outperform soft buttons, especially in non-ideal environments where the user cannot keep visual focus on the interface.
  • Content - Bezel gestures begin at the edge (bezel) of the touch screen. The other input techniques tested were soft buttons and hard buttons: a soft button is simply a GUI representation of a button, while a hard button is an actual mechanical button. The second factor tested was whether or not a gesture was aligned to an axis; axis-aligned gestures are called mark-based, while the rest are termed free-form (see the sketch after this list).
  • Methods - 15 participants tested the effectiveness of the different gesture types in different environments. Participants were tested while sitting, standing, and walking, and three levels of distraction were introduced for each of these motor states. The least distracting condition allowed constant visual contact with the interface, while the most distracting condition did not allow the user to look at the screen at all. At the end of the study, participants completed a questionnaire to gauge other factors, such as preference.
  • Results - Gestures took longer in more distracting environments regardless of type, but under distraction, bezel-initiated gestures outperformed both hard and soft buttons. There was no difference between gestures performed while sitting and while standing. Free-form gestures were less accurate than mark-based gestures. When distraction was not an issue, soft buttons performed better; the fastest and most preferred condition was sitting with no distractions using soft buttons.
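Since the paper distinguishes bezel-initiated strokes from ones that start inside the screen, and mark-based (axis-aligned) strokes from free-form ones, here is a minimal Kotlin sketch of how an Android app might classify a touch along those two factors. This is my own illustration, not the paper's implementation: the GestureClassifier name, the 24 px edge margin, and the 3:1 axis ratio are all assumptions.

    import android.view.MotionEvent
    import android.view.View
    import kotlin.math.abs

    // Classifies touches along the two factors from the study: where a stroke starts
    // (bezel vs. interior) and whether it is axis-aligned (mark-based) or free-form.
    // The 24 px edge margin and 3:1 axis ratio are illustrative guesses, not values from the paper.
    class GestureClassifier(private val edgeMarginPx: Float = 24f) {

        // True if the initial ACTION_DOWN landed within edgeMarginPx of any screen edge,
        // i.e. the stroke is bezel-initiated rather than starting in the interior.
        fun isBezelInitiated(event: MotionEvent, view: View): Boolean {
            if (event.actionMasked != MotionEvent.ACTION_DOWN) return false
            return event.x <= edgeMarginPx || event.y <= edgeMarginPx ||
                   event.x >= view.width - edgeMarginPx || event.y >= view.height - edgeMarginPx
        }

        // Rough mark-based check: the stroke's overall displacement lies mostly along one axis.
        fun isMarkBased(downX: Float, downY: Float, upX: Float, upY: Float): Boolean {
            val dx = abs(upX - downX)
            val dy = abs(upY - downY)
            return dx >= 3 * dy || dy >= 3 * dx
        }
    }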
Discussion
The bezel-initiated gesture built into Android for displaying notifications is one of my favorite things about the system's user interface; I can perform the action relatively easily even in distracting environments. The only issue for future use is that, since that edge is already reserved for the notification menu, a degree of design space is lost for other applications. However, in distracting environments such as driving, the best thing to do is to remove the distraction entirely, either by pulling over or by not touching the phone at all.
