Tuesday, October 25, 2011

Paper Reading #23

User-defined motion gestures for mobile interaction


CHI '11: Proceedings of the 2011 Annual Conference on Human Factors in Computing Systems

Summary
  • Hypothesis - The researchers hypothesized that some types of motion gestures would be more difficult for users than others.
  • Methods - Twenty participants were asked to invent motion gestures for performing simple tasks on a mobile device. Because unfamiliarity with smartphone concepts could interfere with the task, only people who already used smartphones were recruited. Participants were asked to design gestures that felt as simple and natural to them as possible. Since real hardware cannot recognize every gesture perfectly, they were told to treat the device as a "magic brick" that could understand any motion. While performing each gesture, participants explained aloud why they chose it, so the researchers could understand their reasoning. Afterward, they rated their own gestures on a Likert scale.
  • Content - The tasks covered were: answer call, hang up call, ignore call, voice search, place call, act on selection, home screen, app switch next, app switch previous, next (vertical), previous (vertical), next (horizontal), previous (horizontal), pan left, pan right, pan up, pan down, zoom in, and zoom out. Gestures are described along several dimensions: the nature of the gesture (symbolic or abstract), whether it requires a context, whether the action occurs after or during the movement, how much impulse the movement carries, the dimensionality of the motion, and its complexity (see the sketch after this list).
  • Results - The more directly a gesture mapped to a real-world manipulation, the more widely participants agreed on and liked it. For example, the gesture most participants agreed upon for answering a call was raising the phone to the ear, mirroring how one answers a physical handset. Hanging up followed a similar mapping to old-style phones: turning the phone over so the screen faces down, parallel to the ground. A slightly unnatural mapping emerged for panning the screen. With touch gestures, people "drag" on-screen objects in the same direction the content should move; with motion gestures, however, participants moved the device (the viewing window) in the opposite direction of the on-screen content.
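The taxonomy above lends itself to a simple data representation. Below is a minimal sketch (in Python, not taken from the paper's materials) of how one of the user-defined gestures might be classified along those dimensions; the class names, field names, and specific category values are illustrative assumptions rather than the paper's exact coding scheme.

    from dataclasses import dataclass
    from enum import Enum

    # Each enum mirrors one taxonomy dimension from the Content bullet above.
    # The member names are illustrative assumptions.

    class Nature(Enum):
        SYMBOLIC = "depicts a symbol or mimics a real-world action"
        ABSTRACT = "mapping between gesture and task is arbitrary"

    class Context(Enum):
        IN_CONTEXT = "only meaningful in a specific application state"
        NO_CONTEXT = "usable anywhere"

    class Binding(Enum):
        DISCRETE = "action fires after the movement completes"
        CONTINUOUS = "action tracks the movement as it happens"

    class Impulse(Enum):
        LOW = "gentle, low-acceleration motion"
        MODERATE = "moderate acceleration"
        HIGH = "fast, forceful motion"

    @dataclass
    class MotionGesture:
        task: str            # e.g. "answer call", "zoom in"
        nature: Nature
        context: Context
        binding: Binding
        impulse: Impulse
        dimensionality: int  # number of motion axes involved (e.g. 1, 3, or 6)
        complexity: str      # "simple" (one motion) or "compound"

    # The consensus "answer call" gesture: raise the phone to the ear,
    # mirroring how one answers a physical handset.
    answer_call = MotionGesture(
        task="answer call",
        nature=Nature.SYMBOLIC,      # mimics the real-world answering motion
        context=Context.IN_CONTEXT,  # only sensible while a call is incoming
        binding=Binding.DISCRETE,    # the call is answered once the motion ends
        impulse=Impulse.LOW,
        dimensionality=6,            # translation plus rotation of the device
        complexity="simple",
    )

    print(answer_call)

Tagging each elicited gesture along dimensions like these is what makes it possible to compare gestures across participants and tasks systematically, as in the Results bullet above.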
Discussion
It would be nice if every gesture had an easy, direct mapping to a familiar movement in the physical world. The problem is that, with the added complexity and functionality computer systems offer, some tasks simply have no counterpart outside of a computational device. In those cases an unnatural gesture would probably have to be invented. Then again, that gesture would be the only way to perform the action in such a world and would, in a sense, become the "normal" way to do it anyway. One can only hope that adequate gestures get mapped to actions as mobile device complexity explodes.
