
Combining Chord and Gesture Recognition

Master-level internship at in|situ|

Advisors: Wendy Mackay and Stéphane Huot

Summary

The goal of this internship is to combine chords and gestures in order to provide expressive control parameters, building on our previous work with Arpège and OctoPocus.

Description

Although multi-finger interaction is readily available on a variety of multi-touch devices, such as tablets and smartphones, few commercial systems provide more than simple swipe, pinch and tap gestures. Yet expert users need access to a wide variety of commands, ideally in the context of the objects they are manipulating on the screen. A major barrier to adopting richer vocabularies is that complex gestures and chords are hard to discover and memorize. The in|situ| group has already developed gesture (Bau & Mackay, 2008) and chord (Ghomi et al., 2013) recognition algorithms that provide progressive feedforward and feedback, giving users access to a large vocabulary of direct manipulation commands: expert users can perform the gestures or chords directly, while novices who hesitate can see which commands are currently available and how to execute them with an associated gesture or chord.

Our previous work involves discrete gestures and chords, each associated with a single command. The purpose of this internship is to combine chords and gestures so as to provide expressive control parameters. For example, a three-finger chord might invoke the volume command, and the index finger can then slide up or down to adjust the volume. Another chord might invoke the rotate command, with the thumb, index finger and middle finger moving in independent circles to control roll, pitch and yaw.
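As a rough illustration of the proposed interaction, the sketch below (in Java, one of the languages listed under Required skills) shows how a chord could select a command and how the motion of a single finger could then drive a continuous parameter. It is a minimal sketch, not a design commitment: the event types and handlers (Touch, onChordDown, onFingerMove) are hypothetical, and a real recognizer would consider finger identity and spatial layout rather than just finger count.

    import java.util.HashMap;
    import java.util.Map;

    public class ChordGestureSketch {

        /** One touch sample for one finger (hypothetical event type). */
        record Touch(int fingerId, double x, double y) {}

        private final Map<Integer, String> chordToCommand = new HashMap<>();
        private String activeCommand = null; // set once a chord is recognized
        private double anchorY;              // where the control gesture started
        private double parameter = 0.5;      // e.g. volume, normalized to [0, 1]

        ChordGestureSketch() {
            chordToCommand.put(3, "volume"); // three-finger chord -> volume command
        }

        /** Phase 1: classify the chord (here, naively, by finger count). */
        void onChordDown(Touch[] fingers) {
            activeCommand = chordToCommand.get(fingers.length);
            anchorY = fingers[0].y();        // track one finger, e.g. the index
        }

        /** Phase 2: map the tracked finger's vertical motion onto the parameter. */
        void onFingerMove(Touch t) {
            if (activeCommand == null) return;
            double delta = (anchorY - t.y()) / 500.0; // sliding up increases the value
            parameter = Math.max(0, Math.min(1, parameter + delta));
            anchorY = t.y();                 // incremental update
            System.out.printf("%s = %.2f%n", activeCommand, parameter);
        }

        void onChordUp() { activeCommand = null; }

        public static void main(String[] args) {
            ChordGestureSketch demo = new ChordGestureSketch();
            demo.onChordDown(new Touch[] {
                new Touch(0, 100, 400), new Touch(1, 140, 410), new Touch(2, 180, 405) });
            demo.onFingerMove(new Touch(0, 100, 350)); // index slides up: volume = 0.60
            demo.onFingerMove(new Touch(0, 100, 300)); //                  volume = 0.70
            demo.onChordUp();
        }
    }

The interesting design question, left open here, is when the discrete chord phase should end and the continuous control phase should begin; the sketch simply assumes the chord is classified the moment all fingers touch down.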

During the internship, the student will be expected to:

  • explore existing progressive recognition algorithms for detecting multi-touch gestures and chords (a toy sketch of the idea appears after this list),
  • video prototype alternative methods for visualizing the available chords and gestures, and
  • design, implement and evaluate a system that enables users to invoke commands with chords and control related parameters with gestures.
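For the first of these tasks, one simple way to think about progressive recognition, loosely in the spirit of OctoPocus-style feedforward rather than the published algorithm, is to score every candidate gesture template against the input prefix as the stroke unfolds, pruning incompatible candidates; the survivors are what a dynamic guide would display to a hesitating user. The toy Java sketch below uses hypothetical names and a deliberately naive prefix-distance measure:

    import java.util.ArrayList;
    import java.util.LinkedHashMap;
    import java.util.List;
    import java.util.Map;

    public class ProgressiveRecognizerSketch {

        /** A 2D stroke point (hypothetical type). */
        record Point(double x, double y) {}

        /** Candidate templates: command name -> stroke, all sampled at the same rate. */
        private final Map<String, List<Point>> templates = new LinkedHashMap<>();
        private final List<Point> input = new ArrayList<>();
        private static final double PRUNE_THRESHOLD = 20.0; // mean prefix distance, pixels

        void addTemplate(String command, List<Point> stroke) {
            templates.put(command, stroke);
        }

        /** Feed one input point; return the commands still compatible with the prefix. */
        List<String> addPoint(Point p) {
            input.add(p);
            List<String> alive = new ArrayList<>();
            for (var e : templates.entrySet()) {
                if (prefixDistance(e.getValue()) < PRUNE_THRESHOLD) alive.add(e.getKey());
            }
            return alive; // a guide would render the remaining paths of these candidates
        }

        /** Mean point-to-point distance between the input and the template prefix. */
        private double prefixDistance(List<Point> template) {
            int n = Math.min(input.size(), template.size());
            double sum = 0;
            for (int i = 0; i < n; i++) {
                sum += Math.hypot(input.get(i).x() - template.get(i).x(),
                                  input.get(i).y() - template.get(i).y());
            }
            return sum / n;
        }

        public static void main(String[] args) {
            var r = new ProgressiveRecognizerSketch();
            r.addTemplate("copy",  List.of(new Point(0, 0), new Point(50, 0), new Point(100, 0)));
            r.addTemplate("paste", List.of(new Point(0, 0), new Point(0, 50), new Point(0, 100)));
            System.out.println(r.addPoint(new Point(0, 0)));  // [copy, paste]
            System.out.println(r.addPoint(new Point(45, 5))); // [copy]
        }
    }

In practice the templates would be resampled to a common length and the threshold tuned per gesture, but the prune-as-you-go structure is the essence of a progressive recognizer.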

The internship can be 4 to 6 months long and will ideally result in a paper submitted to a conference such as ACM CHI. Note that this internship is associated with W. Mackay's ERC Advanced Grant (CREATIV), which seeks to create more effective human-computer partnerships through co-adaptive instruments; funding is available for a Ph.D. thesis in this area.

Required skills

  • Basic background in Human-Computer Interaction
  • Programming in C/C++, Objective-C and/or Java
  • Experience programming touch or gesture-based interfaces is a plus

References

Emilien Ghomi, Stéphane Huot, Olivier Bau, Wendy E. Mackay and Michel Beaudouin-Lafon (2013) Arpège: Learning Multitouch Chord Gestures Vocabularies. In Proceedings of the ACM International Conference on Interactive Tabletops and Surfaces (ITS 2013). ACM, St. Andrews, Scotland. http://www.youtube.com/watch?v=dGxeHjGp9kE

Olivier Bau and Wendy E. Mackay (2008) OctoPocus: A Dynamic Guide for Learning Gesture-Based Command Sets. In Proceedings of the ACM Symposium on User Interface Software and Technology (UIST 2008). ACM, pages 37-46. http://vimeo.com/2116172