Multimodal mobile interaction - making the most of our users' capabilities
Speaker: Stephen Brewster
Topic(s): Human Computer Interaction
Mobile user interfaces rely heavily on small screens and keyboards, which can be hard to operate on the move and which limit the applications and services we can use. This talk will look at moving away from these kinds of interaction towards ones better suited to mobile devices and their dynamic contexts of use, where users need to look where they are going, carry shopping bags and hold on to children while using their phones. Multimodal (gestural, audio and haptic) interactions provide new ways to use our devices that can be eyes-free and hands-free, allowing users to interact in a 'head up' way. These new interactions will enable new services, applications and devices that fit better into our daily lives and allow us to do a whole host of new things. I will discuss some of the work we are doing on input using gestures made with the fingers, wrist and head, along with work on output using 3D sound and haptic displays, in mobile applications such as text entry and navigation. I will also discuss some of the issues around the social acceptability of these new interfaces: we have to be careful that the new ways we want people to use devices are socially appropriate and do not make users feel embarrassed or awkward.
About this Lecture
Number of Slides: 35
Duration: 45 minutes
Languages Available: English
Last Updated: 12-16-2015