Mobile computing devices are extremely popular. Mobile phones and handheld computers are among the fastest-growing areas of computing, and this growth will extend to sophisticated, fully wearable computers in the near future. However, these devices often have limited input and output capabilities. Limited screen space means displays easily become cluttered, and input methods such as small keyboards or handwriting recognition are slow and cumbersome. Current interaction techniques also limit where mobile devices can be used: activities such as walking, driving or navigating demand high visual attention, and dealing with a complex display at the same time causes problems.
The innovative aspect of this project is to explore a new paradigm for interacting with mobile computers, based on 3D sound and gestures, to create interfaces that are powerful, usable and natural. The gesture modelling will be a novel combination of dynamic systems models and nonparametric statistical models. We will develop a wearable computer that uses 3D sound for output and head, hand and device gestures for input. This will allow us to investigate new presentation methods and interaction techniques that support richer, more complex and tightly coupled interactions with mobile devices, opening up a range of new ways in which they can be used.
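To make the paradigm concrete, the sketch below shows one simple way audio output and head gestures could fit together: items are placed at azimuths around the listener's head, rendered with equal-power stereo panning relative to head yaw, and selected by facing them. This is an illustrative sketch only, not the project's implementation; the function names and tolerance parameter are hypothetical, and a real system would use full 3D audio rendering (e.g. HRTFs) rather than simple panning.

```python
import math

def pan_gains(source_azimuth_deg, head_yaw_deg):
    """Equal-power stereo panning for a sound placed at source_azimuth_deg,
    heard by a listener whose head faces head_yaw_deg.
    Returns (left_gain, right_gain). Stand-in for real 3D audio rendering."""
    # Azimuth of the source relative to the nose, wrapped to [-180, 180)
    rel = (source_azimuth_deg - head_yaw_deg + 180.0) % 360.0 - 180.0
    # Clamp to the frontal half-plane; sources behind are mirrored forward
    pan = max(-90.0, min(90.0, rel))
    theta = math.radians((pan + 90.0) / 2.0)   # 0 .. pi/2
    return math.cos(theta), math.sin(theta)    # equal-power pan law

def nearest_item(head_yaw_deg, item_azimuths_deg, tolerance_deg=15.0):
    """Head-pointing selection: index of the item the listener is facing,
    or None if nothing lies within tolerance_deg of the nose direction."""
    best, best_err = None, tolerance_deg
    for i, az in enumerate(item_azimuths_deg):
        err = abs((az - head_yaw_deg + 180.0) % 360.0 - 180.0)
        if err <= best_err:
            best, best_err = i, err
    return best

# Four sound sources placed at the cardinal directions around the head
items = [0.0, 90.0, 180.0, 270.0]
print(pan_gains(items[1], head_yaw_deg=45.0))  # source off to the right
print(nearest_item(80.0, items))               # facing near item 1 -> 1
```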
EPSRC project GR/R98105
Links to other relevant websites:
- Mobile HCI'04
- Inertial sensing and miniature accelerometers provided by XSens, B.V.