Pufferfish and Moodagent Spherical display
Single pixel video
MoreGrasp Rehabilitation system
BeoSound Moment system (impact outcome of a Ph.D. studentship co-funded by Bang & Olufsen)
Touching the micron
Rewarding the original (video)
Anglepose (video)
1. Brain-Computer Interaction: video (38Mb), mp4 (17Mb)
2. Stane tactile input, and bearing-based Mobile Spatial Interaction (video).
3. Using the BodySpace approach, the user can answer the phone simply by bringing it to the listening position. The video shows examples while stationary, while walking, and where the user first waves the phone around: the phone does not respond to arbitrary movement, only to motion compatible with being brought to a listening position (mp4). A second BodySpace video shows a music player controlled by body locations, which determine content or function. In this clip, songs are stored around the right shoulder and can be browsed and selected by hand movements alone; the volume control is located near the hip, and back/forward track control is at the ear. The BodySpace webpages give more background.
S. Strachan, R. Murray-Smith, S. O’Modhrain, BodySpace: inferring body pose for natural control of a music player, Extended abstracts of ACM SIGCHI Conference, San Jose, 2007. pdf video (mp4)
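The gating described above (responding only to motion compatible with the listening position, not arbitrary movement) can be sketched as a pose check on the gravity direction sensed by the accelerometer. This is an illustrative sketch only, not the inference method from the paper; the EAR_POSE vector and threshold are made-up values.

```python
import math

# Hypothetical reference gravity vector (in device coordinates), as if
# measured once while the phone is held at the ear. Illustrative values.
EAR_POSE = (0.2, -0.9, 0.4)

def cos_sim(a, b):
    """Cosine similarity between two 3-vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def at_listening_position(accel, threshold=0.95):
    """True when the low-pass-filtered accelerometer reading (i.e. the
    gravity direction in device coordinates) matches the stored ear pose."""
    return cos_sim(accel, EAR_POSE) >= threshold
```

A real system would also check the motion trajectory leading into the pose, so that only a natural "bring to ear" gesture triggers the answer action.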
4. Shoogle is an interface for actively feeling the contents of your phone. To check whether there are new SMS messages or e-mails, just give the phone a shake: if there are new messages, it feels as if balls are bouncing around inside it. The impact sounds tell you who sent each message and what sort of message it is. This is a general technique for coupling inference mechanisms with multimodal interaction. mp4-video
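The ball metaphor can be sketched as a tiny physics simulation: each unread message is a ball in a box, shaking injects acceleration, and wall impacts are the events that would trigger sound and vibration. A minimal 1-D sketch with illustrative parameters (the actual Shoogle model is considerably richer, with per-ball dynamics and sonification):

```python
def simulate_shake(n_messages, shake_accel, steps=200, dt=0.01):
    """Each unread message is a ball in a 1-D box [0, 1]; device
    acceleration (shake_accel, a function of time) drives the balls,
    and wall impacts are the audio/haptic events. Returns the number
    of impact events over the simulated interval."""
    pos = [0.5] * n_messages
    vel = [0.0] * n_messages
    impacts = 0
    for step in range(steps):
        a = shake_accel(step * dt)
        for i in range(n_messages):
            vel[i] += a * dt
            pos[i] += vel[i] * dt
            if pos[i] < 0.0 or pos[i] > 1.0:
                # Lossy bounce off the wall: one impact event, which a
                # real system would sonify using the sender/message type.
                pos[i] = min(max(pos[i], 0.0), 1.0)
                vel[i] = -0.8 * vel[i]
                impacts += 1
    return impacts
```

With no messages (or no shaking) there are no impacts, so the phone feels "empty", which is exactly the interaction the description above relies on.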
5. BodySpace, MESH and MoodPlayer demo. Quicktime Video (16Mb), Quicktime Video (Streaming). This video shows Syntonetic's Moodplayer linked to a Pocket PC/MESH system via Bluetooth. See this paper for more details about the BodySpace concepts.
6. Tremor control of a PocketPC (S. Strachan, R. Murray-Smith, Muscle Tremor as an Input Mechanism, UIST 2004, Santa Fe, 2004. pdf)
7. Multimodal Speed-Dependent Automatic Zooming. Version with stylus interaction.
8. Tilt-interaction with a mobile phone emulator - version 1
9. Tilt-interaction with a mobile phone emulator - version 2
10. Haptic granular synthesis (A. Crossan, J. Williamson, R. Murray-Smith, Haptic Granular Synthesis: Targeting, Visualisation and Texturing, International Symposium on Non-visual & Multimodal Visualization, London, IEEE Computer Society, 2004 pdf)
11. Text entry video (11Mb) (J. Williamson, R. Murray-Smith, Dynamics and probabilistic text entry, DCS Technical Report TR-2003-147, Department of Computing Science, Glasgow University, June, 2003. pdf )
12. Haptic dancing (S. Gentry, R. Murray-Smith, Haptic dancing: human performance at haptic decoding with a vocabulary, IEEE International conference on Systems Man and Cybernetics, Washington, D.C., USA, 2003 pdf )
13. Two-player Pong via Bluetooth, with accelerometer input. An experimental platform for exploring display-free games via multimodal feedback.
14. Xsens P3C accelerometer with Bluetooth link, and haptic feedback, for control of an aircraft in the X-Plane simulator.
15. MP3 file selection via tap or accelerometer input. An implementation of the pointing without a pointer approach to selection, where the display modality is audio, and the input is tapping or shaking, depending on mode. The correlation between the rhythm of the track and the rhythm of the tapping is used to select the song. The ambiguity of the user's tapping is visible in the width of the red band at the top of the display.
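The rhythm-matching selection described above can be sketched as follows: quantise the tap onsets and each candidate track's beat onsets into binary vectors, then pick the track whose beat pattern correlates best with the taps. A minimal sketch, not the actual implementation; the bin width and event times are illustrative.

```python
import math

def bin_events(times, duration, dt=0.05):
    """Quantise event times (seconds) into a binary vector, bin width dt."""
    n = int(round(duration / dt))
    v = [0.0] * n
    for t in times:
        i = int(round(t / dt))
        if 0 <= i < n:
            v[i] = 1.0
    return v

def correlation(a, b):
    """Pearson correlation of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / math.sqrt(va * vb) if va > 0 and vb > 0 else 0.0

def select_track(tap_times, track_beats, duration=4.0):
    """Return the index of the track whose beat pattern best matches the taps."""
    taps = bin_events(tap_times, duration)
    scores = [correlation(taps, bin_events(b, duration)) for b in track_beats]
    return max(range(len(scores)), key=scores.__getitem__)
```

The spread of correlation scores across tracks is one way to expose the ambiguity of the user's tapping, as the red band in the display does.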
16. Navigating a campus with audio and vibration feedback, based on uncertain location and orientation sensing.
17. Active selection with the 'eggheads' metaphor. A number of 'heads' experience orientation disturbances. Input motion is applied to all heads equally. By cancelling the disturbance, selection is achieved. The demo can be downloaded.
18. Active selection video for Brownian motion targets. Each individual target moves on a smooth, independent course. The user's mouse actions are applied equally to the trailing targets. Correlating motion results in selection. The demo can be downloaded.
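Both the 'eggheads' demo and the Brownian-target demo are instances of selection by motion correlation: the user's input is applied to every candidate equally, and the candidate whose motion the input best matches (or best cancels) is selected. A minimal sketch of the correlation test over 1-D paths, with made-up data; this is not the downloadable demo code.

```python
import math

def _pearson(a, b):
    """Pearson correlation of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / math.sqrt(va * vb) if va > 0 and vb > 0 else 0.0

def select_target(target_paths, user_path):
    """Return the index of the target whose frame-to-frame velocity best
    correlates with the user's pointer velocity. Paths are 1-D position
    sequences sampled at the same rate."""
    def vel(path):
        return [b - a for a, b in zip(path, path[1:])]
    u = vel(user_path)
    scores = [_pearson(u, vel(p)) for p in target_paths]
    return max(range(len(scores)), key=scores.__getitem__)
```

In the 'eggheads' variant the same idea runs in reverse: the selected head is the one whose disturbance the user's input cancels, which is again a correlation between input and target motion.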
19. Tilt-based photo browsing on a phone.
From: S. J. Cho, R. Murray-Smith, C. Choi, Y. Sung, K. Lee, Y-B. Kim, Dynamics of Tilt-based Browsing on Mobile Devices, Extended abstracts of ACM SIGCHI Conference, San Jose, 2007. pdf video (mp4)
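The "dynamics" in tilt-based browsing refers to mapping tilt to scrolling through a dynamic system rather than a direct tilt-to-position mapping. A minimal sketch of one such mapping, a damped first-order velocity model; the gains are illustrative assumptions, not taken from the paper.

```python
def tilt_browse_step(state, tilt, dt=0.02, gain=4.0, damping=3.0):
    """One update of a damped tilt-to-scroll dynamic: the tilt angle
    (radians) accelerates the scroll position, while damping keeps the
    motion controllable. state is a (position, velocity) pair; returns
    the updated pair. Parameter values are hypothetical."""
    pos, vel = state
    acc = gain * tilt - damping * vel
    vel += acc * dt
    pos += vel * dt
    return pos, vel
```

Holding a constant tilt settles to a constant scroll velocity (gain * tilt / damping), so small tilts browse slowly and larger tilts skim quickly, and releasing the tilt brings scrolling smoothly to rest.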
20. Steve Strachan's "Star Wars Light Sabre" demo, running on the Nokia 5500. The latency is an issue and it's a very basic system, but it's still a fun demo. The app was programmed using Python for S60.