The 2nd Workshop was held in Edinburgh as part of Interact'99. Copies of the papers can be accessed at the bottom of the page, so don't be put off by the call for papers.
A third workshop is planned for early 2001, probably as part of HCI-IHM 2001 in Lille. Mail johnson@dcs.glasgow.ac.uk if you would like details when they become available.
GIST Technical Report G98-1.
Department of Computing Science,
University of Glasgow,
Scotland.
21st-23rd May 1998.
Chris Johnson.
Usability and Mobility: Interactions on the Move *
Peter Johnson
Exploiting Context in HCI Design for Mobile Systems *
Tom Rodden, Keith Cheverst, Nigel Davies and Alan Dix
Ubiquitous Input for Wearable Computing: Qwerty Keyboard Without a Board *
Mikael Goldstein, Robert Book, Gunilla Alsio and Silvia Tessa
Using Non-Speech Sounds in Mobile Computing Devices *
Stephen Brewster, Grégory Leplâtre and Murray Crease
Design Lifecycles and Wearable Computers for Users with Disabilities *
Helen Petrie, Stephen Furner, Thomas Strothotte,
Developing Scenarios for Mobile CSCW *
Steinar Kristoffersen, Jo Herstad, Fredrik Ljungberg, Frode Løbers, Jan R. Sandbakken, Kari Thoresen
Human-Computer-Giraffe Interaction: HCI in the Field *
Jason Pascoe, Nick Ryan, and David Morse
Some Lessons for Location-Aware Applications *
Peter J. Brown
Developing a Context Sensitive Tourist Guide *
Nigel Davies, Keith Mitchell, Keith Cheverst, Gordon Blair,
On the Importance of Translucence for Mobile Computing *
Maria R. Ebling and M. Satyanarayanan
Developing Interfaces For Collaborative Mobile Systems *
Keith Cheverst, Nigel Davies, Adrian Friday,
Wireless Markup Language as a Framework for Interaction with Mobile Computing and Communication Devices *
Jo Herstad, Do Van Thanh and Steinar Kristoffersen
Giving Users the Choice between a Picture and a Thousand Words *
Malcolm McIlhagga, Ann Light and Ian Wakeman,
User Needs for Mobile Communication Devices: *
Kaisa Väänänen-Vainio-Mattila and Satu Ruuska
Department of Computing Science, University of Glasgow, Glasgow, Scotland, G12 8QQ.
johnson@dcs.gla.ac.uk
This meeting stands at the intersection of two streams of technological development. The first stems from improvements in portable computing devices, ranging from laptops to 'ubiquitous devices' with embedded processing power. The second originates in the growth of mobile telecommunications through cellular and satellite infrastructures. An increasing number of devices are being developed to exploit techniques from both areas, and this workshop addresses the design challenges that are created by such an integration.
These challenges are partly technological. It is unclear what mechanisms will be needed to support user tasks with future generations of 'integrated' mobile devices. A number of papers in this collection address these basic infrastructure questions. Herstad, Van Thanh and Kristoffersen present a number of innovative programming and application development techniques for the development of mobile systems. Cheverst, Davies and Friday analyse the fundamental characteristics of quality of service in heterogeneous networks. McIlhagga, Light and Wakeman argue that designers must consider the usability costs, as well as the benefits, of allowing applications to adapt to different levels of connectivity. Ebling and Satyanarayanan explore the wide trade-offs that exist between communications connectivity and power consumption, between cost and performance, and between translucent and opaque user interfaces.
These architectures will only be successfully exploited if designers have a clear idea of the requirements that mobile systems must satisfy. Existing requirements analysis techniques provide limited support for the diverse user groups that form and interact in a dynamic and ad hoc manner over mobile networks. Petrie, Furner and Strothotte's paper describes the changes that must be made to the conventional development cycle for such applications. There are further challenges. Research in Human Computer Interaction has recently begun to acknowledge the importance of the users' context and environment when designing interactive systems. The challenge of mobile systems is that this environment may be continually changing. A number of further papers address the problems that this poses for requirements analysis. Väänänen-Vainio-Mattila and Ruuska demonstrate that social enquiry methods from ethnography can be used to identify significant user concepts and associations during the operation of mobile devices. Several other papers, such as that by Pascoe, Ryan and Morse, argue that the same contextual approaches that are necessary in requirements analysis must also be used during the evaluation of mobile systems. User interfaces that work well in a laboratory setting may not work so well on the plains of Africa or 30 feet up an electricity pylon. This is a continuing theme behind Pete Johnson's position paper and the design techniques advocated by Rodden, Cheverst, Davies and Dix. Of particular concern is the manner in which location-aware computing devices integrate with existing tasks; the papers by Brown and by Davies, Mitchell, Cheverst and Blair argue that this is critical if innovative mobile applications are to move from the laboratory to the 'real world'.
Even if it is possible to identify user requirements for mobile computing devices, it is far from clear whether we have appropriate devices to satisfy their needs. Goldstein, Book, Alsio and Tessa focus on the use of virtual keyboards to avoid the problems of data entry on mobile devices. Brewster, Leplâtre and Crease argue that the limited display resources of existing systems must be augmented with more diverse and meaningful auditory cues.
This brief review has only touched on the many diverse problems that face the designers of human computer interfaces to mobile devices. We have not, however, mentioned the most significant challenge: there is little or no dialogue being conducted between the diverse groups that are working on user interfaces to mobile devices. There are dozens of commercial and academic research groups in this area. My experience in organising this event is that most of them are completely unaware of each other's existence.
Many people have contributed to the organisation of this workshop. In particular, thanks are due to my friends and colleagues in the Glasgow Interactive Systems Group (GIST). Steve Brewster, Mark Dunlop and Phil Gray inspired the event and contributed to the detailed planning. Colin Burns, Daniela Busse and Meurig Sage helped to shoulder some of the leg work involved in preparing badges etc.
Chris Johnson, Glasgow, 20th May 1998.
Department of Computer Science, Queen Mary and Westfield College, University of London. London E1 4NS.
pete@dcs.qmw.ac.uk
Introduction
Developments in wireless communication and distributed systems, together with increases in the power and interactive capabilities of hand-held and portable devices, give us the possibility of wide-ranging and continual access to computing resources in a variety of contexts. These technological changes make increasing demands on the quality of the user interface and offer the potential to further progress the functionality of computing devices. This makes human-computer interaction all the more central to the design and development of such mobile systems. The case remains that functionality does not exist for the user if that functionality is not usable.
This paper considers aspects of mobile systems from an HCI perspective and, in doing so, reflects upon how well equipped HCI is to support the design and development of mobile systems. Four areas of concern for HCI are raised and briefly discussed, with example scenarios in which mobile systems might be developed, to illustrate how these design situations present HCI researchers and practitioners with new challenges.
The Scenarios
Three scenarios of usage are presented here to illustrate the novelty and challenging aspects of the design problems. The scenarios are taken from actual situations in which mobile computing is being used or being considered for use in experimental forms. The scenarios themselves, however, are not derived from real-life situations that have been analysed as part of this research. The three scenarios are: the memory aid; the roadside accident; the home patient.
The memory aid scenario.
We all forget things, and at times we forget quite important things. Some people who have had brain damage through old age, illness or accident experience memory problems. Work at Addenbrooke's Hospital and the MRC APU in Cambridge by Wilson and her colleagues (Wilson 1997) has focused upon providing such people with a portable computer that acts as a memory aid. The people concerned have severe memory problems, such that they would forget to feed the cat, to buy food, or to take medication.
The scenario is one in which an ageing person with mild memory loss is given a device to wear, for example on their belt (no larger than a telephone pager). The device has been programmed by a relative (say their daughter) to give them a "bleep" and a message at relevant points of time, location, or task situation throughout the day. This is intended to remind the patient to carry out tasks such as taking their medicine (and which medicine in what quantity), going to the shops, buying the food for their dinner, making a shopping list and taking the shopping list and their money with them.
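As a purely illustrative sketch (not taken from the Cambridge work), the behaviour described above can be thought of as a small set of trigger rules checked against the current time, location and task situation; the Reminder fields and example messages below are invented:

    from dataclasses import dataclass
    from datetime import time
    from typing import Optional

    @dataclass
    class Reminder:
        """A single reminder programmed by a relative or carer."""
        message: str                          # text shown alongside the "bleep"
        at_time: Optional[time] = None        # fire when the clock reaches this time
        at_location: Optional[str] = None     # ...or on reaching a named place
        during_task: Optional[str] = None     # ...or when a task situation is detected

    def due_reminders(reminders, now, location, task):
        """Return the reminders whose time, location or task trigger matches."""
        return [r for r in reminders
                if (r.at_time is not None and r.at_time == now)
                or (r.at_location is not None and r.at_location == location)
                or (r.during_task is not None and r.during_task == task)]

    # Example: a medication reminder and a shopping reminder.
    programmed = [
        Reminder("Take one blue tablet with water", at_time=time(8, 0)),
        Reminder("Take the shopping list and your money", at_location="front door"),
    ]
    print(due_reminders(programmed, now=time(8, 0), location="kitchen", task=None))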
Designing such a device raises many HCI design questions. How would the different users interact with it, how would the reminders appear, what would happen if the reminder was forgotten, or the task already carried out? Does the system also allow the shopping list to be entered in and used in the shop? How conspicuous should the device itself be, would people be willing to carry or wear it, or would it be a source of stigma? How would the daughter interact with the device, or know that the parent had actually taken the medicine etc.? These are just some of the questions that arise in this situation.
The roadside accident scenario.
Important uses of portable devices of various sorts are made in emergency situations. In particular, in medical settings, ambulance crew (both land and air-based) carry life-saving devices to the scene of an emergency. One useful application of mobile computing systems in such settings is the transmission of patient data to a medical centre and providing increased communication between the ambulance crew and the clinical staff in the centre. For example, in the case of head injuries, a mobile device might transmit images of the results of various tests and body-state parameters, together with information about the patient's level of consciousness (e.g. the quality of their speech and comprehension, the state of their pupils) and their respiratory and heart rates to the consultant neurosurgeon in the medical centre. In addition, the consultant might have audio and video contact with the ambulance crew to advise and assist in the care of the patient and in sending the patient to an appropriate specialist hospital.
The context of use of this equipment would mean that the design would have to take account of the system being used out in the open, in all weathers, at all times of day or night. The quality and speed of the image transmission would need to be reliable, secure and of a high enough standard to meet the requirements of the NHS, as would the quality of the interaction between the consultant and the ambulance crew. As well as considering the environmental conditions of use, attention would also have to be given to the conditions of the users. The consultant and the ambulance crew would be in very different contexts and conditions to each other, and the device must not add to the stress and cognitive load placed on the various members of this distributed team. These are just some of the challenging HCI problems to be addressed in designing such a mobile system for roadside accident patient care.
The patient in the home scenario.
The third scenario is again a clinical one. The situation is one in which patient data is monitored and transmitted to the medical centre on a regular basis during the patient's normal everyday life. Johnson (1997) has studied data collected in the home and in the clinical centre for patients such as pregnant women, babies and chronic asthmatics. His findings show that there is a lack of correspondence between the two sets of data. From this he suggests that data collected in a clinical setting may be less reliable than data collected on a regular basis in the patient's everyday life. He further suggests that mobile data collection devices worn by the patient could be used to collect and transmit data to a clinical centre.
The scenario involves the patient putting the device on, checking that it is working properly, and then being able to ignore it until they remove it. In the case of some patients (such as chronic asthmatics) the device might be worn during the night as well as during the daytime. The device might also allow the patient to see their own data, so that they can check their blood pressure or respiratory levels over various time periods, with annotations added by the patient describing the context and situation of activity alongside the data recording. Furthermore, the transmitted data would also be monitored by the medical centre, and in cases where the received data suggested a cause for concern, the clinicians would immediately be put in touch with the patient and, if necessary, emergency treatment instigated. The device worn by the patient would need to be easy to fit, easy to wear, and not appear to have any stigma associated with it. In addition, it would need to be easy for the patient and the clinician (separately) to interact with, in testing whether it is working correctly and in annotating and monitoring the data collected.
The problems of Usability and Mobility.
There are at least four problems to be faced in addressing the HCI of mobile systems. These concerns are:
(i) the demands of designing for mobile users, their tasks and contexts,
(ii) accommodating the diversity and integration of devices, network services and applications,
(iii) the current inadequacy of HCI models to address the varied demands of mobile systems,
(iv) the demands of evaluating mobile systems.
Generally speaking, HCI has developed a good understanding of how to design and evaluate forms of human computer interaction in "fixed" contexts of use, in a single domain, with the users always using the same computer to undertake tasks alone or in collaboration with others. This is not the situation of use for mobile computing. Now that the computer is moving from the workplace and occasional home use onto the streets and into our everyday lives, it will become as commonplace as wearing a watch is for many of us. Will it be the same "watch" we always wear for all occasions, and will we know how to "tell the time" on it in all its different guises and contexts of use? Clearly, this is a limited analogy, since a watch will normally only tell the time and date and ring an alarm.
This short paper takes up the first of these concerns in the next section. Following this, each of the remaining three concerns is briefly discussed and some general conclusions for HCI in the area of mobile computing are drawn.
Mobile systems design issues with users, tasks and contexts
With the promise of a technology that will give access to computing resources to a wider range of users, carrying out more complex and multiple tasks, in a wider range of situations, our existing "craft" knowledge of design will be found as wanting as our more formalised HCI knowledge. The classes of users, tasks and contexts of use will be novel to the general HCI community and outside the scope of experience of many current designers. Design problems escalate the more the context of usage has to be considered, and the more variable and unpredictable that context becomes.
Within HCI, design methods and support for designers have largely considered the design of the artefact itself; while some attempts have been made to consider the tasks, the users, and the contexts and situations of use, these have been less extensive.
To contribute to the design of mobile systems we need to understand what the design problems of mobile systems are. This may sound circular or tautological, but it is not. There are extensive psychological, sociological, organisational and environmental phenomena to be studied when we start to investigate the "worlds" in which mobile computing might take place. However, whether or not these phenomena have any relevance to system design has to be considered, and if they do have relevance, how that relevance can be used to inform and contribute to the quality of the design. For example, much current interest has been given to "distributed cognition" and to "activity theory" as possible approaches to understanding people, groups and activity in a social and organisational context. However, the problem for design is not to understand or explain that behaviour, structure, or society, but to design systems that work within it and improve upon it. The designer's problem is a different one to that of the psychologist, sociologist or organisational theorist.
Diversity and Integration.
In a recent address to the ACM Intelligent User Interfaces 98 conference in January, Dan Olsen projected a glimpse of the challenges facing the computing and HCI communities (Olsen, 1998). The title of Olsen's address was "Interacting in Chaos" and his thesis is as follows. Today it is not uncommon to find a single person interacting with many different computers. For example, you might have a PC or an Apple Macintosh in your office, a further PC or Apple Macintosh at home, and a portable laptop computer that you take with you on business trips or to conferences. You might also carry a mobile phone or pager (or both), and a Pilot or electronic notebook/diary, along with all the different devices you might find in your car, on the train or in public places. In addition to these devices there are the various printers, computing servers, scanners and other useful devices to which you could be connected, together with the various electronic mail, internet, news, groupware and other network services and applications that you might connect to and use.
The "chaos" comes about because each of these computers, devices and network services has only a limited ability to communicate with the others, and they often use widely different formats of data, processing and interaction. For instance, when you read your email on your Pilot you do not see it in the same format as you do when you read it on your Apple Macintosh, and when you enter a phone number into your mobile phone you are not also able to add it to your electronic address book; hence the "chaos". Olsen's analysis goes even deeper than this, for he points out that the problem is not that there is a proliferation of devices, services and software, but that we do not have the ability to model the properties and variability of these in such a way that we can begin to solve the problem. He claims that our existing forms of modelling software, systems and interaction do not adequately address problems of diversity, inconsistency, accessibility (or the lack thereof), replication and integration. From an HCI perspective, we must be prepared to develop new ways of modelling, designing and evaluating the usability of mobile interactive systems.
Modelling.
Can we develop useful HCI models that will contribute to the design and development of mobile systems? In his excellent address to the EPSRC MNA workshop, Nigel Davies (Davies, 1997), described the work at Lancaster University on the mobile GUIDE system. This system makes use of an available land-based network around the city of Lancaster together with a wireless communication network from nodes on the land-based network, to provide the user with a location-sensitive guide to the city. In a prototype version, the user has a flat-panel, pen-based device that receives and sends signals to the local network node and downloads "packets" of information relevant to the specific geographical location they are now in. Quite apart from the fact that an early version of the prototype device would not work in the wet, there are some interesting HCI issues here.
From an HCI modelling perspective we are well-equipped to model cognitive aspects of users (e.g. Barnard & May, 1997), their tasks (e.g. Johnson, Johnson & Hamilton, 1997), the domain (e.g. Lim & Long, 1995) and to model aspects of collaboration and group working. In addition, we can model aspects of an interface and in some cases generate standard forms of interfaces from abstract models (e.g. Wilson & Johnson, 1996).
However, can we model users adequately such that we could, for example, say how using a mobile guide would interfere with their ability to notice a speeding car as they stepped out onto a street, or how much attention they would pay to the guide while trying to deal with a child pleading for another ice-cream? The point here is that the complexity of the activities becomes large because there are so many things going on at the same time, and this makes it increasingly difficult to model the interactions between these activities. Similarly, from a task perspective, could we model the interaction between the various tasks of using the guide, touring the city, minding the child and crossing a busy road, in such a way that we could understand how they influenced each other? From the perspective of domain modelling, what is the domain here? It is an interaction between several domains, including road safety, tourism and child minding. From a collaboration and co-operative working perspective there is also a problem, because the nature of the activities is often such that they, and the people involved in them, are conflicting rather than co-operating or collaborating. From an interface modelling perspective it would be difficult to model the various interactions the user would be having with the guide. Where model-based design has been used in user interface design, the ability to generate interfaces is limited to standard forms of interaction with well-defined interaction styles. The types of interaction used in a mobile tourist guide could require (at present) less common forms of input (e.g. pen and speech) and output (e.g. text, graphics, sounds, speech, pictures, movies), as well as unconventional forms of interaction, with some variability in the form and quality of interaction available at any given time. Current model-based user interface techniques would not provide for this variability.
Evaluation of interaction and usability in mobile contexts
Regarding evaluation, we have a number of evaluation techniques to choose from, such as empirical testing, discount usability methods and cognitive and task analytic methods. Evaluation would clearly be possible, but the criteria and the methods used would need to be researched. It is obvious that discount usability methods would not adequately assess the usability of a mobile tourist guide (or any other mobile system), since they ignore the context of use. Similarly, the conventional usability laboratory would not be able to adequately simulate such important aspects as the weather, and could not easily provide for the wide range of competing activities and demands on users that might arise in a natural setting. Data collection methods such as video recording or observation in natural settings would be extremely difficult to apply in anything but an unnatural manner when a mobile computer system is the subject of the evaluation. Consequently, forms of data and data collection methods would be needed that are outside the common range of usability studies.
Conclusions
Mobile systems in the form of portable telephones, pagers, notebooks and laptop computers are commonplace, but at present they are poorly integrated and represent only a small proportion of the range of different types of mobile systems that we are likely to see. The development of mobile systems to be used in everyday life will place demands upon the HCI community. It is easy to see that in the design of video recorders HCI has had no impact at all: they are just as unusable as they ever were, and they are sold on the basis of increased functionality which most users never get to use because the interface is so bad. In some of the contexts in which mobile systems could be of use to us, the quality of the interface will matter. In some of the examples described above the interfaces may save or cost lives; in others, such as the tourist guide, the device will simply not be used if the interface is poorly designed. HCI methods, models and techniques will need to be reconsidered if they are to address the concerns of interaction on the move.
References
Barnard, P.J., & May, J. (1997) Cognitive Task Modelling. NATO workshop on Cognitive Task Analysis, Washington, November 1997.
Davies, N. (1997) Invited presentation to the EPSRC workshop on Multimedia Network Applications, Warwick, November 1997.
Johnson, Paul (1997) Invited contribution to the EPSRC workshop on Healthcare Informatics, Abingdon.
Johnson, P., Johnson, H., & Hamilton, F. (1997) Task Knowledge Structures. Paper presented to the NATO workshop on Cognitive Task Analysis, Washington, November 1997.
Lim, K., & Long, J.B. (1995) The MUSE methodology for usability software engineering. Cambridge University Press, Cambridge, UK.
Olsen, D. (1998) Interacting in chaos. In Proceedings of the ACM 2nd International Conference on Intelligent User Interfaces, San Francisco, January 1998. ACM Press.
Wilson, B. (1997) Memory aids. Invited presentation to the EPSRC workshop on Healthcare Informatics, Abingdon, December 1997.
Wilson, S. M., & Johnson, P. (1996) in Vanderdonckt, J. (ed.) Computer Aided User Interface Design. University of Namur Press.
Tom Rodden, Keith Cheverst, Nigel Davies and Alan Dix
Department of Computing, Lancaster University, Lancaster, LA1 4YR.
School of Computing, Staffordshire University, Stafford, ST18 ODG.
Background
The last five years have seen a shift in the nature of mobile computers. The development of increasingly powerful laptop computer systems has been mirrored by the production of a range of small computational devices. The increased prominence of these devices has outlined a number of distinct research challenges. These challenges have tended to focus on extending the utility of these devices using new forms of interaction, techniques to overcome display limitations, or improvements in the general ergonomics of these devices. The merging of these devices with existing telecommunication services and the production of devices that offer connections to other systems presents yet another set of research challenges in terms of the development of cooperative multi-user applications.
The authors are engaged in a number of projects investigating various aspects of mobile systems development. In particular, an MNA funded project "Interfaces And Infrastructure For Mobile Multimedia Applications" is looking at the way in which the special user interface requirements of cooperative mobile systems can be used directly to drive the development of an effective system architecture, user interface toolkit and underlying communications infrastructure.
In various ways mobile systems break assumptions that are implicit in the design of fixed-location computer applications leading to new design challenges and feeding back to a better understanding of the richness of human–computer interaction.
One central aspect of our work is the temporal issues that arise due to network delays and intermittent network availability. We have already addressed this in some detail based on previous theoretical work on pace of interaction and practical experience in building collaborative mobile applications [Dix, 1992; Davies 1994; Dix, 1995]. In addition, there has been considerable wider interest in temporal issues, both in the context of mobile systems and also more generally [Johnson, 1996; Johnson, 1997; BCSHCI, 1997; Howard and Fabre, 1998].
However, this paper considers a second critical issue in the design and development of cooperative mobile systems: the context-sensitive nature of mobile devices. The importance of this is clear in the recent research in ubiquitous computing, wearable computers and augmented reality [Weiser, 1991, 1994; Aliaga, 1997]. Furthermore, more prosaic developments such as mobile phones, GPS and embedded in-car automation all point to a more mobile and embedded future for computation. The development of applications which exploit the potential offered by this technology brings together issues from distributed systems, HCI and CSCW. However, designers of these systems currently have few principles to guide their work. In this paper we explore the development of a framework that articulates the design space for this class of system and, in doing so, points to future principles for the development of these systems.
Fixed-location computers are clearly used for a variety of tasks and are set within a rich social and organisational context. However, this is at best realised within individual applications, and the nature of the device as a whole is fixed and acontextual. In contrast, the very nature of mobile devices sets them within a multi-faceted contextual matrix, bound into the physical nature of the application domain and closely meshed with existing work settings. In this paper we seek to articulate the nature of this matrix and how it may be used as a resource for designers and developers.
Making use of the context of a device is important for two reasons. Firstly, it may allow us to produce new applications based on the special nature of the context, for example interactive guide maps. Secondly, and equally important, it can help us tailor standard applications for mobile devices; for example, when a sales rep visits a company, the spreadsheet can have a default files menu which includes the recent ordering history for the company. Such tailoring is not just an added extra: limited screen displays mean that highly adaptive, contextual interfaces become necessary for acceptable interaction.
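A minimal sketch of this kind of tailoring, assuming the device knows which company is being visited; the function and data names are invented for illustration:

    def build_files_menu(default_files, order_history, company):
        """Reorder the spreadsheet's files menu for a sales visit.

        order_history maps a company name to its recent order files; these are
        promoted to the top of the menu so that a small screen shows the most
        relevant documents first.
        """
        recent = order_history.get(company, [])
        return recent + [f for f in default_files if f not in recent]

    # Example: visiting "Acme Ltd" puts their recent ordering history first.
    menu = build_files_menu(
        ["template.xls", "price_list.xls"],
        {"Acme Ltd": ["acme_q2_orders.xls", "acme_q1_orders.xls"]},
        "Acme Ltd")
    print(menu)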
Moving from the device to the context of use
A considerable amount of research surrounding the development of mobile devices has obviously focused on the portable nature of these devices and the technical problems in realising them. Mobile computing devices represent real technical challenges and have always stretched the state of the art in terms of displays and interaction devices. This focus on the development of appropriate forms of device is perhaps best exemplified by the development of so-called "wearable computers". These have seen the construction of new forms of interaction device that support a limited number of dedicated tasks, including support for mechanics, portable teaching aids and note-taking machines [Fickas 1997].
The development of dedicated-function devices is complemented by the emergence of a range of general-purpose devices normally characterised as Personal Digital Assistants. The majority of these devices focus on supporting some form of personal organisation by combining diary and note-taking facilities. These devices are characterised by their personal and individual nature, and any communication provided has focused on providing support for access to on-line information such as email and the World Wide Web.
The emergence of mobile telecommunication standards such as GSM and the increased availability of these services has also led more recently to the development of a range of devices that provide mobile access to on-line services (e.g., the Nokia Communicator). This merging of computer and communication facilities allows the development of systems that provide immediate on-line access to information. These portable networked devices have also been combined with the use of GPS technologies to develop a range of portable devices that are aware of their position [Long 1996].
The ability of the current generation of portable devices to have an awareness of their setting, and an increased ability to access network resources, means that we need to broaden our consideration of these devices to see their use in tandem with other portable devices. This view of portable devices means that we need to balance the current consideration of the interaction properties of individual devices with a broader consideration of the context of use. This move toward a consideration of the context of use builds upon previous trends in the development of portable devices, including the use of TABs in developing mediaspaces at PARC and the associated emergence of the notion of ubiquitous computing [Weiser, 1991, 1993]. More recent work at MIT has also focused on the development of small-scale devices that exploit context to provide an ambient awareness of interaction [Ishii 1997].
Considering the context of Mobile Systems
Our particular focus is a consideration of applications that we term advanced mobile applications. Although research prototypes exist that demonstrate the technical possibilities, many of these have yet to emerge as fully-fledged applications. These applications are distributed in nature and characterised by peer-to-peer and group communications, use of multimedia data and support for collaborating users. Examples of such applications include mobile multimedia conferencing and collaborative applications to support the emergency services.
In considering the design and development of interfaces for mobile devices we wish to focus particularly on the situation where mobile devices behave differently and offer different interaction possibilities depending on the particular context in which the system is being used. For example, in the development of mobile multimedia guides such as the systems at Georgia Tech [Long 1996] and the Lancaster GUIDE [Davies 1998], the information presented to the user and the interaction possibilities are strongly linked to the location where the device is being used. Interaction is no longer solely a property of the device but rather is strongly dependent on the context in which the device is being used.
In this paper we wish to examine the nature of the context in which mobile devices are used and the implications for future HCI design. The aim of this focus on context is to allow the highly situated nature of the devices to be reflected in the design of the interactive systems that exploit them. This focus on the situated nature of these devices reflects their growing acceptance and the need to allow them to mesh closely with existing practices. This need to focus on the context of use mirrors previous work in the development of interactive systems within CSCW [Hughes 1994].
In considering context as a starting point for the design of interaction within mobile systems, we need to unpack what we actually mean by the term context and how we may exploit it to determine different interaction possibilities. The following sections consider some of the ways in which context has played a key design role in the development of distributed mobile applications and the consequences suggested for the development of future applications.
Infrastructure Context
The interaction offered by advanced mobile applications is not solely dependent on the particular features of the mobile devices used. Rather, it is a product of the device and the supporting infrastructure used to realise the application. The impact of the properties of the supporting distribution infrastructure on different styles of interaction has been discussed in CSCW and HCI [Greenberg 1994]. In mobile systems the nature of the infrastructure is even more likely to change as the application is used, and the sort of service available may alter dramatically. This variability in the infrastructure may dramatically affect interaction, and it is essential that interaction styles and interfaces provide access to information reflecting the state of the infrastructure.
This issue is particularly acute in the case of safety-critical applications, where applications must be rigorously engineered to ensure a high level of dependability. The dependability of these systems comes not only from the reliability of the communication infrastructure and devices but also from the users' awareness of the nature of the application. Provision of this awareness requires us to reconsider the traditional views of distribution transparency and abstraction, to allow the user access to the properties of the infrastructure and to infer different interaction results from this contextual information.
In essence, the user interfaces to mobile applications must be designed to cope with the level of uncertainty that is inevitably introduced into any system that uses wireless communications. For example, consider our experiences in the development of an advanced mobile application used to support collaborative access to safety-critical information by a group of field engineers [Davies 1994]. If one of these engineers becomes disconnected from the group as a result of communications failure then it is vital that the remaining users' interfaces reflect this fact. This requires interaction between the application's user interface and the underlying communications infrastructure via which failures will be reported. In addition, if the information being manipulated is replicated by the underlying distributed systems platform, the validity of each replica will clearly be important to the engineers. In this case the user interface will need to reflect information being obtained from the platform.
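A hedged sketch of the coupling described above, in which the user interface subscribes to connectivity and replica-validity events from an assumed infrastructure layer; the class and callback names are invented:

    class CollaborationView:
        """Minimal interface model that reflects infrastructure state to the group."""

        def __init__(self, members):
            self.connected = {m: True for m in members}
            self.replica_stale = False

        # Callbacks assumed to be wired to the communications infrastructure.
        def on_member_disconnected(self, member):
            self.connected[member] = False
            self.render()

        def on_replica_status(self, stale):
            self.replica_stale = stale
            self.render()

        def render(self):
            for member, up in self.connected.items():
                state = "connected" if up else "DISCONNECTED"
                print(f"{member}: {state}")
            if self.replica_stale:
                print("Warning: local copy of safety-critical data may be out of date")

    view = CollaborationView(["engineer A", "engineer B"])
    view.on_member_disconnected("engineer B")   # remaining users see the failure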
The design of these applications needs not only to reflect the semantics of the application and the features supported, but must also consider, as a key design element, the variability of the supporting infrastructure and how this variability is reflected to the user. Similarly, particular features of the infrastructure may need to be put in place and designed in line with the interaction needs of the mobile application.
Application Context
In addition to the infrastructure issues discussed above, distributed mobile applications need to consider the detailed semantics of the application. In the case of mobile applications the normal design considerations are amplified by the need to consider the limited interaction facilities of mobile devices. A number of further contextual issues also need to be considered in the design of these applications.
Mobile devices are intended to be readily available and of use to the community of users being supported. As a consequence we need to consider the highly situated nature of this interaction. Developing a clear understanding of what people do in practice and their relationship with technology is essential to informing the development of these applications. The relationship between users and mobile technology is still unclear, and few studies have taken place that consider the development of mobile cooperative applications [Davies 1994].
For example, we may choose to exploit the personal nature of these devices by associating mobile devices with users. This allows us to tailor applications so that they are sensitive to the identity of the user of the device. This information may be exploited along with additional contextual information (e.g. location) to present appropriate information. One example of this would be a particular doctor visiting patients within a hospital: at a particular bed, who the doctor is and their relationship to the patient in that bed may determine the information presented. Contrast this situation with the development of a museum guide, where the devices need to be considered as general purpose and no information is available about the relationship between users and the artefact being described.
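One way to read the hospital example is as a simple lookup keyed on both user identity and location; the sketch below is illustrative only, and the roles, relationships and records are invented:

    def select_patient_view(user_role, relationship, location, records):
        """Choose what to present on the device at a given bed.

        records maps a bed (the location context) to views of the patient's
        record. A treating doctor sees the full record, other clinical staff a
        summary, and an unidentified user only public information.
        """
        record = records.get(location)
        if record is None:
            return "No patient at this location"
        if user_role == "doctor" and relationship == "treating":
            return record["full"]
        if user_role in ("doctor", "nurse"):
            return record["summary"]
        return record["public"]

    records = {"ward 3, bed 2": {"full": "history, medication, test results",
                                 "summary": "current medication only",
                                 "public": "bed occupied"}}
    print(select_patient_view("doctor", "treating", "ward 3, bed 2", records))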
The design of advanced multimedia applications needs to explicitly identify the nature of the work being supported and the practicalities of this work. In doing so developers need to consider the relationship between the mobile devices and their users and how this can be used to determine the nature of the interfaces presented. This is particularly important if devices are to be used to identify users and potentially make information about their location and what they are doing available to others. In this case a consideration of the issues of privacy and the need for some symmetry of control is essential.
System Context
In addition to exploiting information about who will be using devices, interaction with mobile applications also needs to consider the system as a whole. The nature of these devices is that more advanced applications need to be distributed in nature. Thus, rather than having functionality reside solely within a single machine (or device), it is spread across the system as a whole. This means we need to consider the interaction properties of the system in terms of the distributed nature of the application. This is particularly true when we consider issues of pace and interaction [Dix, 1992]. Consider, for example, the development of appropriate caching strategies for field engineers who will only ever be examining or servicing units within a sub-region of a particular area.
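A minimal sketch of such a caching strategy, assuming each unit record is tagged with the sub-region it belongs to; the region names and data are invented:

    def prefetch_cache(unit_records, assigned_region):
        """Cache only the units in the engineer's assigned sub-region.

        unit_records maps a unit id to a (region, data) pair. Restricting the
        cache to the assigned region keeps it small and keeps the relevant
        records available when the wireless link is slow or absent.
        """
        return {unit_id: data
                for unit_id, (region, data) in unit_records.items()
                if region == assigned_region}

    units = {"U1": ("north", "substation notes"),
             "U2": ("south", "transformer notes"),
             "U3": ("north", "pylon inspection sheet")}
    cache = prefetch_cache(units, "north")   # only U1 and U3 are carried offline
    print(sorted(cache))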
The need for rapid feedback is an accepted premise of HCI design and many applications provide direct manipulation interfaces based on the ability to provide rapid feedback. The development of distributed applications has seen a reconsideration of the nature of feedback and the importance of considering the technical infrastructure as impacting this [Dix, 1995]. The variable nature of the Internet and the effects on World Wide Web interaction is perhaps the most readily identifiable manifestation of this effect [BCSHCI, 1997]. A natural design tension exists between replicated application architectures that maximise feedback and centralised applications that prioritise feedthrough across the application users [Ramduny and Dix, 1997]. The need to consider the overall functionality of the application and to design structures that provide appropriate access to different levels of functionality is amplified in the case of mobile applications where the infrastructure may vary considerably as the application is in use.
Location Context
One of the unique aspects of mobile devices is that they can have an awareness of the location within which they are being used. This location information may be exploited in determining the form of interaction supported. This may be direct, in that the application explicitly exploits the nature of the setting, for example in guides that tell you about your current location. It may also be less direct, as in systems that inform you of incidents depending on your particular location.
The degree to which the mobile application is coupled with the location of devices, and how this location is made available to users, is a key design decision in supporting different interaction styles. The device used and the form of interaction it supports is not the sole determinant of the form of interaction. Rather, it is a product of the location of the device and the location of other devices. This means that we need to consider the issues involved in the correspondence between these devices and their location. For example, if a guide describes a particular location and is dependent on references to that location to support the interaction, we must ensure that this contextual reference is maintained. It is essential that our approaches to design explicitly involve the issues of location and the link with these contextual cues.
Physical Context
Finally, mobile computer systems are likely to be aware of, or embedded into, their physical surroundings. Often this is because they are embedded in an application-specific device, for example in a mobile phone or car. In these situations the computer system is mobile by virtue of being part of a larger mobile artefact. This context can and does affect the application interface; for example, the telephone directory within a mobile phone can be very different from one in an independent PDA. Another example is a car radio (now often computer controlled), which has different design considerations to a static radio, including the need to retune automatically as the car travels between local radio areas and transmitter zones. Because the computer systems are embedded into application-specific devices they may also be aware of their environmental context, for example the speed of the car. Some of this sensory information may be used simply to deliver information directly to the user, but other information may be used to modify interface behaviour: for example, in a tourist guide, increasing text size in poor lighting conditions or, in a car system, limiting unimportant feedback during periods of rapid manoeuvring.
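A small sketch of this kind of sensor-driven adaptation; the sensor names and thresholds are invented placeholders rather than recommendations:

    def adapt_interface(light_level_lux, vehicle_speed_ms):
        """Derive simple presentation settings from environmental context.

        Poor lighting increases text size (the tourist-guide example); rapid
        movement suppresses unimportant feedback (the in-car example).
        """
        settings = {"text_size": "normal", "non_essential_feedback": True}
        if light_level_lux < 50:          # assumed threshold for poor lighting
            settings["text_size"] = "large"
        if vehicle_speed_ms > 20:         # assumed threshold for rapid manoeuvring
            settings["non_essential_feedback"] = False
        return settings

    print(adapt_interface(light_level_lux=30, vehicle_speed_ms=5))
    print(adapt_interface(light_level_lux=200, vehicle_speed_ms=30))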
Each of these different contexts represents a different portion of the design space within which mobile systems must be placed, and the features of infrastructure, application, system and location all provide potential trade-offs that developers must address in realising mobile interactive systems. Currently, designers undertake this trade-off with little support or guidance, as little is known of the extent of the design space into which mobile applications are placed. In the following section we consider a taxonomy of mobile computation that charts this design space, to allow developers to consider the properties of the mobile system under construction and how it may be related to other applications and systems.
Towards a taxonomy of mobile computation
Having considered some of the different ways in which context may affect or be used in mobile devices, we now want to build a classification of mobile and context-aware devices to better understand the design space. Clearly, as we are considering mobile systems, ideas of space and location are of paramount importance in our consideration of the context of these systems. We will therefore first examine different kinds of real and virtual location and different levels of mobility, including issues of control. However, any notion of location puts the device within an environment which both has attributes itself and may contain other devices and users with which the device may interact.
Figure 1. A device in its environment
Of real and virtual worlds
A lawnmower is a physical device; it inhabits the real world and can be used to affect the real world. Computers open up a different kind of existence in an electronic or virtual world. This is not just the realm of virtual reality: as we surf the web, use ftp to access remote files or even simply explore our own file system, we are in a sense inhabiting virtual space. Even the vocabulary we use reflects this: we 'visit', 'explore', 'go to', 'navigate' ... our web browsers even have a button to go 'back'. There has been a growing acceptance of the consideration of a virtual space and the development of electronic worlds and landscapes.
The emergence of virtual space
The turn to virtual worlds and spatial approaches generally has emerged from work in HCI and CSCW on the use of spatial metaphors and techniques to represent information and action in electronic systems. This work has its roots in the use of a rooms metaphor to allow the presentation of information [Henderson, 1985]. From these early spatial approaches we have seen concepts of spatial arrangement exploited in the development of desktop conferencing systems such as Cruiser [Root, 1988] and more generally in the work of Mediaspaces [Gaver, 1992].
The recent development of co-operative systems in CSCW has also seen a growing application of concepts drawn from spatial arrangements. These include the development of GroupKit to form TeamRooms [Roseman, 1996], the emergence of the Worlds system [Fitzpatrick, 1996] and the use of a notion of places to support infrastructure [Patterson, 1996]. This exploitation of virtual spaces is most notable in the development of shared social worlds existing solely within the machine [Benford, 1995]. However, the use of space and virtual spaces has not been confined to an existence solely within the computer, and a number of researchers have considered how space and location can be treated both virtually and physically within the development of applications. This is most evident in the augmenting of existing physical spaces to form digital spaces populated by electronically sensitive physical artefacts (or tangible bits) [Ishii, 1997] that are sensitive to their position within both physical and virtual space.
Combining the real and the virtual
The work on tangible bits undertaken by Ishii (1997) represents the start of a trend to interweave real and virtual spaces, exploiting a capability distinctively offered by mobile computer applications. We would suggest that this interplay between the real and the virtual is at the core of the design of co-operative mobile applications, as devices and users have a location and presence that is both virtual and physical, each of which is available to the computer application.
This interplay between the real and the virtual provides a starting point for the development of our taxonomy. A direct result of the need to recognise this coupling is that many of the categories we will consider for taxonomising and understanding mobile and context-aware computation have counterparts in both the real physical world and the virtual electronic world. There are important differences, however: the virtual world does not always behave in ways we have come to expect from the physical world, and these differences are often exploited by designers and developers.
In particular, even the object of interest for mobile computation may have a physical or virtual existence depending on the nature of the application. At one extreme we have simple hand-held GPS systems that simply tell you where you are in physical space – perhaps these do not even rank as mobile computation. At the other extreme there are agents which simply have an existence within the virtual world, for example web crawlers or the components within CyberDesk [Wood, 1997]. Between these we have more complex physical devices such as the PDA, which have both a real-world existence and also serve as windows into virtual space (especially when combined with mobile communications).
In the development of the taxonomy presented here we will focus on physical mobile computational devices. However, we will also draw on examples of virtual agents where they are instructive, to highlight the co-existence of these two forms of space and the issues of mobility that may exist in both.
Location
Mobility makes us think automatically about location, the way in which this sense of location can be understood in the system, and the way changes in location can affect the system. Any simple mobile device will have a physical location both in space and time. Understanding the nature of this location, and how the developers of interactive mobile applications may exploit it, is important, and in this section we wish to consider what we might actually mean by the term location. This exploration is more than a mere issue of terminology: developing an understanding of what we actually mean by location represents a consideration of one of the core design concepts in the production of mobile systems.
Looking at the spatial dimension, there are some devices (for example GPS-based map systems) where the exact Cartesian position in 2D or 3D space is important in defining a sense of absolute physical location. For others a more topological idea of space is sufficient for understanding position, and in these cases location is considered not in an absolute sense but in relation to other objects or sensors. For example, the Lancaster GUIDE system is based on radio cells roughly corresponding to rooms and sections of Lancaster Castle, and the CyberGuide [Long, 1996] system at Georgia Tech shows visitors around the GVU laboratory by altering its behaviour depending on what item of equipment is closest.
This distinction between a sense of the absolute and relative in location can also be applied to time. We can consider a simple, linear, Cartesian time typified by a scheduler or alarm clock. However, we can also have applications where a more relative measure of time is used, for example, during a soccer match we may consider the action in the first half, second half and extra time but not care exactly whether the match was played at 3pm or 6pm. Similarly, in the record of a chess game, all that matters is the order of the moves, not how long they took. In fact, many calendar systems employ a hierarchical and relative model of time: hours within days, days within weeks. At first this might seem like a simple division of linear time, but such systems often disallow appointments spanning midnight, or multi-day meetings that cross into two weeks.
We can thus think of both space and time as falling into 'Cartesian' and topological categories, and can consider location in both space and time in these terms. We may also consider location in both a physical and a virtual sense. If we consider ideas of virtual location, for example position within a hypertext, we see that we may similarly have ideas of time and space within the electronic domain. As an example of virtual time, consider looking up next week's appointments in a scheduler: the real time and the virtual time need not correspond. For those with a busy schedule these seldom correspond, and the art of mapping from the real to the virtual is often a delicate balancing act worked out in practice.
This consideration of location provides us with the following categorisation of location:
                          real               virtual

  space     Cartesian     GPS                VR

  time      linear        stop watch         history time line

Figure 2. Location in different kinds of space.
Note that these are not mutually exclusive categories: an item in a room also has a precise longitude and latitude, and a device will exist at a precise moment in linear time but may be being used to view past events at that moment. Indeed, possibly one of the most interesting things is where these different ideas of location are linked to allow visualisation and control. For example, moving a display up and down in physical space could be used to change the virtual time in an archaeological visualisation system, and in an aircraft cockpit, setting the destination city (a topological destination) instructs the autopilot to take an appropriate course in Cartesian space/time. This interplay between the real and the virtual is central to the development of augmented reality spaces, where the movement of devices within a space may manifest in effects that are both real and virtual. These spaces only work because the location of the device can be controlled in virtual and physical space, and its effects produce alterations to either the physical or the virtual space.
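The categorisation can be made concrete as a handful of types. The sketch below follows the distinctions in the text (real versus virtual, Cartesian versus topological space, linear versus relative time); the class names and the example at the end are illustrative assumptions:

    from dataclasses import dataclass
    from enum import Enum

    class World(Enum):
        REAL = "real"
        VIRTUAL = "virtual"

    @dataclass
    class CartesianPosition:      # e.g. a GPS fix, or a position in a VR scene
        world: World
        x: float
        y: float

    @dataclass
    class TopologicalPosition:    # e.g. "nearest exhibit", or the current web page
        world: World
        place: str

    @dataclass
    class LinearTime:             # e.g. a stop watch, or a history time line
        world: World
        seconds: float

    @dataclass
    class RelativeTime:           # e.g. "second half", or "move 23" in a chess record
        world: World
        label: str

    # A GUIDE-style device: a real, topological location viewed at a point
    # on a virtual (historical) time line.
    here = TopologicalPosition(World.REAL, "castle keep")
    shown_period = RelativeTime(World.VIRTUAL, "mediaeval period")
    print(here, shown_period)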
Mobility
Our core concern in the development of our design taxonomy is the issue of mobility and its implications for how we understand human computer interaction. In the previous section we considered how the issue of location can be unpacked to provide an understanding in both a physical and a virtual sense, and how the nature of the space affects our consideration of location. In this section we wish to focus on how we might understand mobility and what potential design issues may emerge from a more detailed consideration of it.
Devices may be mobile for a number of reasons. They may be mobile because they are carried around by users (as with a PDA or a wearable computer), because they move themselves (robots!) or because they are embedded within some other moving object (a car computer). Furthermore, a number of different devices may be spread within our environment so that they become pervasive, as in the case of an active room such as the ambient room suggested by Ishii (1997). The issue of pervasiveness is itself a rather thorny one, in that it is not clear what constitutes pervasive in terms of devices and how this relates to previous discussions surrounding ubiquitous devices. The issue of ubiquitous computing has focused on the backgrounding of the device, with the computer essentially "disappearing" into the environment. For us the issue of pervasive devices has less to do with the devices fading into the environment and more to do with an expectation that particular devices are normally available. For us, pervasive computing is intimately bound up with the inter-relationship between different devices and the expectation that these devices can work in unison to provide some form of shared functionality. An active room is active because it contains a number of devices which, when they work in unison, provide some form of function. Essentially, we are seeing a number of computing devices working in co-operation to provide some functionality, and some of these devices may be mobile. However, often these devices are not. Consider for example the layout of base stations that provide the information displayed on mobile devices to allow a space to offer some form of pervasive computing facility.
We can disentangle the different levels of mobility into three dimensions which are used in Figure 3 to classify example mobile systems.
First we can consider the level of mobility within the environment:
• fixed – the device is not mobile at all! (e.g. a base station fixed in a particular place)
• mobile – may be moved by others (e.g. carried around, such as a PDA or wearable computer)
• autonomous – may move under its own control (e.g. a robot)
Second, we can consider the extent to which the device is related to other devices or its environment:
• free – the computational device is independent of other devices and its functionality is essentially self contained.
• embedded – the device is part of a larger device
• pervasive – the functionality provided by the device is essentially spread throughout the environment and results from the device's relation to other elements in the environment.
These separations do not consider the nature of the device and the sorts of function it may afford. The physical design of the device itself is an issue that needs to be considered carefully, in terms of existing traditions of aesthetic and practical design. The consideration of these features is beyond the scope of the framework and taxonomy we wish to present here, which focuses on the development of the device.
As a final part of our taxonomy we can reflect the co-operative nature of advanced mobile applications by considering the extent to which the device is bound to a particular individual or group. We have three classes for this too:
• personal – the device is primarily focused on supporting one person
• group – the device supports members of a group such as a family
• public – the device is available to a wide group
We would not suggest that these categories are absolute, but rather offer them as broadly equivalent classes that are useful to designers. All the categories have grey cases, but perhaps this last dimension most of all. In particular we should really consider both the static and dynamic nature of how these categories are applied. For example, we could classify a computer laboratory as 'public', but of course, after logging in, each computer becomes personal. We will return to these dynamic aspects when we look at how devices can become aware of their users.
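To make the structure of the taxonomy concrete, the three dimensions can be written down as simple data types; the sketch below is ours (the type and field names are assumptions, not part of any existing toolkit) and is intended only as a way of charting where a proposed device sits in the design space.

enum Movement { FIXED, MOBILE, AUTONOMOUS }
enum Relation { FREE, EMBEDDED, PERVASIVE }
enum Binding  { PERSONAL, GROUP, PUBLIC }

// One point in the design space of Figure 3.
record DeviceClassification(String example, Movement movement, Relation relation, Binding binding) {}

public class Taxonomy {
    public static void main(String[] args) {
        // Example entries from Figure 3. Note that the binding may change
        // dynamically, e.g. a public lab machine becomes personal after log in.
        var pda = new DeviceClassification("PDA", Movement.MOBILE, Relation.FREE, Binding.PERSONAL);
        var car = new DeviceClassification("car computer", Movement.MOBILE, Relation.EMBEDDED, Binding.GROUP);
        System.out.println(pda);
        System.out.println(car);
    }
}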
In fact, the 'group' category really covers two types of device. Some, like a Liveboard, actually support a group working together. Others, like an active refrigerator (which allows messages to be left, email browsing and so on), may primarily support one person at a time but are available to all members of a family. In-car computer systems exhibit both sorts of 'groupness': they may perform functions for the benefit of the passengers as well as the driver, and the exact mix of people from within the family (or others) in the car may vary from trip to trip.
Some of the examples in Figure 3 are clear, but some may need a little explanation. The 'Star Trek' reference is to the computer in Star Trek that responds to voice commands anywhere in the ship, but does not actually control the ship's movements. This is perhaps a wise move given the example of HAL in 2001! (Note that HAL is put in the group category as it has a small crew, but this is exactly one of the grey distinctions.) Our reference to 'shopping cart' refers to the development of supermarket trolleys that allow you to scan items as they are added and keep track of your purchases to enable a fast checkout. Often these require the insertion of a shopper identification, in which case they become dynamically personalised.
|           |            | Personal         | Group         | Public        |
| Free      | Fixed      | office PC        | Liveboard     | Computer lab. |
|           | Mobile     | PDA              |               | tour guides   |
|           | Autonomous |                  | Factory robot |               |
| Embedded  | Fixed      |                  | Active fridge | ATM           |
|           | Mobile     | Wearable devices | Car computer  | Shopping cart |
|           | Autonomous |                  | Auto pilot    | Mono rail     |
| Pervasive | Fixed      |                  | active room   |               |
|           | Mobile     |                  | Star Trek     |               |
|           | Autonomous | web agent        | HAL           | web crawler   |

Figure 3. A taxonomy of different levels of mobility.
Notice that there are various blank cells in this taxonomy, reflecting our use of it as a means of charting the design space for interactive mobile devices. Some of these blanks represent difficult cases where there may not be any sensible device. For example, a fixed–pervasive–personal device would have to be something like an active hermit's cell. In fact, the whole pervasive–personal category is problematic and the items 'web agent' and 'web crawler' in the final row may be better regarded as virtual devices of the free–autonomous class.
Other gaps represent potential research opportunities. For example, what would constitute a free–mobile–group device? This would be a portable computational device that supports either different individuals from a group, or a group working together – possibly an electronic map that can be passed around and marked.
Most of the examples are of physical devices. Virtual devices may also be classified in a similar way: for example, Word macros are embedded–mobile (or even autonomous in the case of macro viruses!), as are Java applets. The only virtual devices in Figure 3 are 'web agent' and 'web crawler' in the final row which, as noted above, are perhaps better regarded as free–autonomous. This ambiguity arises because any virtual device or agent must be stored and executed upon a physical computational device, and the attributes of the physical device and the virtual device may easily differ. For example, a PDA may contain a diary application. This is mobile by virtue of being stored within the PDA (a virtual device embedded within a physical device). However, if the PDA is used as a web browser it may execute a Java applet that is a form of virtual agent embedded within a web page (a virtual embedding in a mobile artefact). That is, we have an embedded–mobile–public virtual agent temporarily executing on a free–mobile–personal device! This dual presence in multiple contexts is both the difficulty and the power of virtual environments, and one that requires significant research to resolve.
Populating an environment
Devices may need to be aware of aspects of their environment in addition to their location within it. These aspects may vary because the device is moving from location to location (the headlamps on a car turning on automatically as the car goes into a tunnel) or because the environment is changing (a temperature monitor). In a sense, devices need to be aware that they populate an environment and need to reflect the coupling with the environment depicted in Figure 1.
This awareness may include both the physical environment (light, temperature, weather) and the electronic environment (network state, available memory, current operating system). A simple example of the latter is a JavaScript web page which runs different code depending on the browser it is running on.
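A comparable sketch in Java (our own code; the memory threshold is an arbitrary assumption) shows a program sampling its electronic environment, here the operating system and the memory currently available, and adapting its presentation accordingly:

public class ElectronicEnvironment {
    public static void main(String[] args) {
        String os = System.getProperty("os.name");
        long freeBytes = Runtime.getRuntime().freeMemory();
        System.out.println("Running on " + os + " with " + freeBytes + " bytes free");

        // Adapt: fall back to a lightweight presentation when memory is scarce.
        if (freeBytes < 4 * 1024 * 1024) {
            System.out.println("Low memory: using a text-only presentation");
        } else {
            System.out.println("Sufficient memory: using the full graphical presentation");
        }
    }
}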
Environments are normally populated with a range of different devices. Within the physical and virtual environment of a device there may be other computational devices, people (including the user(s) of the device) and passive objects such as furniture. These may be used to modify the behaviour of the device. For example, in CyberDesk 'ActOn' buttons are generated depending on what other applications are available and the types of input they can accept.
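The flavour of this mechanism can be sketched as follows (this is our own illustration of the idea, not CyberDesk's actual API; the service names and type labels are invented): the buttons offered depend on which services are currently available and what type of input each accepts.

import java.util.List;
import java.util.Map;

public class ActOnButtons {
    // Hypothetical registry: each available service and the input type it accepts.
    static final Map<String, String> AVAILABLE_SERVICES = Map.of(
            "Dial number", "phone-number",
            "Send email", "email-address",
            "Show on map", "place-name");

    // Offer only the buttons whose services can act on the current selection.
    static List<String> buttonsFor(String selectionType) {
        return AVAILABLE_SERVICES.entrySet().stream()
                .filter(e -> e.getValue().equals(selectionType))
                .map(Map.Entry::getKey)
                .toList();
    }

    public static void main(String[] args) {
        System.out.println(buttonsFor("email-address")); // [Send email]
    }
}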
Figure 4 gives examples of items in the environment that may be relevant for a mobile or context-aware device, taking a car computer and an active web page as running examples.
|         | Physical              | Virtual                 |
| People  | Current driver of car | visitor at web page     |
| Devices | other cars            | running applets         |
| Objects | roadside fence        | other pages on the site |

Figure 4. Examples of entities within the environment.
This sense of awareness of the surrounding environment, and the conveying of this awareness to others, is an issue of some sensitivity in design. For example, in the case of active badges the issue of awareness of users, and how this may be applied, became embroiled in a discussion of privacy [Harper, 1992]. This may become even more problematic in the case of multiple devices that display an awareness of others. For example, consider the "fun" interest badge device suggested by Philips in its Visions of the Future design study [Philips, 1996]. These badges are programmed with a set of interest profiles and are intended to light up when you meet someone else with a compatible profile. The social acceptability of this form of device may well become a significant issue in determining the general acceptance of devices of this kind.
Measurement and awareness
In order to modify their behaviour devices must be able to detect or measure the various attributes we have mentioned: their location, environment, other devices, people and things.
These are mostly status phenomena and elsewhere [Dix and Abowd 1996, Ramduny, Dix and Rodden 1998] we have discussed the various ways in which an active agent can become aware of a status change. In short, these reduce to finding out directly or via another agent (human or electronic). For example, a car with a built in GPS sensor can detect its position directly and thus give directions to the driver, but a simple PDA may need to be told of the current location by its user in order to adjust timezones. Other computational agents may also be important sources of information about themselves (as in the case of CyberDesk) and about other parts of the environment (for example recommender systems).
Items in the environment (people, devices, objects) are particularly difficult: not only may they change their attributes (position etc.), but also the configuration of items may change over time (e.g. people may enter or leave an active room). This leads to three levels of awareness. We'll look at these with the example of a car computer:
• presence – someone has sat down in the driver's seat, but all the car can tell is that the door has been opened then closed
• identity – the driver enters her personal PIN and the car can then adjust the seat position for the driver
• attributes – the car detects from the steering behaviour that the driver is getting drowsy and sounds a short warning buzzer
Notice how in this example presence was not detected at all (the car knows only that the door has been opened and closed), identity was informed by the driver, but the sleepiness of the driver was detected directly. In other cases different combinations of detection or informing may be found. Security systems often have ultrasonic sensors to tell that someone is near (presence). Similarly, the car could be equipped with a pressure sensor in the driver's seat. Active badges, video-based face recognition or microphones matching footstep patterns can be used to tell a room who is there, and hence play the occupant's favourite music and adjust the room temperature.
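A small sketch of these three levels for the car example (the class and its fields are our own assumptions, not an existing in-car API):

import java.util.Optional;

public class DriverAwareness {
    private boolean present;                               // presence: e.g. door opened and closed
    private Optional<String> driverId = Optional.empty();  // identity: e.g. entered as a PIN
    private double drowsiness;                             // attribute: e.g. estimated from steering (0..1)

    void doorCycled() { present = true; }
    void pinEntered(String id) { driverId = Optional.of(id); }
    void steeringSampled(double estimate) { drowsiness = estimate; }

    // Decide on an adaptation given the current, possibly incomplete, awareness.
    String adapt() {
        if (!present) return "no action";
        String seat = driverId.map(id -> "adjust seat for " + id).orElse("leave seat alone");
        return drowsiness > 0.7 ? seat + "; sound drowsiness buzzer" : seat;
    }

    public static void main(String[] args) {
        DriverAwareness car = new DriverAwareness();
        car.doorCycled();
        car.pinEntered("driver-42");
        car.steeringSampled(0.8);
        System.out.println(car.adapt());
    }
}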
These examples are all about detecting people, but the same things occur in other settings. In the virtual world an agent may need to detect the same things: presence – whether any other applications are running; identity – if so, what they are (e.g. Netscape); and attributes – what web page is currently being viewed. Physical devices may also detect one another, for example allowing several people with PDAs to move into 'meeting' mode. In fact, awareness models that do just this form of detection within the virtual world abound [Rodden, 1996].
Detection and measurement may vary in accuracy: perhaps a box was put onto the car seat pressure sensor, the driver lied about her identity, or the ultrasonic sensor cannot tell whether there is one person or more. It will also typically involve some delay, especially when indirect means are used, which is particularly problematic if the attribute being measured changes rapidly. Thus actual detection is a trade-off between accuracy, timeliness and cost. Depending on the outcomes, certain adaptations may be ill advised (a car wrongly identifies its driver and adjusts the seat thinking the driver is short; the real driver is quite tall and ends up squashed behind the steering wheel). The fidelity of awareness is very closely tied to the demands of the application and represents a genuine trade-off between the cost of measurement, the nature of the measurement and the importance of accuracy in the awareness information.
From requirements to architecture
As we have seen, the taxonomy we suggest opens up many exciting design possibilities for specific applications arising from the contextual nature of mobile devices. Although we are investigating some of these in a number of projects at Lancaster University, the primary aim of our current 'infrastructure' project is to examine the generic requirements that emerge from taxonomies of this form. These requirements can then be exploited to develop the underlying toolkits, architecture and infrastructure needed for temporally well designed, context-aware, collaborative mobile systems. One of the issues suggested strongly by our framework is that the issues of human computer interaction involved in mobile systems extend well beyond the interface provided by the device and have significant impacts on the infrastructure.
Research has demonstrated the shortcomings of existing infrastructure components for supporting adaptive mobile applications [Davies, 1994], [Joseph, 1995]. In more detail, existing components have two critical shortcomings. Firstly, they are often highly network specific and fail to provide adequate performance over a range of network infrastructures (e.g. TCP has been shown to perform poorly over wireless networks [Caceres, 1994]). Secondly, existing components often lack suitable APIs for passing status information to higher levels. As a consequence of these shortcomings, new systems are increasingly being developed using bespoke communications protocols and user interfaces; the GUIDE system described in [Davies 1998] is one example.
As these devices become more widespread the need increases for generic application architectures, at least for subclasses of device. There is clear commercial pressure for this; in particular, Windows CE is being promoted for use in embedded systems. However, if these architectures are simply developed by modifying architectures and toolkits originally designed for fixed environments, there is a danger that some of the rich interaction possibilities afforded by mobile devices may be lost.
There are some examples of generic frameworks on which we can build. At Georgia Tech, location-aware guides are being constructed using the CyberDesk/Cameo architecture [Wood et al., 1997]. Cameo is a software architecture based on the theoretical framework of status–event analysis. Status–event analysis gives equal weight to events, which occur at specific times, and status phenomena, which always have a value that can be sampled [Dix and Abowd, 1996]. The discrete nature of computation forces an emphasis in many specification and implementation notations towards the former; however, most contextual information is of the latter type – status phenomena. Computation using status phenomena requires callback-type programming, as is familiar in many user interface toolkits, to be used far more widely.
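A minimal sketch of this style (our own code, not Cameo itself): a status source can always be sampled, and components register callbacks that are notified, as events, when the sampled value changes appreciably.

import java.util.ArrayList;
import java.util.List;
import java.util.function.DoubleConsumer;

public class StatusSource {
    private final List<DoubleConsumer> listeners = new ArrayList<>();
    private double lastValue = Double.NaN;
    private final double threshold;   // how much change counts as an event

    StatusSource(double threshold) { this.threshold = threshold; }

    void onChange(DoubleConsumer callback) { listeners.add(callback); }

    // Called by a sampling loop or sensor driver with the latest reading.
    void sample(double value) {
        if (Double.isNaN(lastValue) || Math.abs(value - lastValue) > threshold) {
            lastValue = value;
            listeners.forEach(l -> l.accept(value));   // status change surfaced as an event
        }
    }

    public static void main(String[] args) {
        StatusSource signalStrength = new StatusSource(0.5);
        signalStrength.onChange(v -> System.out.println("Signal strength now " + v));
        signalStrength.sample(3.0);
        signalStrength.sample(3.1);   // below the threshold: no callback fires
        signalStrength.sample(1.0);
    }
}

The design decision highlighted by the sketch is exactly the one status–event analysis draws attention to: how and when a continuously available status is turned into discrete notifications.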
Another major architectural issue for context-aware applications is the way in which contextual issues cut across the whole system design. This is reminiscent of other aspects of user interfaces, where the structures apparent at the user interface often do not match those necessary for efficient implementation and sound software engineering [Dix and Harrison, 1989]. In UI design this has led to a conflict between architectures which decompose in terms of user interface layers, such as the Seeheim and Arch-Slinky models [Gram and Cockton, 1996], and more functionally decomposed object-oriented models. In fact, the object- and agent-based architectures themselves usually include a layered decomposition at the object level, as in the MVC (Model–View–Controller) model [Lewis, 1995] and the PAC (Presentation–Abstraction–Control) model [Coutaz, 1987]. Although the display and input hardware may be encapsulated in a single object or group of objects, its effects are felt in the architectural design of virtually every user-interface component. In a similar fashion the hardware that supplies contextual information may well be encapsulated within context objects, but their effect will permeate the system. This requires an orthogonal matrix structure similar to that found in models such as PAC or MVC.
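The following sketch (again our own, not a standard toolkit) illustrates the point: each presentation component consults a shared context object as well as its own content, so the same cross-cutting checks recur throughout the interface.

class Context {
    double batteryLevel = 1.0;   // 0..1
    boolean connected = true;
    String location = "unknown";
}

interface ContextAwareView {
    String render(Context ctx);
}

class MapView implements ContextAwareView {
    public String render(Context ctx) {
        // Contextual concerns appear inside an otherwise self-contained component.
        if (!ctx.connected) return "Cached map of " + ctx.location;
        return ctx.batteryLevel < 0.2 ? "Low-detail map of " + ctx.location
                                      : "Full map of " + ctx.location;
    }
}

class MessageView implements ContextAwareView {
    public String render(Context ctx) {
        return ctx.connected ? "Live messages" : "Queued messages (offline)";
    }
}

public class ContextDemo {
    public static void main(String[] args) {
        Context ctx = new Context();
        ctx.connected = false;
        ctx.location = "Lancaster";
        System.out.println(new MapView().render(ctx));
        System.out.println(new MessageView().render(ctx));
    }
}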
Conclusion
In this paper we have considered human computer interaction with mobile devices in terms of the development of advanced mobile applications. The maturing of technology to allow the emergence of multi-user distributed applications that exploit mobile devices means that we can no longer focus the issues of interaction on the nature of the device alone. Rather, we must explicitly consider the impact of context in informing the design of different interaction techniques. The context needs to be considered in terms of the device's relationship with the technical infrastructure, the application domain, the socio-technical system in which it is situated, the location of its use and the physical nature of the device. The interaction style supported by this class of mobile application is as dependent on this context as on the properties of the device itself. As a result, it is essential that work on the nature of these devices, and the development of techniques that are aware of their limits, is complemented by a broader consideration of the nature of interaction. However, these modified and novel forms of interaction cannot be realised without corresponding software architectures. So far we have identified two major structural principles which underlie this architectural design: the importance of representing status phenomena and the need for contextual information to cut across the software design space.
References
Aliaga, D. G. (1997). Virtual objects in the real world. Communications of the ACM, 40(3): 49-54.
BCS HCI (1997). British HCI Group Workshop on Time and the Web. Staffordshire University, June 1997.
Benford, S., Bowers, J., Fahlen, L., Mariani, J, Rodden. T, Supporting Cooperative Work in Virtual Environments. The Computer Journal, 1995. 38(1).
Cáceres, R., and L. Iftode. "The Effects Of Mobility on Reliable Transport Protocols." Proc. 14th International Conference on Distributed Computer Systems (ICDCS), Poznan, Poland, Pages 12-20. 22-24 June 1994.
Coutaz, J. (1987). PAC, an object oriented model for dialogue design. Human–Computer Interaction – INTERACT'87, Eds. H.-J. Bullinger and B. Shackel. Elsevier (North-Holland). pp. 431-436.
Davies, N., G. Blair, K. Cheverst, and A. Friday. "Supporting Adaptive Services in a Heterogeneous Mobile Environment." Proc. Workshop on Mobile Computing Systems and Applications (MCSA), Santa Cruz, CA, U.S., Editor: Luis-Felipe Cabrera and Mahadev Satyanarayanan, IEEE Computer Society Press, Pages 153-157. December 1994.
Davies, N., K. Mitchell, K. Cheverst, and G.S. Blair. "Developing a Context Sensitive Tourist Guide", Technical Report Computing Department, Lancaster University. March 1998.
Dix, A. and G. Abowd (1996). Modelling status and event behaviour of interactive systems. Software Engineering Journal, 11(6): 334–346.
Dix, A. J. (1992). Pace and interaction. Proceedings of HCI'92: People and Computers VII, Cambridge University Press. pp. 193-207.
Dix, A. J. (1995). Cooperation without (reliable) Communication: Interfaces for Mobile Applications. Distributed Systems Engineering, 2(3): 171–181.
Dix, A. J. and M. D. Harrison (1989). Interactive systems design and formal development are incompatible? The Theory and Practice of Refinement, Ed. J. McDermid. Butterworth Scientific. pp. 12-26.
Fickas, S., G. Kortuem, and Z. Segall. "Software Issues in Wearable Computing." Proc. CHI Workshop on Research Issues in Wearable Computers, Atlanta, GA, U.S.,
Fitzpatrick, G., et al, Physical Spaces, Virtual Places and Social Worlds: A study of work in the virtual, Proc. CSCW’96, ACM Press
Gaver W., The Affordances of Media Spaces for Collaboration, Proc. CSCW’92, 1992, ACM Press.
Gram, C. and G. Cockton, Eds. (1996). Design Principles for Interactive Software. UK, Chapman and Hall.
Greenberg, S. and Marwood, D., 'Real Time Groupware as a Distributed System: Concurrency Control and its Effect on the Interface', Proceedings of CSCW'94, North Carolina, Oct 22-26, 1994, ACM Press.
Henderson, D.A., Jr. and Card, S.K., Rooms: The Use of Multiple Virtual Workspaces to Reduce Space Contention, ACM Transactions on Graphics, Vol. 5, No. 3, July 1986.
Howard, S. and J. Fabre, Eds. (1998). Temporal Aspects of Usability: The relevance of time to the development and use of human-computer systems – Special issue of Interacting with Computers (to appear).
Hughes J., Rodden T., King V., Anderson K. 'The role of ethnography in interactive systems design', ACM Interactions, ACM Press, Vol II, no. 2, 56-65, 1995.
Johnson, C. and P. Gray (1996). Workshop Report: Temporal Aspects of Usability (Glasgow, June 1995). SIGCHI Bulletin, 28(2).
Johnson, C. W. (1997). The impact of time and place on the operation of mobile computing devices. Proceedings of HCI'97: People and Computers XII, Bristol, UK, pp. 175–190.
Joseph, A., A. deLespinasse, J. Tauber, D. Gifford, and M.F. Kaashoek. "Rover: A Toolkit for Mobile Information Access." Proc. 15th ACM Symposium on Operating System Principles (SOSP), Copper Mountain Resort, Colorado, U.S., ACM Press, Vol. 29, Pages 156-171. 3-6 December 1995.
Lewis (1995). The Art and Science of Smalltalk. Prentice Hall.
Long, S., R. Kooper, G.D. Abowd, and C.G. Atkeson. "Rapid Prototyping of Mobile Context-Aware Applications: The Cyberguide Case Study." Proc. 2nd ACM International Conference on Mobile Computing (MOBICOM’96), Rye, New York, U.S., ACM Press,
Patterson, J.F et al., Notification Servers for Synchronous Groupware, Proc. CSCW’96, ACM Press.
Ramduny, D. and A. Dix (1997). Why, What, Where, When: Architectures for Co-operative work on the WWW. Proceedings of HCI'97, Bristol, UK, Springer. pp. 283–301.
Ramduny, D., A. Dix and T. Rodden (1998). Getting to Know: the design space for notification servers. submitted to CSCW'98,
Root, R.W., Design of a Multi-Media Vehicle for Social Browsing, Proc. CSCW'88, Portland, Oregon, September 26-28 1988, pp. 25-38.
Roseman, M, Greenberg, S, TeamRooms: Network Places for Collaboration, Proc. CSCW’96, ACM Press
Weiser, M. (1991). The computer of the 21st century. Scientific American, 265(3): 66-75.
Weiser, M. (1993). Some computer science issues in ubiquitous computing. Communications of the ACM, 36(7): 75-84.
Wood, A., A. K. Dey and G. D. Abowd (1997). CyberDesk: Automated Integration of Desktop and Network Services. Proceedings of the 1997 conference on Human Factors in Computing Systems, CHI '97, pp. 552–553.