Multimodal User Interaction in Smart Environments: Delivering Distributed User Interfaces
Abstract
The increasing pervasiveness of computer technologies in all areas of life is leading to smart environments comprising numerous networked devices and resources. Interacting in and with such environments requires new interaction paradigms that abstract from single interaction devices and instead utilize the environment itself as the interaction space. Using a networked set of interaction resources makes it possible to support multiple modalities and new interaction techniques, but it also requires taking the available set of devices into account and adapting to that set at runtime. While the generation of user interfaces from UI models, although still challenging, has been widely researched, the runtime processing and delivery of the derived user interfaces has received less attention. Delivering distributed user interfaces while maintaining their interdependencies and keeping them synchronized is not a trivial problem. In this paper we present an approach to realizing a runtime environment capable of distributing user interfaces to a varying set of devices, supporting multimodal interaction based on a user interface model and the management of interaction resources.