Multimodal User Interaction in Smart Environments: Delivering Distributed User Interfaces

Abstract

The increasing use of computer technologies in all areas of life leads to smart environments comprising numerous networked devices and resources. Interacting in and with such environments requires new interaction paradigms that abstract from single interaction devices and utilize the environment itself as the interaction space. Using a networked set of interaction resources makes it possible to support multiple modalities and new interaction techniques, but it also requires taking the available set of devices into account and adapting to this set at runtime. While the generation of user interfaces from UI models has been widely researched, although it remains challenging, the runtime processing and delivery of the derived user interfaces has received less attention. Delivering distributed user interfaces while maintaining their interdependencies and keeping them synchronized is not a trivial problem. In this paper we present an approach to realizing a runtime environment capable of distributing user interfaces to a varying set of devices, supporting multimodal interaction based on a user interface model and the management of interaction resources.
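The abstract outlines a runtime environment that delivers user interface parts to a changing set of devices and keeps them synchronized. As a rough illustration of that idea, the following Java sketch (not taken from the paper; all class and method names are hypothetical) shows a minimal runtime that propagates a model update to every currently registered interaction resource:

import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch (not from the paper): a runtime that keeps
// distributed user interface parts synchronized across a changing set
// of interaction resources.
public class DistributedUiSketch {

    // One networked interaction resource, e.g. a wall display or a voice front-end.
    interface InteractionResource {
        void render(String element, String value); // push an updated UI element to this device
    }

    // Minimal runtime: holds the shared UI model state and propagates changes
    // to every currently registered interaction resource.
    static class UiRuntime {
        private final List<InteractionResource> resources = new ArrayList<>();

        void register(InteractionResource r)   { resources.add(r); }
        void unregister(InteractionResource r) { resources.remove(r); } // the device set varies at runtime

        // Called when input on any device changes a model element;
        // the update is delivered to all devices to keep the distributed UIs consistent.
        void updateElement(String element, String value) {
            for (InteractionResource r : resources) {
                r.render(element, value);
            }
        }
    }

    public static void main(String[] args) {
        UiRuntime runtime = new UiRuntime();
        runtime.register((element, value) ->
                System.out.println("wall display <- " + element + " = " + value));
        runtime.register((element, value) ->
                System.out.println("voice output <- " + element + " = " + value));

        // Input arriving via one modality is reflected on every registered device.
        runtime.updateElement("livingRoomTemperature", "21.5");
    }
}

In the approach described in the paper, the propagated updates would be derived from the user interface model rather than plain key/value pairs; the sketch only illustrates the synchronization aspect.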

@INPROCEEDINGS{Blumendorf2007,
  author = {Marco Blumendorf and Sebastian Feuerstack and Sahin Albayrak},
  title = {Multimodal User Interaction in Smart Environments: Delivering Distributed User Interfaces},
  booktitle = {European Conference on Ambient Intelligence: Workshop on Model Driven Software Engineering for Ambient Intelligence Applications},
  year = {2007},
  pdf = {Blumendorf2007.pdf}
}
Authors:
Marco Blumendorf, Sebastian Feuerstack, Sahin Albayrak
Category:
Conference paper
Year:
2007
Venue:
European Conference on Ambient Intelligence: Workshop on Model Driven Software Engineering for Ambient Intelligence Applications, Darmstadt, Germany