Prototyping of Multimodal Interactions for Smart Environments based on Task Models

Abstract

Smart environments offer interconnected sensors, devices, and appliances that can be used for interaction, substantially extending the mix of potentially available modalities. This promises a more natural and situation-aware human-computer interaction. However, technical challenges and differences in the interaction principles of distinct modalities currently restrict multimodal systems to specialized solutions that support only specific situations. To overcome these limitations and enable an easier integration of new modalities into smart environments, we propose a task-based notation that can be interpreted at runtime. The notation supports the evolutionary prototyping of new interaction styles for already existing interactive systems. We eliminate the gap between design time and runtime, since support for additional modalities can be prototyped at runtime for an already existing interactive system.
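
The paper itself does not include code here; the following minimal Python sketch merely illustrates what a runtime-interpretable, CTT-style task notation could look like and how it could be extended while the system is running. The Task class, the ">>" (enabling) and "|||" (interleaving) operators, and the lamp example are illustrative assumptions, not the authors' actual notation.

    # Minimal sketch of a runtime-interpretable, CTT-style task tree.
    # All names and operators are illustrative assumptions, not the
    # notation proposed in the paper.

    class Task:
        def __init__(self, name, children=None, operator=None):
            self.name = name                # task label
            self.children = children or []  # subtasks
            self.operator = operator        # ">>" enabling, "|||" interleaving
            self.done = False

        def enabled(self):
            """Return the leaf tasks a user could perform next."""
            if not self.children:
                return [] if self.done else [self]
            if self.operator == ">>":       # enabling: strictly sequential
                for child in self.children:
                    leaves = child.enabled()
                    if leaves:
                        return leaves
                return []
            # "|||" interleaving: every unfinished subtask stays enabled
            return [t for c in self.children for t in c.enabled()]

        def complete(self, name):
            """Mark an enabled leaf task as performed (by any modality)."""
            for leaf in self.enabled():
                if leaf.name == name:
                    leaf.done = True
                    return True
            return False

    # Evolutionary prototyping: a new, modality-specific step is attached
    # to a model that is already being interpreted at runtime.
    root = Task("control lamp", operator=">>", children=[
        Task("select lamp"),
        Task("choose brightness"),
    ])
    root.children.insert(1, Task("speak lamp name"))  # added at runtime

    print([t.name for t in root.enabled()])  # ['select lamp']
    root.complete("select lamp")
    print([t.name for t in root.enabled()])  # ['speak lamp name']

Because the interpreter recomputes the enabled task set on every step, inserting a new subtask immediately changes the available interactions without a redesign-and-redeploy cycle, which is the gap between design time and runtime the abstract refers to.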

@INPROCEEDINGS{Feuerstack2007,
  author = {Sebastian Feuerstack and Marco Blumendorf and Sahin Albayrak},
  title = {Prototyping of Multimodal Interactions for Smart Environments based on Task Models},
  booktitle = {European Conference on Ambient Intelligence: Workshop on Model Driven Software Engineering for Ambient Intelligence Applications},
  year = {2007},
  address = {Darmstadt, Germany},
  pdf = {Feuerstack2007.pdf}
}
Authors:
Sebastian Feuerstack, Marco Blumendorf, Sahin Albayrak
Category:
Conference paper
Year:
2007
Venue:
European Conference on Ambient Intelligence: Workshop on Model Driven Software Engineering for Ambient Intelligence Applications, Darmstadt, Germany