@inproceedings{Blumendorf:2010:DUI,
 author = {Blumendorf, Marco and Roscher, Dirk and Albayrak, Sahin},
 title = {Dynamic user interface distribution for flexible multimodal interaction},
 booktitle = {International Conference on Multimodal Interfaces and the Workshop on Machine Learning for Multimodal Interaction},
 series = {ICMI-MLMI '10},
 year = {2010},
 isbn = {978-1-4503-0414-6},
 location = {Beijing, China},
 pages = {20:1--20:8},
 articleno = {20},
 numpages = {8},
 url = {http://doi.acm.org/10.1145/1891903.1891930},
 doi = {10.1145/1891903.1891930},
 acmid = {1891930},
 publisher = {ACM},
 address = {New York, NY, USA},
 keywords = {distributed user interfaces, human-computer interaction, model-based user interfaces, multimodal interaction, smart home environments},
 abstract = {The availability of numerous networked interaction devices within smart environments makes it possible to exploit these devices for innovative and more natural interaction. In our work we use TVs with remote controls, picture frames, mobile phones, touch screens, stereos, and PCs to create multimodal user interfaces. Combining the interaction capabilities of the different devices makes it possible to achieve an interaction better suited to the current situation. Changing situations can then require the dynamic redistribution of the created interfaces and the alteration of the modalities and devices in use to sustain the interaction. In this paper we describe our approach for dynamically (re-)distributing user interfaces at run-time. A distribution component determines the devices for the interaction based on the (changing) environment situation and the user interface requirements. The component provides possibilities for the application developer and the user to influence the distribution according to their needs. A user interface model describes the interaction and the modality relations according to the CARE properties (Complementarity, Assignment, Redundancy, and Equivalence), and a context model gathers and provides information about the environment.},
}