Executable Models for Human-Computer Interaction: Meta-Modeling Ubiquitous User Interfaces
Abstract
The availability of a wide variety of multimedia and interaction devices, sensors and appliances makes the development of user interfaces for smart environments a challenging and time-consuming task. With standard modeling languages and technologies such as HTML, it is impossible to create context-adaptive, multimodal, multi-user and multi-device user interfaces that assist the user in a smart environment. The high heterogeneity and dynamics of smart environments call for a new type of modeling language that enables the creation of truly ubiquitous applications. This work addresses the design of executable user interface languages and the use of user interface models at runtime. Enhanced with runtime concepts, the user interface models presented in this work enable reasoning about the user interface, its state and the designer's decisions at runtime at a high level of abstraction, and provide means for their consistent modification. An explicit definition of the design-time rationale, the state information and the execution logic in the metamodels enables dynamic adaptation of model-based user interfaces to contexts that are unforeseeable at design time.