Voice-based assistance systems are already widely established as an interaction modality in Smart Spaces. However, it has repeatedly been shown that these systems do not comply with data protection and privacy regulations: speech data is sent to the cloud, where it is collected and analyzed to improve speech recognition, a practice that is highly questionable under data protection and privacy law. Furthermore, these cloud-based voice assistants always require an internet connection.
To preserve privacy and general user acceptance, new forms and modalities of interaction must be developed. One promising modality is gesture interaction, in particular micro finger gestures, which are simple and inconspicuous to execute. We are studying and developing interaction devices and methods that allow these micro finger gestures to be recognized, along with algorithms that run on the interaction device itself or on a mobile computer such as a smartphone. The result is an interactive system that complies with data protection regulations. We call this novel gesture-based form of interaction invisible interaction.
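To illustrate the on-device idea, the following is a minimal sketch of local gesture classification. The gesture names ("tap", "swipe"), the accelerometer-based features, and the nearest-centroid classifier are illustrative assumptions, not the actual devices or methods developed in our projects; the point is only that recognition can run entirely on the device, with no data leaving it.

```python
# Hypothetical sketch: on-device micro-gesture recognition with no cloud round trip.
# Features, gesture names, and classifier are illustrative assumptions.
import math

def features(window):
    """Mean and peak acceleration magnitude of a 3-axis sensor window."""
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in window]
    return (sum(mags) / len(mags), max(mags))

class CentroidGestureClassifier:
    """Nearest-centroid classifier; all computation stays on the device."""
    def __init__(self):
        self.centroids = {}

    def fit(self, labelled_windows):
        sums = {}
        for label, window in labelled_windows:
            mean, peak = features(window)
            s = sums.setdefault(label, [0.0, 0.0, 0])
            s[0] += mean
            s[1] += peak
            s[2] += 1
        self.centroids = {lab: (s[0] / s[2], s[1] / s[2])
                          for lab, s in sums.items()}

    def predict(self, window):
        mean, peak = features(window)
        return min(self.centroids,
                   key=lambda lab: (self.centroids[lab][0] - mean) ** 2
                                 + (self.centroids[lab][1] - peak) ** 2)

# Tiny synthetic example: a gentle "tap" vs. a stronger "swipe".
tap = [(0.1, 0.0, 1.0)] * 10
swipe = [(2.0, 1.5, 1.0)] * 10
clf = CentroidGestureClassifier()
clf.fit([("tap", tap), ("swipe", swipe)])
print(clf.predict([(0.2, 0.1, 1.0)] * 10))  # prints "tap"
```

Because training and inference use only arithmetic on locally sampled sensor data, such a pipeline fits on a microcontroller or smartphone and never transmits raw interaction data.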
Current projects addressing questions of invisible interaction are the DFG project eRing and the BMBF-funded project UbiAct.