Predicting GEMS Based on PDA-Assessed Gestural Motion Data
Abstract
Music is often argued to be perceived as emotional because it renders expressive movement into audible musical structures. A valid approach to measuring musical emotion could therefore be to assess the movement that music stimulates. This study evaluated the discriminative power of mobile-device acceleration data, generated by free movement during music listening, for predicting different degrees of the nine dimensions of the Geneva Emotional Music Scale (GEMS-9). The variance of the perceived GEMS states explained by the fitted models is as follows: power (r² = .36), sadness (r² = .17), tenderness (r² = .14), joy (r² = .15), tension (r² = .11), peacefulness (r² = .08), nostalgia (r² = .07), wonder (r² = .07), and transcendence (r² = .02). These findings contribute to understanding how acceleration data can be used to integrate embodied music cognition into Music Recommender Systems.