B-Interface 2010 Abstracts


Full Papers
Paper Nr: 7
Title:

Identifying Psychophysiological Correlates of Boredom and Negative Mood Induced During HCI

Authors:

Dimitris Giakoumis, Athanasios Vogiannou, Illka Kosunen, Kostantinos Moustakas, Dimitrios Tzovaras and George Hassapis

Abstract: This paper presents work conducted towards the automatic recognition of negative emotions such as boredom and frustration, induced by the subject’s loss of interest during HCI. The focus was on the basic prerequisite for the future development of systems utilizing an “affective loop”, namely effective recognition of the human affective state. Based on the concept of “repetition that causes loss of interest”, an experiment for the monitoring and analysis of biosignals during repetitive HCI tasks was deployed. During this experiment, subjects were asked to play a simple labyrinth-based 3D video game repeatedly, while biosignals from different modalities were monitored. Twenty-one subjects participated in the experiment, allowing a rich biosignal database to be compiled. Statistically significant correlations were identified between features extracted from two of the modalities used in the experiment (ECG and GSR) and the actual affective state of the subjects.
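
The kind of analysis this abstract describes can be illustrated with a small sketch (not the authors' actual pipeline; feature choices and the correlation measure are illustrative assumptions): extract a simple ECG feature (mean heart rate) and a simple GSR feature (skin conductance response count) per game repetition, then correlate them with self-reported affect ratings.

```python
# Illustrative sketch of per-repetition biosignal features and their
# correlation with affect ratings; not the paper's actual method.
from statistics import mean, stdev

def mean_heart_rate(rr_intervals_s):
    """Mean heart rate (beats/min) from ECG R-R intervals in seconds."""
    return 60.0 / mean(rr_intervals_s)

def scr_count(gsr, threshold=0.05):
    """Count skin conductance responses: sample-to-sample rises
    exceeding `threshold` (in microsiemens)."""
    return sum(1 for a, b in zip(gsr, gsr[1:]) if b - a > threshold)

def pearson(x, y):
    """Pearson correlation between a feature series and affect ratings."""
    mx, my = mean(x), mean(y)
    n = len(x)
    return (sum((a - mx) * (b - my) for a, b in zip(x, y))
            / ((n - 1) * stdev(x) * stdev(y)))
```

A statistically significant `pearson` value between, say, per-repetition `mean_heart_rate` and boredom ratings is the sort of correlation the study reports.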

Paper Nr: 11
Title:

The Smart Sensor Integration Framework and its Application in EU Projects

Authors:

Johannes Wagner, Frank Jung, Jonghwa Kim, Thurid Vogt and Elisabeth André

Abstract: Affect sensing by machines is an essential part of next-generation human-computer interaction (HCI). However, despite the large effort devoted to this field during the last decades, only a few applications exist that can react to a user's emotion in real time. This is partly because emotion recognition is a challenging task in itself, and partly because most effort has so far been put towards offline analysis. In response to this deficit we have developed a framework called Smart Sensor Integration (SSI), which considerably jump-starts the development of multimodal online emotion recognition (OER) systems. In this paper, we introduce the SSI framework and describe how it is successfully applied in different projects funded by the European Union, namely the CALLAS and METABO projects and the IRIS network.

Paper Nr: 12
Title:

A Spectral Mapping Method for EMG-based Recognition of Silent Speech

Authors:

Matthias Janke, Michael Wand and Tanja Schultz

Abstract: This paper reports on our latest study on speech recognition based on surface electromyography (EMG). This technology allows for Silent Speech Interfaces since EMG captures the electrical potentials of the human articulatory muscles rather than the acoustic speech signal. Therefore, our technology enables speech recognition to be applied to silently mouthed speech. Earlier experiments indicate that the EMG signal is greatly impacted by the mode of speaking. In this study we analyze and compare EMG signals from audible, whispered, and silent speech. We quantify the differences and develop a spectral mapping method to compensate for these differences. Finally, we apply the spectral mapping to the front-end of our speech recognition system and show that recognition rates on silent speech improve by up to 12.3% relative.
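
One simple way to realize the kind of spectral compensation this abstract describes (a hypothetical sketch, not the authors' actual mapping) is to estimate a per-frequency-bin scale factor from mode-averaged magnitude spectra of silent and audible speech, then apply it to new silent-speech frames in the recognizer front-end.

```python
# Hypothetical per-bin spectral mapping between speaking modes;
# illustrative only, not the method evaluated in the paper.
from statistics import mean

def bin_wise_mapping(silent_frames, audible_frames, eps=1e-9):
    """Per-bin scale factors that map average silent-mode magnitude
    spectra toward average audible-mode spectra."""
    n_bins = len(silent_frames[0])
    avg_silent = [mean(f[i] for f in silent_frames) for i in range(n_bins)]
    avg_audible = [mean(f[i] for f in audible_frames) for i in range(n_bins)]
    return [a / (s + eps) for s, a in zip(avg_silent, avg_audible)]

def apply_mapping(frame, scale):
    """Compensate one magnitude-spectrum frame with the learned factors."""
    return [v * g for v, g in zip(frame, scale)]
```

Under this sketch, silent-speech frames are rescaled bin by bin before feature extraction, so the recognizer sees spectra closer to those of the audible training mode.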

Short Papers
Paper Nr: 3
Title:

Facial Features’ Localization using a Morphological Operation

Authors:

Kenz A. Bozed, Ali Mansour and Osei Adjei

Abstract: Facial feature localization is an important part of various applications such as face recognition, facial expression detection and human-computer interaction. It plays an essential role in human face analysis, especially in searching for facial features (mouth, nose and eyes) when the face region is included within the image. Most of these applications require face and facial feature detection algorithms. In this paper, a new method is proposed to locate facial features. A morphological operation is used to locate the pupils of the eyes, and the mouth position is estimated relative to them. Once the features are located, their boundaries are computed. The results obtained from this work indicate that the algorithm is very successful in recognising different types of facial expressions.
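
As a generic illustration of morphological pupil localization (not the paper's specific algorithm; threshold and structuring element are assumed), dark blobs such as pupils can be isolated by thresholding a grayscale image and applying binary erosion, then taking blob centroids.

```python
# Illustrative morphological localization of dark blobs (e.g., pupils);
# not the algorithm proposed in the paper.

def threshold(img, t):
    """Binary mask of pixels darker than t (pupils are dark regions)."""
    return [[1 if v < t else 0 for v in row] for row in img]

def erode(mask):
    """3x3 binary erosion: keep a pixel only if its whole 3x3
    neighbourhood is set, removing thin noise and shrinking blobs."""
    h, w = len(mask), len(mask[0])
    out = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            if all(mask[y + dy][x + dx]
                   for dy in (-1, 0, 1) for dx in (-1, 0, 1)):
                out[y][x] = 1
    return out

def centroid(mask):
    """Centroid (x, y) of the remaining foreground pixels, or None."""
    pts = [(x, y) for y, row in enumerate(mask)
           for x, v in enumerate(row) if v]
    n = len(pts)
    if n == 0:
        return None
    return (sum(x for x, _ in pts) / n, sum(y for _, y in pts) / n)
```

With both pupil centroids found this way, a mouth position could then be estimated geometrically relative to the inter-pupil axis, as the abstract describes.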

Paper Nr: 8
Title:

Start and End Point Detection of Weightlifting Motion using CHLAC and MRA

Authors:

Fumito Yoshikawa, Takumi Kobayashi, Kenji Watanabe, Katsuyoshi Shirai and Nobuyuki Otsu

Abstract: Extracting human motion segments of interest from image sequences is essential for quantitative analysis and effective video browsing, but doing so manually requires laborious human effort. In the analysis of sports motion such as weightlifting, the start and end of each lifting motion must be detected automatically, ideally across different camera viewing angles. This paper describes a weightlifting motion detection method employing cubic higher-order local auto-correlation (CHLAC) and multiple regression analysis (MRA). The method extracts spatio-temporal motion features and learns the relationship between the features and the specific motion, without prior knowledge about the objects involved. To demonstrate the effectiveness of our method, an experiment was conducted on data captured from eight different viewpoints in practical situations. The detection rates for the start and end motions were more than 94% over 140 sequences in total, even across different viewing angles, reaching 100% for some angles.

Paper Nr: 9
Title:

Prerequisites for Affective Signal Processing (ASP) – Part IV

Authors:

Egon L. van den Broek, Marjolein D. van der Zwaag, Jennifer A. Healey, Joris H. Janssen and Joyce H. D. M. Westerink

Abstract: In [1–3], a series of prerequisites for affective signal processing (ASP) was defined: validation (e.g., mapping of constructs on signals), triangulation, a physiology-driven approach, contributions of the signal processing community, identification of users, theoretical specification, integration of biosignals, and physical characteristics. This paper defines three additional prerequisites: historical perspective, temporal construction, and real-world baselines.

Paper Nr: 10
Title:

Biometrics for Emotion Detection (BED): Exploring the combination of Speech and ECG

Authors:

Marleen H. Schut, Kees Tuinenbreijer, Egon L. van den Broek and Joyce H. D. M. Westerink

Abstract: The paradigm Biometrics for Emotion Detection (BED) is introduced, which enables unobtrusive emotion recognition, taking into account varying environments. It uses the electrocardiogram (ECG) and speech, a powerful but rarely used combination, to unravel people's emotions. BED was applied in two environments (i.e., office and home-like) in which 40 people watched 6 film scenes. It is shown that both heart rate variability (derived from the ECG) and, when people's gender is taken into account, the standard deviation of the fundamental frequency of speech indicate people's experienced emotions. As such, these measures validate each other. Moreover, it is found that people's environment can indeed influence experienced emotions. These results indicate that BED might become an important paradigm for unobtrusive emotion detection.
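
The two measures this abstract pairs can be sketched under their standard definitions (a minimal illustration, not the authors' exact feature extraction): heart rate variability as the standard deviation of R-R intervals (SDNN), and the standard deviation of the speech fundamental frequency over voiced frames.

```python
# Minimal sketch of the two features named in the abstract, using
# standard definitions; preprocessing details are assumptions.
from statistics import stdev

def sdnn(rr_intervals_ms):
    """SDNN: standard deviation of normal-to-normal R-R intervals (ms),
    a common time-domain heart rate variability measure."""
    return stdev(rr_intervals_ms)

def f0_std(f0_track_hz):
    """Standard deviation of fundamental frequency estimates (Hz);
    zero-valued (unvoiced) frames are skipped."""
    voiced = [f for f in f0_track_hz if f > 0]
    return stdev(voiced)
```

In a BED-style setup, both statistics would be computed per film scene and compared against the emotions participants reported.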

Paper Nr: 13
Title:

Contagion of Physiological Correlates of Emotion between Performer and Audience: An Exploratory Study

Authors:

Javier Jaimovich, Niall Coghlan and R. Benjamin Knapp

Abstract: Musical and performance experiences are often described as evoking powerful emotions, both in the listener/observer and the player/performer. There is a significant body of literature describing these experiences, along with related work examining physiological changes in the body during music listening and the physiological correlates of emotional state. However, there are still open questions as to how and why emotional responses may be triggered by a performance, how audiences may be influenced by a performer's mental or emotional state, and what effect the presence of an audience has on performers. We present a pilot study and some initial findings of our investigations into these questions, utilising a custom software and hardware system we have developed. Although this research is still at a pilot stage, our initial experiments point towards significant correlations between the physiological states of performers and audiences; here we present the system, the experiments and our preliminary data.

Paper Nr: 15
Title:

Motion and Single-trial Biosignal Analysis Platform for Monitoring of Rehabilitation

Authors:

Perttu Ranta-aho, Stefanos Georgiadis, Timo Bragge, Eini Niskanen, Mika P. Tarvainen, Ina M. Tarkka and Pasi A. Karjalainen

Abstract: Three-dimensional motion analysis is a powerful tool for the assessment of human movements during different rehabilitation applications. An adaptive virtual reality rehabilitation environment which is based on modern motion and biosignal analysis techniques is described.

Paper Nr: 16
Title:

AffectPhone: A Handset Device to Present User’s Emotional State with Warmth/Coolness

Authors:

Ken Iwasaki, Takashi Miyaki and Jun Rekimoto

Abstract: We developed AffectPhone, a system that detects a user's emotional state using the GSR and conveys this state via changes in the temperature (warmth or coolness) of the back panel of the other handset. Since GSR is a good measure of a user's level of arousal, we measure it using electrodes attached to the sides of the handset. When the user's level of arousal increases or decreases, a Peltier module in the back panel of the other device generates warmth or coolness. This system does not require special sensors to be attached to the user's body and therefore does not interrupt the user's daily use of the mobile phone. Moreover, the system conveys non-verbal information in an ambient manner, and may therefore be more efficient than displays or speakers. This system is expected to help enhance existing telecommunication.
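
The arousal-to-temperature coupling could be realized by a mapping of the following shape (a hypothetical sketch; the function name, gain, and clamping are illustrative and not taken from AffectPhone): a deviation of GSR from a per-user baseline is scaled into a bounded Peltier drive level, positive for warmth and negative for coolness.

```python
# Hypothetical GSR-to-Peltier mapping; gain and limits are
# illustrative assumptions, not AffectPhone's actual parameters.

def arousal_to_peltier(baseline_gsr, current_gsr, gain=10.0, limit=1.0):
    """Map GSR deviation from baseline (microsiemens) to a Peltier
    drive level clamped to [-limit, +limit]; positive means warmth."""
    drive = gain * (current_gsr - baseline_gsr)
    return max(-limit, min(limit, drive))
```

Clamping keeps the thermal actuator within a comfortable range regardless of how far arousal drifts from the baseline.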