Cognitive Human-Machine Interaction

Combining computer science with the behavioral sciences, Fraunhofer Singapore develops novel methods and algorithms for real-time brain-state recognition.

The exciting world of Human–Machine Interaction (HMI) is the study of how users interact with machines. Here, computer science combines with the behavioral sciences to shape interaction at the user interface, which includes both software and hardware.

We propose novel methods and algorithms for time-series analysis that can be applied to real-time classification of brain states from electroencephalogram (EEG) signals and integrated into real-time EEG-enabled systems. The current approach to developing brain-state recognition algorithms is to propose new features (including non-linear ones) and feature-extraction algorithms, or to learn features with deep learning techniques, and to study different neural network architectures to improve recognition accuracy. The proposed algorithms can be used to optimise human-machine interfaces in rehabilitation systems, entertainment, robotics and more.
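To illustrate what such a pipeline looks like in principle, the sketch below extracts classic band-power features from a single EEG channel and assigns a window to a brain state with a nearest-centroid rule. The sampling rate, band boundaries, state names and classifier are illustrative assumptions, not the algorithms described above.

```python
import numpy as np

FS = 128  # sampling rate in Hz (assumed)
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(window, fs=FS):
    """Mean spectral power per EEG band for one single-channel window."""
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(window)) ** 2 / len(window)
    return np.array([psd[(freqs >= lo) & (freqs < hi)].mean()
                     for lo, hi in BANDS.values()])

def nearest_centroid(features, centroids):
    """Assign the window to the brain state whose feature centroid is closest."""
    return min(centroids, key=lambda s: np.linalg.norm(features - centroids[s]))

# Demo: a 1-second window dominated by 10 Hz (alpha-band) activity
t = np.arange(FS) / FS
features = band_powers(np.sin(2 * np.pi * 10 * t))
print(list(BANDS)[features.argmax()])  # -> alpha
```

In a real-time system the same two steps would run on each incoming EEG window, with centroids (or a trained classifier) calibrated per user.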


Areas of Expertise


Cognitive Abilities Enhancement

Developing novel algorithms and techniques to assess cognitive abilities from EEG and to track improvement using 3-D neurofeedback games.


Human Factors Evaluation

Using EEG to quantify human emotions, mental workload, stress, and attention in real time.


Affective Computing

Experts in EEG-based emotional state recognition, Fraunhofer Singapore develops novel emotion-recognition algorithms.

Human Factors Evaluation Using Mobile BCI

Neuroscience-based or neuroscience-informed design is a new area of BCI application. It has its roots in the study of human well-being in architecture and in human factors research in engineering and manufacturing. We propose a mobile EEG-based system to monitor and analyse human factors measurements of newly designed systems, hardware and/or workplaces. The EEG is used as a tool to monitor and record the brain states of subjects during human factors experiments.

In traditional human factors studies, data on mental workload, stress, emotion and vigilance are obtained through questionnaires administered after a task or after the whole experiment. However, this method only captures subjects' overall impressions of the task performance and/or the experiment. Real-time EEG-based human factors evaluation instead allows researchers to analyse the changes in subjects' brain states during the performance of each task. Machine learning techniques are applied to the EEG data to recognize levels of mental workload, stress, emotion and vigilance during each task.

By utilizing the proposed EEG-based system, a true understanding of subjects' working patterns can be obtained. Based on analyses of the objective real-time data together with the subjective feedback from the subjects, we are able to reliably evaluate current systems, hardware and/or workplace designs and refine concepts for future systems.
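The key difference from questionnaires is that the EEG is processed continuously, in overlapping windows, so a measurement is available at every point in the task. The sketch below produces such a timeline using a toy workload proxy (a beta/alpha band-power ratio, a common heuristic); the window lengths, sampling rate and index are illustrative assumptions, not the system's actual measures.

```python
import numpy as np

FS = 128  # sampling rate in Hz (assumed)

def sliding_windows(signal, fs, win_sec=2.0, step_sec=1.0):
    """Yield (time, window) pairs so brain states can be tracked throughout
    the task instead of via a single post-hoc questionnaire score."""
    win, step = int(win_sec * fs), int(step_sec * fs)
    for start in range(0, len(signal) - win + 1, step):
        yield start / fs, signal[start:start + win]

def workload_index(window, fs=FS):
    """Toy workload proxy: beta- over alpha-band power (an assumption,
    not the actual workload-recognition model)."""
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(window)) ** 2
    alpha = psd[(freqs >= 8) & (freqs < 13)].sum()
    beta = psd[(freqs >= 13) & (freqs < 30)].sum()
    return beta / (alpha + 1e-12)

# 10 s of fake single-channel EEG -> one workload estimate per second
eeg = np.random.default_rng(1).normal(size=10 * FS)
timeline = [(t, workload_index(w)) for t, w in sliding_windows(eeg, fs=FS)]
print(len(timeline))  # -> 9
```

In practice the per-window index would be replaced by a trained classifier outputting discrete workload, stress, emotion or vigilance levels.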



Affective Computing

We have extensive expertise in EEG-based emotional state recognition and have developed novel emotion-recognition algorithms. Our EEG-based algorithms recognize up to eight emotional states: 'satisfied', 'happy', 'surprised', 'protected', 'sad', 'unconcerned', 'angry' and 'fear', with adequate accuracy. Additionally, different levels of valence are recognized, from extreme negative to extreme positive. The proposed algorithms can be used in music therapy, music players, games, human factors studies, neuromarketing, etc.
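The valence dimension can be illustrated with a classic EEG feature, frontal alpha asymmetry, discretized into levels from extreme negative to extreme positive. The feature choice, the scaling and the number of levels in the sketch below are assumptions for illustration, not the proprietary recognition algorithms.

```python
import numpy as np

def alpha_asymmetry(left_alpha, right_alpha):
    """Frontal alpha asymmetry; larger right-minus-left log alpha power is
    commonly associated with more positive valence (a heuristic)."""
    return np.log(right_alpha) - np.log(left_alpha)

def valence_level(score, n_levels=5, scale=1.0):
    """Discretize a valence score into levels 0 (extreme negative)
    through n_levels - 1 (extreme positive)."""
    x = np.clip(score / scale, -1.0, 1.0)   # normalise to [-1, 1]
    return int(round((x + 1.0) / 2.0 * (n_levels - 1)))

# Equal alpha power in both hemispheres -> neutral mid level
print(valence_level(alpha_asymmetry(1.0, 1.0)))  # -> 2
```

A full emotion recognizer would combine many such features across channels and map them to the discrete emotional states via a trained classifier.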


An EEG dataset for Stable Affective Feature Selection

An affective brain-computer interface (aBCI) is a direct communication pathway between the human brain and a computer, via which the computer tries to recognize the affective states of its user and respond accordingly. As aBCI introduces personal affective factors into human-computer interaction, it could potentially enrich the user's experience when interacting with a computer.

Successful emotion recognition plays a key role in such a system. State-of-the-art aBCIs leverage machine learning techniques, which involve acquiring affective electroencephalogram (EEG) signals from the user and calibrating a classifier to the user's affective patterns. Many studies have reported satisfactory recognition accuracy with this paradigm. However, affective neural patterns are volatile over time, even for the same subject, and recognition accuracy cannot be maintained during prolonged aBCI use without recalibration. Existing studies have overlooked the evaluation of aBCI performance during long-term use.

We propose a dataset that includes, for each subject, multiple recording sessions spanning several days, so that the long-term recognition performance of aBCIs can be evaluated. Based on this dataset, we demonstrate that the recognition accuracy of aBCIs deteriorates when recalibration is ruled out during long-term usage. We then propose a stable feature selection method that chooses the most stable affective features, mitigating the accuracy deterioration and maximizing aBCI performance in the long run. We invite other researchers to test their aBCI algorithms on this dataset, and especially to evaluate their long-term performance.
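One simple notion of cross-session stability consistent with this goal can be sketched as follows: score each feature by how little its per-session mean drifts relative to its total variance, and keep the top-k features. The scoring rule and the synthetic data below are illustrative assumptions, not the proposed selection method.

```python
import numpy as np

def stability_scores(sessions):
    """sessions: array of shape (n_sessions, n_trials, n_features).
    Score each feature by how little its session means drift across days,
    relative to its total variance (1 = perfectly stable, 0 = all drift)."""
    session_means = sessions.mean(axis=1)            # (n_sessions, n_features)
    between = session_means.var(axis=0)              # drift across sessions
    total = sessions.reshape(-1, sessions.shape[-1]).var(axis=0)
    return 1.0 - between / (total + 1e-12)

def select_stable(sessions, k):
    """Indices of the k most stable features."""
    return np.argsort(stability_scores(sessions))[::-1][:k]

# Synthetic demo: feature 0 drifts across 3 recording days, feature 1 does not
rng = np.random.default_rng(0)
drifting = rng.normal(np.array([0.0, 3.0, 6.0]).reshape(3, 1, 1), 1.0, size=(3, 50, 1))
stable = rng.normal(0.0, 1.0, size=(3, 50, 1))
X = np.concatenate([drifting, stable], axis=2)       # (3 sessions, 50 trials, 2 features)
print(select_stable(X, 1))  # -> [1]
```

Restricting the classifier to such stable features trades a little within-session accuracy for much slower deterioration across days.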

Download dataset  

Join the Team

Explore the latest offers in career and scholarship opportunities.

Business Opportunities

Make us your preferred support partner for industry and business.

  • Contract research and development
  • Licensing of technologies
  • Consulting services and studies
  • Training and services

Academic Collaborations

Partner with us in academic and research collaborations.