Cognitive Human-Computer Interaction

Human–Computer Interaction (HCI) is the study of the interaction between users and computers. It brings computer science together with the behavioral sciences, and the interaction takes place at the user interface, which includes both software and hardware.

We propose novel methods and algorithms for time-series analysis that can be applied to real-time brain-state classification from the electroencephalogram (EEG) and integrated into real-time EEG-enabled systems. The current approach to developing brain-state recognition algorithms is to propose new features (including non-linear ones) and feature extraction algorithms, and/or to learn features with deep learning techniques and study different neural network architectures to improve the accuracy of brain-state recognition. The proposed algorithms can be used to optimize human-computer and human-machine interfaces, and in rehabilitation systems, entertainment, robotics, etc.
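As a minimal sketch of such a pipeline, the example below extracts band-power features with Welch's method and feeds them to a standard classifier; the sampling rate, frequency bands, window length, and choice of SVM classifier are illustrative assumptions, not the exact algorithms described above.

    # Sketch of a brain-state classification pipeline: band-power features
    # (Welch's method) fed to a standard classifier. All parameters below
    # are illustrative assumptions.
    import numpy as np
    from scipy.signal import welch
    from sklearn.svm import SVC
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    FS = 128  # assumed sampling rate in Hz
    BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

    def band_power_features(window: np.ndarray) -> np.ndarray:
        """window: (n_channels, n_samples) EEG segment -> flat feature vector."""
        freqs, psd = welch(window, fs=FS, nperseg=min(256, window.shape[-1]), axis=-1)
        feats = []
        for lo, hi in BANDS.values():
            mask = (freqs >= lo) & (freqs < hi)
            feats.append(psd[:, mask].mean(axis=-1))  # mean band power per channel
        return np.concatenate(feats)

    # Synthetic example: 2-second windows from 8 channels, two brain states.
    rng = np.random.default_rng(0)
    X = np.stack([band_power_features(rng.standard_normal((8, 2 * FS)))
                  for _ in range(100)])
    y = rng.integers(0, 2, size=100)  # placeholder state labels

    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    clf.fit(X, y)
    print("predicted state:", clf.predict(X[:1]))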


Cognitive Abilities Enhancement

We developed novel algorithms and techniques to assess cognitive abilities from EEG and to improve them using 3-D neurofeedback games.

Human Factors Evaluation Using EEG

Human emotions, mental workload, stress, and attention are recognized from EEG in real time.

Affective Computing

We have extensive expertise in EEG-based emotional state recognition and have developed novel emotion recognition algorithms.

CHCI Lab@NTU

Visit to the Cognitive Human-Computer Interaction (CHCI) Lab@NTU.

Cognitive Abilities Enhancement using Neurofeedback.
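As a minimal sketch of the neurofeedback idea, the loop below derives an attention-like index from EEG band power and uses it to adapt a game parameter; the beta/theta ratio, clipping range, and smoothing constant are illustrative assumptions, not the lab's actual protocol.

    # Sketch of a neurofeedback loop: an attention-like index derived from
    # EEG band power drives a parameter of a 3-D game. The index definition
    # and constants are assumptions for illustration.
    import numpy as np
    from scipy.signal import welch

    FS = 128  # assumed sampling rate in Hz

    def attention_index(window: np.ndarray) -> float:
        """Crude attention proxy: beta power divided by theta power."""
        freqs, psd = welch(window, fs=FS, nperseg=window.shape[-1], axis=-1)
        theta = psd[:, (freqs >= 4) & (freqs < 8)].mean()
        beta = psd[:, (freqs >= 13) & (freqs < 30)].mean()
        return float(beta / theta)

    def neurofeedback_step(window: np.ndarray, game_speed: float, alpha: float = 0.1) -> float:
        """Smoothly adapt a game parameter (e.g. speed) toward the current index."""
        target = np.clip(attention_index(window), 0.0, 2.0)
        return (1 - alpha) * game_speed + alpha * target

    # Simulated 1-second windows from 8 channels standing in for a live EEG stream.
    rng = np.random.default_rng(1)
    speed = 1.0
    for _ in range(5):
        speed = neurofeedback_step(rng.standard_normal((8, FS)), speed)
        print(f"game speed -> {speed:.3f}")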


Human Factors Evaluation Using Mobile BCI

Neuroscience-based or neuroscience-informed design is a new area of BCI application. It has its roots in the study of human well-being in architecture and in human factors studies in engineering and manufacturing. We proposed a mobile EEG-based system to monitor and analyse human factors measurements of newly designed systems, hardware and/or workplaces. The EEG is used as a tool to monitor and record the brain states of subjects during human factors experiments. In traditional human factors studies, data on mental workload, stress, emotion and vigilance are obtained through questionnaires administered upon completion of a task or of the whole experiment. However, this method only captures the subjects' overall feelings during task performance and/or after the experiment. Real-time EEG-based human factors evaluation allows researchers to analyse the changes in subjects' brain states during the performance of various tasks. Machine learning techniques are applied to the EEG data to recognize levels of mental workload, stress, emotion and vigilance during each task. By using the proposed EEG-based system, a true understanding of the subjects' working patterns can be obtained. Based on analyses of the objective real-time data together with the subjective feedback from the subjects, we are able to reliably evaluate current systems, hardware and/or workplace designs and refine concepts of future systems.
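As an illustration of how per-task EEG-based estimates could be set alongside questionnaire ratings, the short sketch below averages a stream of workload estimates over task intervals; the workload values, task markers, and questionnaire scores are invented placeholders, not measured data.

    # Illustrative sketch: summarise per-second EEG-based workload estimates
    # per task and print them next to post-experiment questionnaire ratings.
    from statistics import mean

    # Continuous workload estimates (one per second) from a real-time EEG classifier.
    workload_stream = [0.2, 0.3, 0.4, 0.8, 0.9, 0.7, 0.3, 0.2]
    # Task markers: (task name, start second, end second).
    tasks = [("baseline", 0, 3), ("assembly", 3, 6), ("rest", 6, 8)]
    # Post-experiment questionnaire ratings on a 0-1 scale (placeholder values).
    questionnaire = {"baseline": 0.2, "assembly": 0.8, "rest": 0.3}

    for name, start, end in tasks:
        eeg_mean = mean(workload_stream[start:end])
        print(f"{name:9s} EEG workload {eeg_mean:.2f} | questionnaire {questionnaire[name]:.2f}")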

Affective Computing

We have extensive expertise in EEG-based emotional state recognition and have developed novel emotion recognition algorithms. Our EEG-based emotion recognition algorithms can recognize up to eight emotional states: ‘satisfied’, ‘happy’, ‘surprised’, ‘protected’, ‘sad’, ‘unconcerned’, ‘angry’, and ‘fear’, with adequate accuracy. Additionally, different levels of valence are recognized, from extremely negative to extremely positive. The proposed algorithms can be used in music therapy, music players, games, human factors studies, neuromarketing, etc.
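As a minimal sketch of how discrete emotion labels might be derived from continuous valence and arousal estimates, the function below uses a quadrant-plus-intensity mapping; the thresholds and the assignment of labels to quadrants are illustrative assumptions, not the published algorithms.

    # Sketch: map continuous valence/arousal estimates (e.g. from an EEG
    # classifier) to eight coarse emotion labels. Thresholds and label
    # placement are illustrative assumptions.
    def label_emotion(valence: float, arousal: float) -> str:
        """valence, arousal in [-1, 1]; returns one of eight coarse labels."""
        if valence >= 0:
            if arousal >= 0:
                return "happy" if arousal > 0.5 else "satisfied"
            return "protected" if arousal > -0.5 else "unconcerned"
        if arousal >= 0:
            return "angry" if arousal > 0.5 else "surprised"
        return "sad" if arousal > -0.5 else "fear"

    for v, a in [(0.8, 0.7), (0.6, -0.8), (-0.7, 0.9), (-0.5, -0.9)]:
        print(f"valence={v:+.1f} arousal={a:+.1f} -> {label_emotion(v, a)}")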