
ISSN (print) 0914-4935
ISSN (online) 2435-0869
Sensors and Materials is an international peer-reviewed open access journal that provides a forum for researchers working in multidisciplinary fields of sensing technology.
Sensors and Materials is covered by Science Citation Index Expanded (Clarivate Analytics), Scopus (Elsevier), and other databases.


Publisher
 MYU K.K.
 Sensors and Materials
 1-23-3-303 Sendagi,
 Bunkyo-ku, Tokyo 113-0022, Japan
 Tel: 81-3-3827-8549
 Fax: 81-3-3827-8547


Sensors and Materials, Volume 32, Number 4(1) (2020)
Copyright (C) MYU K.K.
pp. 1261-1277
S&M2175 Research Paper of Special Issue
https://doi.org/10.18494/SAM.2020.2552
Published: April 10, 2020

Unsupervised Recurrent Neural Network with Parametric Bias Framework for Human Emotion Recognition with Multimodal Sensor Data Fusion

Jie Li, Junpei Zhong, and Min Wang

(Received August 9, 2019; Accepted October 16, 2019)

Keywords: emotion recognition, multimodal sensors, recurrent neural network, subconscious behaviors

In this paper, we present an emotion recognition framework based on a recurrent neural network with parametric bias (RNNPB) to classify six basic human emotions (joy, pride, fear, anger, sadness, and neutral). To capture the expressions from which emotions are recognized, human joint coordinates, angles, and angular velocities are fused during signal preprocessing. A wearable Myo armband and a Kinect sensor are used to collect human joint angular velocities and angles, respectively. The combined structure of these modalities of subconscious behavior is presented to improve the classification performance of the RNNPB. Two comparative experiments were performed to demonstrate that the fused data outperform single-modality sensor data from one person. To investigate the robustness of the proposed framework, we carried out a further experiment with fused data from several people. The recognition results show that the six emotions can be classified using the RNNPB framework. These experimental results verify the effectiveness of the proposed framework.
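To make the described pipeline concrete, the following is a minimal sketch of a recurrent neural network with parametric bias, assuming PyTorch; the feature dimension, parametric-bias dimension, hidden size, and helper names (FEAT_DIM, PB_DIM, RNNPB, train_step) are illustrative assumptions, not the authors' implementation. Training jointly optimizes the network weights and one parametric-bias vector per training sequence by one-step-ahead prediction of the fused sensor features; at recognition time the weights are frozen, only a fresh parametric-bias vector is fitted to the unseen sequence, and the emotion class is assigned by comparison with the trained vectors.

# Minimal RNNPB sketch (assumed PyTorch implementation; dimensions are hypothetical).
import torch
import torch.nn as nn

FEAT_DIM = 12   # assumed: fused joint angles + angular velocities per frame
PB_DIM = 2      # assumed: low-dimensional parametric-bias space
HIDDEN = 32     # assumed hidden-layer size

class RNNPB(nn.Module):
    def __init__(self):
        super().__init__()
        self.rnn = nn.RNN(FEAT_DIM + PB_DIM, HIDDEN, batch_first=True)
        self.out = nn.Linear(HIDDEN, FEAT_DIM)

    def forward(self, seq, pb):
        # seq: (batch, T, FEAT_DIM); pb: (batch, PB_DIM), held constant over time
        pb_t = pb.unsqueeze(1).expand(-1, seq.size(1), -1)
        h, _ = self.rnn(torch.cat([seq, pb_t], dim=-1))
        return self.out(h)  # one-step-ahead prediction of the sensor features

def train_step(model, pb, seq, opt):
    # Unsupervised prediction learning: predict frame t+1 from frames 0..t.
    pred = model(seq[:, :-1], pb)
    loss = nn.functional.mse_loss(pred, seq[:, 1:])
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

# Training: jointly optimize the network weights and one PB vector per sequence.
model = RNNPB()
pb = torch.zeros(1, PB_DIM, requires_grad=True)         # PB vector for one sequence
opt = torch.optim.Adam(list(model.parameters()) + [pb], lr=1e-3)
seq = torch.randn(1, 50, FEAT_DIM)                      # placeholder fused sensor sequence
print(train_step(model, pb, seq, opt))
# Recognition: freeze the weights, fit only a new PB vector to the unseen
# sequence, and classify it by the nearest PB vector learned during training.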

Corresponding author: Min Wang


Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Cite this article
Jie Li, Junpei Zhong, and Min Wang, Unsupervised Recurrent Neural Network with Parametric Bias Framework for Human Emotion Recognition with Multimodal Sensor Data Fusion, Sens. Mater., Vol. 32, No. 4, 2020, pp. 1261-1277.



Forthcoming Regular Issues


Forthcoming Special Issues

Special Issue on International Conference on BioSensors, BioElectronics, BioMedical Devices, BioMEMS/NEMS and Applications 2019 (Bio4Apps 2019) (1)
Guest editors, Hirofumi Nogami and Masaya Miyazaki (Kyushu University)


Special Issue on Optical Sensors: Novel Materials, Approaches, and Applications
Guest editor, Yap Wing Fen (Universiti Putra Malaysia)


Special Issue on Intelligent Sensing Control Analysis, Optimization, and Automation (2)
Guest editor, Cheng-Chi Wang (National Chin-Yi University of Technology)


Special Issue on Geomatics Technologies for the Realization of Smart Cities (2)
Guest editors, He Huang and XiangLei Liu (Beijing University of Civil Engineering and Architecture)


Special Issue on Cyber–Physical Systems (CPS) and Internet of Things (IoT)
Guest editor, Yutaka Arakawa (Kyushu University)


Special Issue on Sensors and Materials Emerging Investigators in Japan
Guest editor, Tsuyoshi Minami (The University of Tokyo)


Copyright (C) MYU K.K. All Rights Reserved.