ISSN (print) 0914-4935
ISSN (online) 2435-0869
Sensors and Materials is an international peer-reviewed open access journal that provides a forum for researchers working in multidisciplinary fields of sensing technology.
Sensors and Materials is covered by Science Citation Index Expanded (Clarivate Analytics), Scopus (Elsevier), and other databases.

Publisher
 MYU K.K.
 Sensors and Materials
 1-23-3-303 Sendagi,
 Bunkyo-ku, Tokyo 113-0022, Japan
 Tel: 81-3-3827-8549
 Fax: 81-3-3827-8547


Sensors and Materials, Volume 32, Number 6(3) (2020)
Copyright (C) MYU K.K.
pp. 2167-2176
S&M2248 Research Paper of Special Issue
https://doi.org/10.18494/SAM.2020.2832
Published: June 30, 2020

Virtual Keyboard Recognition with e-Textile Sensors

Eun-Ji Ahn, Sang-Ho Han, Mun-Ho Ryu, and Je-Nam Kim

(Received April 15, 2019; Accepted March 12, 2020)

Keywords: gesture recognition, electronic textile, neural network, virtual keyboard

In this study, we propose a gesture recognition method that uses e-textile sensors to recognize the pressing of the numeric keys “0” to “9” on a virtual keyboard. The e-textile sensor comprises a double-layer structure with complementary resistance characteristics and is attached to the garment to obtain a resistance signal. For gesture recognition, we tested dynamic time warping (DTW) and two machine learning techniques, long short-term memory (LSTM) and bidirectional LSTM (BiLSTM). Before applying each machine learning technique, we normalized the data and resized them to a common length. A total of 120 repetitions of each gesture were performed by a single subject. The results indicate that DTW yielded the lowest gesture classification accuracy of 74.2%, while LSTM and BiLSTM achieved 78.8 and 91.6%, respectively.
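
No code accompanies this page; the following is a minimal illustrative sketch, in Python/NumPy, of the kind of pipeline the abstract describes: normalizing each resistance signal, resampling it to a common length, and classifying a gesture with a template-based DTW comparison over the ten keys. The function names, the target length of 100 samples, and the nearest-template decision rule are assumptions made for illustration, not the authors' implementation.

import numpy as np

def preprocess(signal, target_len=100):
    # Min-max normalize a 1-D resistance signal and resample it to a fixed
    # length by linear interpolation (illustrative choice; the paper's exact
    # normalization and resizing steps are not reproduced here).
    signal = np.asarray(signal, dtype=float)
    lo, hi = signal.min(), signal.max()
    if hi > lo:
        signal = (signal - lo) / (hi - lo)
    x_old = np.linspace(0.0, 1.0, len(signal))
    x_new = np.linspace(0.0, 1.0, target_len)
    return np.interp(x_new, x_old, signal)

def dtw_distance(a, b):
    # Classic dynamic time warping distance between two 1-D sequences,
    # computed by dynamic programming over the cumulative cost matrix.
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

# Toy usage: classify a query gesture by its nearest DTW distance to one stored
# template per key; real signals would come from the e-textile sensor.
templates = {digit: preprocess(np.random.rand(150)) for digit in range(10)}
query = preprocess(np.random.rand(130))
predicted = min(templates, key=lambda d: dtw_distance(query, templates[d]))
print("Predicted key:", predicted)

The learned models reported in the abstract (LSTM and BiLSTM) would replace the nearest-template step with a trained sequence classifier operating on the same normalized, fixed-length inputs.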

Corresponding author: Mun-Ho Ryu


Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Cite this article
Eun-Ji Ahn, Sang-Ho Han, Mun-Ho Ryu, and Je-Nam Kim, Virtual Keyboard Recognition with e-Textile Sensors, Sens. Mater., Vol. 32, No. 6, 2020, pp. 2167-2176.



Forthcoming Regular Issues


Forthcoming Special Issues

Special Issue on Advanced Materials on Electronic and Mechanical Devices and their Application on Sensors (5)
Guest editors, Teen-Hang Meen (National Formosa University), Wenbing Zhao (Cleveland State University), and Cheng-Fu Yang (National University of Kaohsiung)


Special Issue on Advances in Shape Memory Materials
Guest editors, Ryosuke Matsui (Aichi Institute of Technology) and Hiroyuki Miki (Tohoku University)


Special Issue on Perceptual Deep Learning in Computer Vision and its Application
Guest editor, Chih-Hsien Hsia (National Ilan University)


Special Issue on Materials, Devices, Circuits, and Analytical Methods for Various Sensors (3)
Guest editors, Chien-Jung Huang (National University of Kaohsiung), Cheng-Hsing Hsu (National United University), Ja-Hao Chen (Feng Chia University), and Wei-Ling Hsu (Huaiyin Normal University)
Conference website


Special Issue on Sensing Technologies and Their Applications (1)
Guest editor, Rey-Chue Hwang (I-Shou University)
Call for papers


Special Issue on New Trends in Smart Sensor Systems
Guest editor, Takahiro Hayashi (Kansai University)
Call for papers


Copyright (C) MYU K.K. All Rights Reserved.