
ISSN (print) 0914-4935
ISSN (online) 2435-0869

Sensors and Materials is an international peer-reviewed open-access journal that provides a forum for researchers working in multidisciplinary fields of sensing technology. Sensors and Materials is covered by Science Citation Index Expanded (Clarivate Analytics), Scopus (Elsevier), and other databases.


Publisher
 MYU K.K.
 Sensors and Materials
 1-23-3-303 Sendagi,
 Bunkyo-ku, Tokyo 113-0022, Japan
 Tel: 81-3-3827-8549
 Fax: 81-3-3827-8547


Sensors and Materials, Volume 32, Number 10(3) (2020)
Copyright (C) MYU K.K.
pp. 3517-3530
S&M2355 Research Paper of Special Issue
https://doi.org/10.18494/SAM.2020.2933
Published: October 30, 2020

Computed Tomography Image Recognition with Convolutional Neural Network Using Wearable Sensors

Yuqing He, Lei Lei, Guangsong Yang, Chih-Cheng Chen, Christopher Chun Ki Chan, and Kuei-Kuei Lai

(Received April 22, 2020; Accepted August 12, 2020)

Keywords: convolutional neural network, CT image recognition, diagnosis support, overfitting

We propose a modified convolutional neural network (CNN) tailored to computed tomography (CT) image disease recognition to assist doctors in diagnosis. First, we analyze how varying the activation function and pooling parameters affects the CNN’s performance on a single data set. Second, we address the activation error that arises when the sample size is increased by preprocessing the images with an enhancement technique, adjusting the activation function and initialization weighting, training and testing on the target data, and adaptively extracting features. Together, these techniques alleviate overfitting. The experimental results show that the proposed scheme improves the recognition rate and generalizes better.
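The paper is represented here only by its abstract, so the sketch below is purely illustrative: a minimal PyTorch-style CNN for single-channel CT slices that exposes the knobs the abstract mentions (activation function, pooling parameters, weight initialization, and enhancement/augmentation preprocessing used to curb overfitting). The network depth, layer sizes, class count, and augmentation choices are assumptions for illustration, not the authors' configuration.

# Minimal sketch (not the authors' implementation): a small CNN for
# single-channel CT slices, exposing the design knobs discussed in the
# abstract: activation function, pooling parameters, weight initialization,
# and enhancement/augmentation preprocessing to reduce overfitting.
# Layer sizes, class count, and augmentation choices are assumptions.
import torch
import torch.nn as nn
from torchvision import transforms

class SmallCTNet(nn.Module):
    def __init__(self, num_classes: int = 2, activation: str = "relu"):
        super().__init__()
        # Activation function under study (ReLU vs. LeakyReLU, for example).
        act = nn.ReLU() if activation == "relu" else nn.LeakyReLU(0.1)
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), act,
            nn.MaxPool2d(kernel_size=2),          # pooling parameter under study
            nn.Conv2d(16, 32, kernel_size=3, padding=1), act,
            nn.MaxPool2d(kernel_size=2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Dropout(0.5),                      # a common overfitting control
            nn.Linear(32 * 56 * 56, num_classes), # assumes 224x224 inputs
        )
        # Initialization weighting: He (Kaiming) initialization for conv layers.
        for m in self.modules():
            if isinstance(m, nn.Conv2d):
                nn.init.kaiming_normal_(m.weight, nonlinearity="relu")

    def forward(self, x):
        return self.classifier(self.features(x))

# Enhancement/augmentation applied before training (assumed choices).
train_transform = transforms.Compose([
    transforms.Grayscale(num_output_channels=1),
    transforms.Resize((224, 224)),
    transforms.RandomHorizontalFlip(),
    transforms.RandomRotation(10),
    transforms.ToTensor(),
])

model = SmallCTNet(num_classes=2, activation="relu")
dummy = torch.randn(4, 1, 224, 224)   # batch of 4 synthetic CT slices
print(model(dummy).shape)             # -> torch.Size([4, 2])

Running the snippet only verifies that the tensor shapes are consistent; in practice such a model would be trained on the CT data set described in the paper, and the activation, pooling, and initialization settings would be varied as the abstract outlines.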

Corresponding authors: Guangsong Yang and Chih-Cheng Chen


This work is licensed under a Creative Commons Attribution 4.0 International License.

Cite this article
Yuqing He, Lei Lei, Guangsong Yang, Chih-Cheng Chen, Christopher Chun Ki Chan, and Kuei-Kuei Lai, Computed Tomography Image Recognition with Convolutional Neural Network Using Wearable Sensors, Sens. Mater., Vol. 32, No. 10, 2020, pp. 3517-3530.



Forthcoming Special Issues

Special Issue on Geomatics Technologies for the Realization of Smart Cities (1)
Guest editors: He Huang and XiangLei Liu (Beijing University of Civil Engineering and Architecture)


Special Issue on International Conference on BioSensors, BioElectronics, BioMedical Devices, BioMEMS/NEMS and Applications 2019 (Bio4Apps 2019) (1)
Guest editors: Hirofumi Nogami and Masaya Miyazaki (Kyushu University)


Special Issue on Optical Sensors: Novel Materials, Approaches, and Applications
Guest editor: Yap Wing Fen (Universiti Putra Malaysia)


Special Issue on Intelligent Sensing Control Analysis, Optimization, and Automation (2)
Guest editor: Cheng-Chi Wang (National Chin-Yi University of Technology)


Special Issue on Geomatics Technologies for the Realization of Smart Cities (2)
Guest editors: He Huang and XiangLei Liu (Beijing University of Civil Engineering and Architecture)


Special Issue on Materials, Devices, Circuits, and Analytical Methods for Various Sensors (4)
Guest editors: Chien-Jung Huang (National University of Kaohsiung), Cheng-Hsing Hsu (National United University), Ja-Hao Chen (Feng Chia University), and Wei-Ling Hsu (Huaiyin Normal University)


Copyright (C) MYU K.K. All Rights Reserved.