Notice of retraction
Vol. 32, No. 8(2), S&M2292

ISSN (print) 0914-4935
ISSN (online) 2435-0869
Sensors and Materials
is an international peer-reviewed open access journal that provides a forum for researchers working in multidisciplinary fields of sensing technology.
Sensors and Materials
is covered by Science Citation Index Expanded (Clarivate Analytics), Scopus (Elsevier), and other databases.

Publisher
 MYU K.K.
 Sensors and Materials
 1-23-3-303 Sendagi,
 Bunkyo-ku, Tokyo 113-0022, Japan
 Tel: 81-3-3827-8549
 Fax: 81-3-3827-8547

Copyright(C) MYU K.K.

Robust Recognition of Chinese Text from Cellphone-acquired Low-quality Identity Card Images Using Convolutional Recurrent Neural Network

Jianmei Wang, Ruize Wu, and Shaoming Zhang

(Received July 23, 2020; Accepted January 6, 2021)

Keywords: Chinese text recognition, synthetic data, convolutional recurrent neural network, conditional generative adversarial network, DenseNet

Automatic reading of text from an identity (ID) card image has a wide range of social uses. In this paper, we propose a novel method for Chinese text recognition from ID card images taken by cellphone cameras. The paper has two main contributions: (1) A synthetic data engine based on a conditional generative adversarial network is designed to generate millions of synthetic ID card text line images, which not only retain the inherent template pattern of ID card images but also preserve the diversity of the synthetic data. (2) An improved convolutional recurrent neural network (CRNN) is presented to increase Chinese text recognition accuracy, in which a DenseNet replaces the VGGNet architecture to extract more sophisticated spatial features. The proposed method is evaluated on more than 7000 real ID card text line images. The experimental results demonstrate that the improved CRNN model, trained only on the synthetic dataset, can increase the recognition accuracy of Chinese text in cellphone-acquired low-quality images. Specifically, compared with the original CRNN, the average character recognition accuracy is increased from 96.87% to 98.57% and the line recognition accuracy is increased from 65.92% to 90.10%.
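A CRNN of the kind described above (convolutional feature extractor, recurrent layers, and CTC transcription) ultimately maps per-frame label probabilities to a character sequence by collapsing repeated labels and removing the CTC blank symbol. That final decoding step can be sketched in plain Python as follows; the function name and blank index are illustrative assumptions, not the authors' code:

```python
# Greedy CTC decoding, as commonly used in CRNN-style text recognizers:
# take the best label per time step, merge consecutive repeats,
# then drop the blank symbol. A minimal sketch, not the paper's code.

BLANK = 0  # index conventionally reserved for the CTC blank label


def ctc_greedy_decode(frame_labels):
    """Collapse per-frame argmax labels into an output label sequence.

    frame_labels: list of int, the most likely class at each time step.
    Returns the sequence with consecutive repeats merged and blanks removed.
    """
    decoded = []
    prev = None
    for label in frame_labels:
        # A label is emitted only when it differs from the previous frame
        # (repeat merging) and is not the blank symbol.
        if label != prev and label != BLANK:
            decoded.append(label)
        prev = label
    return decoded


# e.g. per-frame labels [5, 5, 0, 5, 3, 3, 0, 0, 7] decode to [5, 5, 3, 7]:
# the blank between the two 5s lets the same character appear twice.
```

Note how the blank label lets genuinely repeated characters survive decoding: two identical labels separated by a blank are emitted twice, while an unbroken run is emitted once.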

Corresponding author: Shaoming Zhang




Forthcoming Special Issues

Special Issue on Novel Materials and Sensing Technologies on Electronic and Mechanical Devices (2)-1
Guest editors: Teen-Hang Meen (National Formosa University), Wenbing Zhao (Cleveland State University), and Hsien-Wei Tseng (Longyan University)


Special Issue on Materials, Devices, Circuits, and Analytical Methods for Various Sensors (4)
Guest editors: Chien-Jung Huang (National University of Kaohsiung), Cheng-Hsing Hsu (National United University), Ja-Hao Chen (Feng Chia University), and Wei-Ling Hsu (Huaiyin Normal University)


Special Issue on Novel Materials and Sensing Technologies on Electronic and Mechanical Devices (2)-2
Guest editors: Teen-Hang Meen (National Formosa University), Wenbing Zhao (Cleveland State University), and Hsien-Wei Tseng (Longyan University)


Special Issue on New Trends in Robots and Their Applications
Guest editor: Ikuo Yamamoto (Nagasaki University)


Special Issue on Artificial Intelligence in Sensing Technologies and Systems
Guest editor: Prof. Lin Lin (Dalian University of Technology)


Special Issue on Novel Materials and Sensing Technologies on Electronic and Mechanical Devices (3)
Guest editors: Teen-Hang Meen (National Formosa University), Wenbing Zhao (Cleveland State University), and Hsien-Wei Tseng (Longyan University)


Copyright(C) MYU K.K. All Rights Reserved.