ISSN (print) 0914-4935
ISSN (online) 2435-0869
Sensors and Materials is an international peer-reviewed open access journal providing a forum for researchers working in multidisciplinary fields of sensing technology. It is covered by Science Citation Index Expanded (Clarivate Analytics), Scopus (Elsevier), and other databases.



 Sensors and Materials
 1-23-3-303 Sendagi,
 Bunkyo-ku, Tokyo 113-0022, Japan
 Tel: +81-3-3827-8549
 Fax: +81-3-3827-8547



Sensors and Materials, Volume 31, Number 10(3) (2019)
Copyright (C) MYU K.K.
pp. 3289-3302
S&M2011 Research Paper of Special Issue
Published: October 31, 2019

Automated Reconstruction of Railroad Rail Using Helicopter-borne Light Detection and Ranging in a Train Station

Wang-Gyu Jeon and Eui-Myoung Kim

(Received March 14, 2019; Accepted August 7, 2019)

Keywords: LiDAR, rail, detection, modelling, iterative RANSAC, automation

A coordinate-based 3D model of a railroad rail is essential for the maintenance of railway services. However, automated processing to reconstruct objects from light detection and ranging (LiDAR) data in areas where such facilities are installed in a complex manner remains a challenge. In this study, our objective is to develop a method for the automated reconstruction of rails from LiDAR data in a complex area. Unlike the running sections of a track, where one or two rails are present, many rails join or branch within a railway station. Three factors, namely, height difference, spatial relationship with other objects, and point density difference, were considered in detecting the rails in this complicated area. For tracking and modeling the rails, iterative random sample consensus (RANSAC) and singular value decomposition (SVD) algorithms were used. The results showed that about 90% of the rail tracks were extracted, with a precision of about 99%. All processes were fully automated, and the proposed method was designed to be applicable to various line-type facilities as well as rails.
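The tracking-and-modeling step combines RANSAC with SVD-based line fitting. The sketch below is only an illustration of that general pairing, not the authors' implementation: the function names, the two-point sampling strategy, the iteration count, and the distance threshold are all assumptions. It extracts a single dominant 3D line from a point cloud; an "iterative" variant, as the abstract suggests, would remove the inliers and repeat to recover multiple rails.

```python
import numpy as np

def fit_line_svd(points):
    """Fit a 3D line to points via SVD: returns (centroid, unit direction)."""
    centroid = points.mean(axis=0)
    # The principal right-singular vector of the centered cloud is the
    # least-squares direction of the line.
    _, _, vt = np.linalg.svd(points - centroid)
    return centroid, vt[0]

def point_line_distances(points, centroid, direction):
    """Perpendicular distance of each point from the line."""
    diff = points - centroid
    proj = np.outer(diff @ direction, direction)  # component along the line
    return np.linalg.norm(diff - proj, axis=1)    # length of the rejection

def ransac_line(points, n_iter=200, threshold=0.05, rng=None):
    """RANSAC: fit lines to random 2-point samples, keep the best consensus set."""
    rng = np.random.default_rng(rng)
    best_inliers = np.zeros(len(points), dtype=bool)
    for _ in range(n_iter):
        sample = points[rng.choice(len(points), size=2, replace=False)]
        c, d = fit_line_svd(sample)
        inliers = point_line_distances(points, c, d) < threshold
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # Refine the model with an SVD fit over all consensus points.
    return fit_line_svd(points[best_inliers]), best_inliers
```

To extract several rails, one would call `ransac_line`, delete the returned inliers from the cloud, and repeat until too few consensus points remain; the threshold would be tuned to the point density and noise level of the LiDAR data.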

Corresponding author: Eui-Myoung Kim

This work is licensed under a Creative Commons Attribution 4.0 International License.

Cite this article
Wang-Gyu Jeon and Eui-Myoung Kim, Automated Reconstruction of Railroad Rail Using Helicopter-borne Light Detection and Ranging in a Train Station, Sens. Mater., Vol. 31, No. 10, 2019, pp. 3289-3302.

Forthcoming Regular Issues

Forthcoming Special Issues

Special Issue on Advanced Materials on Electronic and Mechanical Devices and their Application on Sensors (5)
Guest editors: Teen-Hang Meen (National Formosa University), Wenbing Zhao (Cleveland State University), and Cheng-Fu Yang (National University of Kaohsiung)

Special Issue on Advances in Shape Memory Materials
Guest editors: Ryosuke Matsui (Aichi Institute of Technology) and Hiroyuki Miki (Tohoku University)

Special Issue on Perceptual Deep Learning in Computer Vision and its Application
Guest editor: Chih-Hsien Hsia (National Ilan University)

Special Issue on Materials, Devices, Circuits, and Analytical Methods for Various Sensors (3)
Guest editors: Chien-Jung Huang (National University of Kaohsiung), Cheng-Hsing Hsu (National United University), Ja-Hao Chen (Feng Chia University), and Wei-Ling Hsu (Huaiyin Normal University)

Special Issue on Sensing Technologies and Their Applications (1)
Guest editor: Rey-Chue Hwang (I-Shou University)
Call for papers

Special Issue on New Trends in Smart Sensor Systems
Guest editor: Takahiro Hayashi (Kansai University)
Call for papers
