S&M Young Researcher Paper Award 2020
Recipients: Ding Jiao, Zao Ni, Jiachou Wang, and Xinxin Li
Paper: High Fill Factor Array of Piezoelectric Micromachined Ultrasonic Transducers with Large Quality Factor

S&M Young Researcher Paper Award 2021
Award Criteria
Notice of retraction: Vol. 32, No. 8(2), S&M2292

Print: ISSN 0914-4935
Online: ISSN 2435-0869
Sensors and Materials is an international peer-reviewed open access journal providing a forum for researchers working in multidisciplinary fields of sensing technology. Sensors and Materials is covered by Science Citation Index Expanded (Clarivate Analytics), Scopus (Elsevier), and other databases.

Instructions to authors
Instructions for manuscript preparation


 Sensors and Materials
 1-23-3-303 Sendagi,
 Bunkyo-ku, Tokyo 113-0022, Japan
 Tel: 81-3-3827-8549
 Fax: 81-3-3827-8547

MYU Research, a scientific publisher, seeks a native English-speaking proofreader with a scientific background. B.Sc. or higher degree is desirable. In-office position; work hours negotiable. Call 03-3827-8549 for further information.


Published in advance: November 19, 2021

Lane Line Detection Based on Improved Semantic Segmentation in Complex Road Environment

Chaowei Ma, Dean Luo, and He Huang

(Received July 16, 2021; Accepted October 27, 2021)

Keywords: smart city, autonomous driving, complex road environment, lane line detection, semantic segmentation

With the emergence of the concepts of the smart city and smart travel and the rapid development of modern sensors, artificial intelligence, and other modern technologies, autonomous driving, which can effectively relieve road congestion and ensure driving safety, has become a main direction of future industry development. Accurate lane line detection is a fundamental technology for realizing autonomous driving. However, in actual road environments, lane lines are often detected with low accuracy because of various factors, including changes in light intensity and lane line occlusion, which greatly affect the safety of autonomous driving. To address these challenges, in this study, we propose a lane line detection model based on improved semantic segmentation for complex road scenarios, such as lane line occlusion, mutilation, and shadowing. The proposed Visual Geometry Group–Special Convolutional Neural Network (VGG-SS), which is based on the VGG-16 network, introduces a self-attention distillation model and a spatial convolutional neural network (SCNN) model. Empirical results show that the proposed model outperforms current semantic segmentation models, achieving better detection results and a higher F1 value of 82.6 in complex road scenarios. These results prove that the proposed method can effectively improve the detection accuracy of lane lines.

Corresponding author: He Huang
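The SCNN component cited in the abstract passes information sequentially between adjacent rows (or columns) of a feature map, which helps recover lane lines that are occluded or broken in parts of the image. A minimal NumPy sketch of that downward slice-by-slice message passing is shown below; the function name, the fixed kernel, and the residual weight `alpha` are illustrative assumptions, not details from the paper (a real SCNN learns the kernel and operates on multi-channel CNN features):

```python
import numpy as np

def scnn_downward_pass(feature, kernel, alpha=1.0):
    """SCNN-style sequential message passing over a 2-D feature map.

    Each row receives a message from the row directly above it: the
    previous (already updated) row is convolved with a 1-D kernel,
    passed through a ReLU, and added residually to the current row.
    """
    h, w = feature.shape
    out = feature.copy()
    k = len(kernel)
    pad = k // 2
    for i in range(1, h):  # top-to-bottom propagation
        prev = np.pad(out[i - 1], pad, mode="edge")
        # 1-D convolution of the previous row along the width axis
        conv = np.array([np.dot(prev[j:j + k], kernel) for j in range(w)])
        out[i] += alpha * np.maximum(conv, 0.0)  # ReLU, then residual add
    return out

# A single activation in the top row (e.g., a visible lane fragment)
feat = np.zeros((4, 5))
feat[0, 2] = 1.0
msg = scnn_downward_pass(feat, kernel=np.array([0.25, 0.5, 0.25]))
```

After the pass, rows below the activation acquire nonzero responses, illustrating how evidence from visible lane segments can flow into occluded regions; SCNN applies such passes in all four directions (downward, upward, rightward, leftward).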

Forthcoming Regular Issues

Forthcoming Special Issues

Special Issue on Advanced Methods and Devices for Remote Sensing
Guest editors: Lei Deng and FuZhou Duan (Capital Normal University, Beijing)

Special Issue on Microfluidics and Related Nano/Microengineering for Medical and Chemical Applications
Guest editor: Yuichi Utsumi (University of Hyogo)
Call for papers

Special Issue on International Conference on BioSensors, BioElectronics, BioMedical Devices, BioMEMS/NEMS and Applications 2019 (Bio4Apps 2019) (2)
Guest editors: Hirofumi Nogami and Masaya Miyazaki (Kyushu University)
Conference website

Special Issue on Biological Odor Sensing Systems and Their Applications
Guest editor: Takeshi Sakurai (Tokyo University of Agriculture)

Special Issue on High-sensitivity Sensors and Sensors for Difficult-to-measure Objects
Guest editor: Ki Ando (Chiba Institute of Technology)
Call for papers

Special Issue on Sensing Technologies and Their Applications (II)
Guest editor: Rey-Chue Hwang (I-Shou University)
Call for papers

Copyright (C) MYU K.K. All Rights Reserved.