Notice of retraction
Vol. 32, No. 8(2), S&M2292

Print: ISSN 0914-4935
Online: ISSN 2435-0869
Sensors and Materials is an international peer-reviewed open-access journal that provides a forum for researchers working in multidisciplinary fields of sensing technology.
Sensors and Materials is covered by Science Citation Index Expanded (Clarivate Analytics), Scopus (Elsevier), and other databases.


Publisher
 MYU K.K.
 Sensors and Materials
 1-23-3-303 Sendagi,
 Bunkyo-ku, Tokyo 113-0022, Japan
 Tel: +81-3-3827-8549
 Fax: +81-3-3827-8547


Copyright(C) MYU K.K.

Rapid Local Image Style Transfer Method Based on Residual Convolutional Neural Network

Liming Huang, Ping Wang, and Cheng-Fu Yang

(Received October 21, 2020; Accepted February 2, 2021)

Keywords: image style transfer, residual neural network, semantic segmentation, DeepLab2, convolutional neural network

Image style transfer technology can learn the style of a target image in a fully or semi-automated way, a task that is often very difficult to accomplish manually, thereby saving considerable time and improving production efficiency. With the rapid spread of commercial applications such as beauty selfie apps and short-video platforms such as TikTok, local image style transfer and the speed at which stylized images are generated are becoming increasingly important, particularly because users especially value these features in recreational products. We propose an algorithm that combines semantic segmentation with residual networks and uses VGG16 for feature extraction to improve both the quality of local image style transfer and its generation speed, and our experiments show that the proposed method compares favorably with other common methods. The investigated technology can be applied in many areas, such as the beauty cameras of smartphones, computer-generated imagery in advertisements and movies, computed tomography and nuclear magnetic resonance imaging for cancer diagnosis under harsh conditions, and virtual simulation in industrial design.
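The notice gives only a high-level description, and none of the authors' code is reproduced here. As a rough NumPy sketch of the general technique the abstract names (style represented by Gram matrices of feature maps such as those from VGG16 conv layers, with a semantic segmentation mask restricting the transfer to a local region), under assumed shapes and with hypothetical function names:

```python
import numpy as np

def gram_matrix(features):
    """Gram matrix of a (C, H, W) feature map: the standard style representation."""
    c, h, w = features.shape
    f = features.reshape(c, h * w)
    return (f @ f.T) / (c * h * w)

def local_style_loss(content_features, style_features, mask):
    """Style loss restricted to a segmented region.

    content_features, style_features: (C, H, W) feature maps (in the paper's
    setting these would come from VGG16 convolutional layers).
    mask: (H, W) binary segmentation mask selecting the region to stylize.
    """
    masked = content_features * mask[None, :, :]   # zero out pixels outside the region
    g_c = gram_matrix(masked)
    g_s = gram_matrix(style_features)
    return float(np.mean((g_c - g_s) ** 2))        # mean squared Gram difference

# Minimal usage with random feature maps: identical features under a
# full mask already match the target style, so the loss is zero.
feat = np.random.rand(8, 16, 16)
mask = np.ones((16, 16))
print(local_style_loss(feat, feat, mask))  # → 0.0
```

In a full implementation this loss would be summed over several feature layers and minimized (together with a content loss) by gradient descent on the output image, with the mask produced by a semantic segmentation model.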

Corresponding author: Cheng-Fu Yang






Forthcoming Special Issues

Special Issue on Micro-nano Biomedical Sensors, Devices, and Materials
Guest editors: Tetsuji Dohi (Chuo University) and Seiichi Takamatsu (The University of Tokyo)


Special Issue on Artificial Intelligence in Sensing Technologies and Systems
Guest editor: Prof. Lin Lin (Dalian University of Technology)


Special Issue on Novel Materials and Sensing Technologies on Electronic and Mechanical Devices Part 3
Guest editors: Teen-Hang Meen (National Formosa University), Wenbing Zhao (Cleveland State University), and Hsien-Wei Tseng (Yango University)


7th Special Issue on the Workshop of Next-generation Front-edge Optical Science Research
Guest editor: Takayuki Yanagida (Nara Institute of Science and Technology)


Special Issue on Sensing and Data Analysis Technologies for Living Environment, Health Care, Production Management and Engineering/Science Education Applications (Selected Papers from ICSEVEN 2020)
Guest editors: Chien-Jung Huang (National University of Kaohsiung), Rey-Chue Hwang (I-Shou University), Ja-Hao Chen (Feng Chia University), and Ba-Son Nguyen (Research Center for Applied Sciences)
Call for papers


Special Issue on Materials, Devices, Circuits, and Analytical Methods for Various Sensors (Selected Papers from ICSEVEN 2020)
Guest editors: Chien-Jung Huang (National University of Kaohsiung), Ja-Hao Chen (Feng Chia University), and Yu-Ju Lin (Tunghai University)
Call for papers

