Journal of Engineering and Applied Sciences

Year: 2018
Volume: 13
Issue: 11 SI
Page No. 8670 - 8674

A Study on Channel Expansion Structure for Reducing Model Size and Speeding Up of Classifier Using Inverted Residual Block

Authors: Seong-Kyun Han and Soon-Chul Kwon

Abstract: In this study, we propose a structure for reducing model size and speeding up a classifier using the inverted residual block. Model size reduction is one of the main techniques for enabling convolutional neural network computation on embedded systems. To obtain a classifier structure that is small and fast, we compare and analyze experimental results for the channel expansion parameter of the inverted residual block proposed in MobileNetV2. Experiments were conducted on the Cifar-10 dataset for training and testing. Compared with the MobileNetV2 baseline, the proposed structure achieved a 60% reduction in model size and a 50% reduction in inference time at the cost of a 1.7% drop in accuracy.
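The role of the channel expansion parameter can be illustrated with a parameter-count sketch of a standard MobileNetV2 inverted residual block (1x1 expansion, 3x3 depthwise convolution, 1x1 projection). This is a minimal illustration of the generic block, not the authors' exact variant; batch-norm and bias parameters are omitted for simplicity.

```python
def inverted_residual_params(c_in: int, c_out: int, t: int, k: int = 3) -> int:
    """Weight count of a MobileNetV2-style inverted residual block.

    c_in, c_out: input/output channels; t: channel expansion factor;
    k: depthwise kernel size. Biases and batch-norm params are ignored.
    """
    expanded = t * c_in
    expand = c_in * expanded        # 1x1 pointwise expansion
    depthwise = k * k * expanded    # 3x3 depthwise convolution
    project = expanded * c_out      # 1x1 pointwise projection
    return expand + depthwise + project

# Halving the expansion factor roughly halves the block's weights,
# which is the kind of size/accuracy trade-off the study explores.
print(inverted_residual_params(32, 32, t=6))  # 14016
print(inverted_residual_params(32, 32, t=3))  # 7008
```

Because all three terms scale linearly with t, shrinking the expansion factor reduces the block's parameters and multiply-adds almost proportionally, at some cost in representational capacity.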

How to cite this article:

Seong-Kyun Han and Soon-Chul Kwon, 2018. A Study on Channel Expansion Structure for Reducing Model Size and Speeding Up of Classifier Using Inverted Residual Block. Journal of Engineering and Applied Sciences, 13: 8670-8674.
