 
 
Journal of Engineering and Applied Sciences
Year: 2018 | Volume: 13 | Issue: 11 SI | Page No.: 8670-8674
DOI: 10.36478/jeasci.2018.8670.8674  
A Study on Channel Expansion Structure for Reducing Model Size and Speeding Up of Classifier Using Inverted Residual Block
Seong-Kyun Han and Soon-Chul Kwon
 
Abstract: In this study, we propose a structure for reducing the model size and speeding up a classifier that uses inverted residual blocks. Reducing model size is one of the main techniques for performing convolutional neural network computation in embedded systems. To obtain a classifier structure that is small and fast, we compare and analyze experimental results for the channel expansion parameter of the inverted residual block proposed in MobileNetV2. Experiments were conducted on the CIFAR-10 dataset for training and testing; compared with the MobileNetV2 method, the proposed structure achieved a 60% reduction in model size and a 50% reduction in inference time at the cost of a 1.7% reduction in accuracy.
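The block the abstract refers to is the MobileNetV2 inverted residual block, in which a 1x1 "expansion" convolution widens the channels by a factor t before a depthwise 3x3 convolution and a linear 1x1 projection. Below is a minimal illustrative sketch (not the authors' implementation) written in PyTorch-style Python; the class name and the choice of t values are assumptions for illustration, showing how shrinking the channel expansion parameter t reduces the parameter count of a single block.

import torch
import torch.nn as nn

class InvertedResidual(nn.Module):
    """MobileNetV2-style inverted residual block (illustrative sketch).

    The expansion factor `t` controls how many channels the 1x1 "expand"
    convolution produces before the depthwise 3x3 convolution; reducing `t`
    shrinks both the parameter count and the compute cost of the block.
    """
    def __init__(self, in_ch, out_ch, stride=1, t=6):
        super().__init__()
        hidden = in_ch * t
        self.use_residual = (stride == 1 and in_ch == out_ch)
        layers = []
        if t != 1:
            # 1x1 pointwise expansion
            layers += [nn.Conv2d(in_ch, hidden, 1, bias=False),
                       nn.BatchNorm2d(hidden),
                       nn.ReLU6(inplace=True)]
        layers += [
            # 3x3 depthwise convolution
            nn.Conv2d(hidden, hidden, 3, stride, 1, groups=hidden, bias=False),
            nn.BatchNorm2d(hidden),
            nn.ReLU6(inplace=True),
            # 1x1 pointwise linear projection (no activation)
            nn.Conv2d(hidden, out_ch, 1, bias=False),
            nn.BatchNorm2d(out_ch),
        ]
        self.block = nn.Sequential(*layers)

    def forward(self, x):
        out = self.block(x)
        return x + out if self.use_residual else out

# Example: compare parameter counts at t=6 (MobileNetV2 default)
# versus a smaller, hypothetical expansion factor t=2.
for t in (6, 2):
    blk = InvertedResidual(32, 32, stride=1, t=t)
    n_params = sum(p.numel() for p in blk.parameters())
    print(f"t={t}: {n_params} parameters")

Because the expansion convolution dominates the block's parameters, lowering t is a direct lever on model size and inference time, which is the trade-off the study quantifies against classification accuracy.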
 
How to cite this article:
Seong-Kyun Han and Soon-Chul Kwon, 2018. A Study on Channel Expansion Structure for Reducing Model Size and Speeding Up of Classifier Using Inverted Residual Block. Journal of Engineering and Applied Sciences, 13: 8670-8674.
DOI: 10.36478/jeasci.2018.8670.8674
URL: http://medwelljournals.com/abstract/?doi=jeasci.2018.8670.8674