Abstract: Recently, deep learning has seen enormous use in computer vision and classification applications. In this study, a deep architecture is implemented in order to compare two classification approaches: a conventional neural network with a single hidden layer, and a deep learning model, the Deep Belief Network (DBN), which consists of many layers. Both architectures are applied to the images of the MNIST digit dataset for classification. The conventional digit-recognition network was trained by supervised learning using the backpropagation algorithm, while the DBN was trained in two stages, one unsupervised and the other supervised. The unsupervised stage used the contrastive divergence algorithm, and the supervised stage used backpropagation to fine-tune the network. The features used to train the networks are the pixels of the image representing each digit, based on pixel intensity, with white pixels represented as 0s and black pixels as 1s. A DBN is built as a stack of layers, each layer being a Restricted Boltzmann Machine (RBM), trained in sequence. The learning of a DBN therefore consists of two steps: a pre-training step and a fine-tuning step. The DBN gave higher performance than the conventional neural network, with an accuracy of approximately 98.58% for classification of the handwritten digits of the MNIST dataset.
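The unsupervised pre-training stage described above can be sketched at the level of a single RBM layer. The following is a minimal NumPy illustration of one-step contrastive divergence (CD-1) on binarized pixel vectors; the layer sizes, learning rate, threshold, and toy data are assumptions for illustration, not the paper's actual settings.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def binarize(images, threshold=0.5):
    # As in the abstract: light (white) pixels become 0, dark (black) pixels become 1.
    return (images >= threshold).astype(float)

class RBM:
    """One Restricted Boltzmann Machine layer, pre-trained with CD-1."""
    def __init__(self, n_visible, n_hidden):
        self.W = rng.normal(0.0, 0.01, (n_visible, n_hidden))  # weights
        self.b = np.zeros(n_visible)  # visible biases
        self.c = np.zeros(n_hidden)   # hidden biases

    def hidden_probs(self, v):
        return sigmoid(v @ self.W + self.c)

    def visible_probs(self, h):
        return sigmoid(h @ self.W.T + self.b)

    def cd1_step(self, v0, lr=0.1):
        # Positive phase: hidden probabilities and a binary sample given the data.
        ph0 = self.hidden_probs(v0)
        h0 = (rng.random(ph0.shape) < ph0).astype(float)
        # Negative phase: one Gibbs step back to the visible layer and up again.
        pv1 = self.visible_probs(h0)
        ph1 = self.hidden_probs(pv1)
        # CD-1 gradient approximation, averaged over the batch.
        n = len(v0)
        self.W += lr * (v0.T @ ph0 - pv1.T @ ph1) / n
        self.b += lr * (v0 - pv1).mean(axis=0)
        self.c += lr * (ph0 - ph1).mean(axis=0)
        return float(np.mean((v0 - pv1) ** 2))  # reconstruction error

# Toy stand-in for MNIST: 64 random 28x28 "images" flattened to 784 pixels.
data = binarize(rng.random((64, 784)), threshold=0.7)
rbm = RBM(n_visible=784, n_hidden=128)
errors = [rbm.cd1_step(data) for _ in range(20)]
```

In a full DBN, the hidden activations of this trained RBM would serve as the visible data for the next RBM in the stack, and backpropagation would then fine-tune the whole stack with the digit labels.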
Majid Hameed Khalaf, Belal Al-Khateeb and Rabah Nory Farhan, 2017. MNIST Classification using Deep Learning. Asian Journal of Information Technology, 16: 268-273.