TY - JOUR
T1 - MNIST Classification using Deep Learning
AU - Khalaf, Majid Hameed
AU - Al-Khateeb, Belal
AU - Farhan, Rabah Nory
JO - Asian Journal of Information Technology
VL - 16
IS - 2
SP - 268
EP - 273
PY - 2017
DA - 2001/08/19
SN - 1682-3915
DO - ajit.2017.268.273
UR - https://makhillpublications.co/view-article.php?doi=ajit.2017.268.273
KW - Restricted Boltzmann Machine
KW - deep belief networks
KW - contrastive divergence
KW - backpropagation neural networks
KW - MNIST
AB - Lately, deep learning has seen enormous use in computer vision and classification applications. In this study, a deep architecture is implemented in order to compare two classification architectures: a conventional neural network with a single hidden layer and a deep learning model, the Deep Belief Network (DBN), which consists of many layers. Both architectures are applied to the images of the MNIST digit dataset for classification. The conventional digit-recognition network was trained with supervised learning using the backpropagation algorithm, while the DBN was trained in two stages, one unsupervised and the other supervised. In the unsupervised stage we used the contrastive divergence algorithm, and in the supervised stage we used backpropagation to fine-tune the network. The features used to train the networks are the pixels of the image representing each digit, based on pixel intensity, with white pixels represented as 0's and black pixels as 1's. A DBN is built from many layers, each of which is a Restricted Boltzmann Machine (RBM), stacked in sequence. The learning of a DBN consists of two steps, a pre-training step and a fine-tuning step. The DBN gave a higher performance than the conventional neural network, with an accuracy of approximately 98.58% for classification of handwritten digits from the MNIST dataset.
ER -