
Handwritten Digit Recognition Using Different Dimensionality Reduction Techniques
Ruksar Sheikh1, Mayank Patel2 

1Ruksar Sheikh, Dept. of Computer Science Engineering, Geetanjali Institute of Technical Studies, Udaipur, India.
2Mayank Patel, Dept. of Computer Science Engineering, Geetanjali Institute of Technical Studies, Udaipur, India.

Manuscript received on 08 March 2019 | Revised Manuscript received on 14 March 2019 | Manuscript published on 30 July 2019 | PP: 999-1002 | Volume-8 Issue-2, July 2019 | Retrieval Number: B1798078219/19©BEIESP | DOI: 10.35940/ijrte.B1798.078219
© The Authors. Blue Eyes Intelligence Engineering and Sciences Publication (BEIESP). This is an open access article under the CC-BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/)

Abstract: The objective of this paper is to introduce two popular linear dimensionality reduction techniques, Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA). PCA reduces the size of the data while conserving maximum variance in new variables called principal components, whereas LDA minimizes within-class distance and maximizes the separation between classes. PCA finds the axes of maximum variance, while LDA finds the axes of best class separability. These methods are experimented on the MNIST handwritten digit dataset. Our conclusion shows that PCA can outperform LDA when the training dataset is small, recalling values with lower computational complexity. The linear techniques presented in this paper provide a clear understanding of the methods in a comparative manner.
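The PCA/LDA comparison described in the abstract can be reproduced in outline with standard tools. The following is a minimal sketch in Python, assuming scikit-learn is available; it uses scikit-learn's small 8x8 digits set as a stand-in for the full 28x28 MNIST images and a logistic regression classifier as a hypothetical downstream model, so it illustrates the idea rather than the authors' exact experimental pipeline.

# Minimal sketch: PCA vs. LDA as preprocessing for digit classification.
# Assumes scikit-learn; illustrative only, not the paper's exact setup.
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import recall_score

# Load an 8x8 digit dataset (a small stand-in for MNIST).
X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

# PCA: unsupervised, keeps the directions of maximum variance.
pca = PCA(n_components=30).fit(X_train)
# LDA: supervised, keeps at most (n_classes - 1) = 9 discriminant axes.
lda = LinearDiscriminantAnalysis(n_components=9).fit(X_train, y_train)

for name, reducer in [("PCA", pca), ("LDA", lda)]:
    clf = LogisticRegression(max_iter=1000).fit(
        reducer.transform(X_train), y_train)
    y_pred = clf.predict(reducer.transform(X_test))
    print(name, "macro recall:",
          round(recall_score(y_test, y_pred, average="macro"), 3))

Note that LDA can retain at most (number of classes - 1) components, which for the ten digit classes limits it to nine dimensions, while PCA is free to keep any number of principal components up to the original feature dimension.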
Keywords: PCA, LDA, Dimensionality Reduction, MNIST Handwritten Digit Dataset, Covariance

Scope of the Article: Pattern Recognition