
Big Data Analytics for Images in Public Cloud using Map Reduce on Local Clusters
Buvaneswari.V.B1, S.Shanthi2, M.Pyingkodi3
1Buvaneswari.V.B, Assistant Professor, P.G. and Research Department of Computer Science, Government Arts College, Coimbatore, Tamil Nadu, India.
2Dr. S.Shanthi, Department of Computer Applications, Kongu Engineering College, Erode, India.
3M.Pyingkodi, Department of Computer Applications, Kongu Engineering College, Erode, India.

Manuscript received on November 20, 2019. | Revised Manuscript received on November 28, 2019. | Manuscript published on 30 November, 2019. | PP: 7384-7390 | Volume-8 Issue-4, November 2019. | Retrieval Number: D5303118419/2019©BEIESP | DOI: 10.35940/ijrte.D5303.118419

© The Authors. Blue Eyes Intelligence Engineering and Sciences Publication (BEIESP). This is an open access article under the CC-BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/)

Abstract: MapReduce is a programming model for parallel computation over big data in the public cloud. Big data is characterized by variety, velocity, and volume. This work implements MapReduce in MATLAB, a powerful tool for image processing and numeric computation. Unstructured image data stored in the public cloud service Dropbox is treated as big data, and a MapReduce algorithm is applied to map and reduce all of the images stored there. The aim is to retrieve the images in the public cloud with the maximum red, green, and blue content, as well as the colors that intersect between them. The same code is then modified to find all red, green, and blue values at once, which supports more parallelism and improves the speed of MapReduce by eliminating the dependency between iterations. The speed of parallel MapReduce shows considerable improvement only with increased file size and an appropriate coding style. Parallel MapReduce computation is carried out on a local cluster with the default number of workers, with three workers, and with four workers under a scale-up architecture. The model is developed in MATLAB and can be implemented in Hadoop as well.
Keywords: MapReduce, Big Data, Parallel Computing, Cloud, Image Processing, Cluster.
Scope of the Article: Computing, Cloud.
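To make the workflow described in the abstract concrete, the following is a minimal MATLAB sketch of a mapper/reducer pipeline that finds, for each colour channel, the image with the highest mean channel intensity, run on a local parallel pool. It is an illustration under stated assumptions, not the paper's actual code: the folder path, the worker count, and the function names rgbMapper and maxReducer are placeholders, and it assumes the Dropbox images have been synced to a local folder and that the Image Processing Toolbox and Parallel Computing Toolbox are available.

% Minimal sketch: per-channel maximum mean intensity via MATLAB mapreduce.
% Assumptions (not from the paper): images synced from Dropbox to a local
% folder; path, worker count, and function names are placeholders.

pool = parpool('local', 3);                 % three workers; the default or four can also be used
mapreducer(pool);                           % run mapreduce on the parallel pool

imds = imageDatastore('~/Dropbox/images', ...
    'FileExtensions', {'.jpg', '.png'});    % unstructured image data from the synced cloud folder

result = mapreduce(imds, @rgbMapper, @maxReducer);
readall(result)                             % one key-value pair per colour channel

delete(pool);

function rgbMapper(data, info, intermKVStore)
    % data is one image read by the datastore; info.Filename identifies it.
    if size(data, 3) < 3
        return;                             % skip grayscale images in this sketch
    end
    img = im2double(data);
    channels = {'Red', 'Green', 'Blue'};
    for c = 1:3
        add(intermKVStore, channels{c}, ...
            struct('File', info.Filename, 'Mean', mean2(img(:, :, c))));
    end
end

function maxReducer(intermKey, intermValIter, outKVStore)
    % Keep, for this colour channel, the image with the largest mean value.
    best = struct('File', '', 'Mean', -Inf);
    while hasnext(intermValIter)
        v = getnext(intermValIter);
        if v.Mean > best.Mean
            best = v;
        end
    end
    add(outKVStore, intermKey, best);
end

In practice the mapper and reducer would be saved as separate .m files on the MATLAB path so the pool workers can resolve them, and re-running the job with the default, three, and four workers reproduces the scale-up comparison described in the abstract.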