A Big Data Framework for Quality Assurance and Validation
S. Nachiyappan1, Justus S2
1Nachiyappan S, Assistant Prof (Sr.), SCSE, VIT University, Chennai.
2Justus S, Associate Professor, SCSE, VIT University, Chennai.
Manuscript received on 03 March 2019 | Revised Manuscript received on 09 March 2019 | Manuscript published on 30 July 2019 | PP: 2490-2494 | Volume-8 Issue-2, July 2019 | Retrieval Number: A1912058119/19©BEIESP | DOI: 10.35940/ijrte.B1912.078219
© The Authors. Blue Eyes Intelligence Engineering and Sciences Publication (BEIESP). This is an open access article under the CC-BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/)
Abstract: Big data refers to datasets so large that value can be extracted only through dedicated capture and analysis processes. Such data poses many challenges arising from its characteristic features: volume, velocity, variety, value, complexity and performance. Many organizations struggle to devise test strategies for validating structured and unstructured data, to establish a proper testing environment, to work with non-relational databases and to maintain functional testing. These challenges lead to low-quality data in production, delays in execution and increased cost. MapReduce provides a parallel and scalable programming model for data-intensive business and scientific applications. The performance of a big data application is characterized by its response time, maximum online user data capacity and a defined maximum processing capacity. The proposed work tests health care big data, which contains text, image, audio and video files. The big data documents are tested using two concepts: big data preprocessing testing and post-processing testing. Data is classified from an unstructured format into a structured format using the SVM algorithm. Preprocessing testing checks all the data to ensure data accuracy and includes file size testing, file extension testing and de-duplication testing. In post-processing, the MapReduce concept is implemented so that the data can be fetched easily.
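As an illustration of the preprocessing testing stage described above, the following Python sketch shows how file size testing, file extension testing and de-duplication testing could be applied to incoming health care files. The allowed extensions, the size limit and the function names are illustrative assumptions, not the authors' implementation.

```python
import hashlib
import os

# Assumed set of supported health care file types (text, image, audio, video)
# and an assumed per-file size cap; both are hypothetical parameters.
ALLOWED_EXTENSIONS = {".txt", ".jpg", ".png", ".wav", ".mp4"}
MAX_FILE_SIZE_BYTES = 128 * 1024 * 1024  # assumed 128 MB cap per file


def file_checksum(path, chunk_size=8192):
    """Compute an MD5 digest of the file content for duplicate detection."""
    digest = hashlib.md5()
    with open(path, "rb") as handle:
        for chunk in iter(lambda: handle.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


def preprocess_validate(paths):
    """Return the files that pass size, extension and de-duplication checks."""
    seen_checksums = set()
    accepted = []
    for path in paths:
        # File extension testing: reject unsupported formats.
        if os.path.splitext(path)[1].lower() not in ALLOWED_EXTENSIONS:
            continue
        # File size testing: reject empty or oversized files.
        size = os.path.getsize(path)
        if size == 0 or size > MAX_FILE_SIZE_BYTES:
            continue
        # De-duplication testing: skip files whose content was already seen.
        checksum = file_checksum(path)
        if checksum in seen_checksums:
            continue
        seen_checksums.add(checksum)
        accepted.append(path)
    return accepted
```

Only files returned by a check of this kind would proceed to SVM-based classification and MapReduce post-processing; the chunked checksum keeps memory use bounded even for large audio or video files.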
Index Terms: Preprocessing, MapReduce in Post-Processing, Structured Data Using SVM.
Scope of the Article: Big Data