
Research Trends on Deep Transformation Neural Models for Text Analysis in NLP Applications
T. Chellatamilan1, B. Valarmathi2, K. Santhi3

1T. Chellatamilan, Associate Professor, School of Information Technology and Engineering, Vellore Institute of Technology, Vellore, India.
2B. Valarmathi, Associate Professor, School of Information Technology and Engineering, Vellore Institute of Technology, Vellore, India.
3K. Santhi, Associate Professor, School of Computer Science and Engineering, Vellore Institute of Technology, Vellore, India.

Manuscript received on May 25, 2020. | Revised Manuscript received on June 29, 2020. | Manuscript published on July 30, 2020. | PP: 750-758 | Volume-9 Issue-2, July 2020. | Retrieval Number: B3838079220/2020©BEIESP | DOI: 10.35940/ijrte.B3838.079220
© The Authors. Blue Eyes Intelligence Engineering and Sciences Publication (BEIESP). This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/)

Abstract: In recent years, text analysis with neural models has become increasingly popular due to its versatile use in different software applications. To improve the performance of text analytics, a large collection of methods has been identified and justified by researchers. Most of these techniques have been used efficiently for text categorization, text generation, text summarization, query formulation, query answering, and sentiment analysis. In this review paper, we consolidate the recent literature along with a technical survey of different neural models, such as the Neural Language Model (NLM), the sequence-to-sequence (seq2seq) model, text generation, Bidirectional Encoder Representations from Transformers (BERT), the machine translation (MT) model, the transformation model, and the attention model, from the perspective of applying deep machine learning algorithms to text analysis. Extensive experiments were conducted on deep learning models such as the Recurrent Neural Network (RNN), Long Short-Term Memory (LSTM), Convolutional Neural Network (CNN), and the attentive transformation model to examine the efficacy of different neural models, with implementations using TensorFlow and Keras.
Keywords: BERT, RNN, CNN, language model, seq2seq, text summarization, text generation, text mining.
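
The abstract states that the experiments were implemented in TensorFlow and Keras but does not reproduce the code. As a minimal sketch of the kind of RNN/LSTM model such experiments typically use, the following Python snippet builds a small bidirectional LSTM text classifier (e.g. for binary sentiment analysis). The task, architecture, and hyper-parameters (VOCAB_SIZE, MAX_LEN, layer sizes) are illustrative assumptions, not the authors' published configuration.

    import numpy as np
    import tensorflow as tf
    from tensorflow.keras import layers, models

    VOCAB_SIZE = 10000  # illustrative vocabulary size (assumption)
    MAX_LEN = 200       # illustrative maximum sequence length (assumption)

    # Embedding -> bidirectional LSTM encoder -> sigmoid classifier,
    # e.g. for the binary sentiment analysis task mentioned in the abstract.
    model = models.Sequential([
        layers.Embedding(VOCAB_SIZE, 128),      # token index -> dense vector
        layers.Bidirectional(layers.LSTM(64)),  # sequence encoder
        layers.Dense(64, activation="relu"),
        layers.Dense(1, activation="sigmoid"),  # probability of the positive class
    ])
    model.compile(optimizer="adam",
                  loss="binary_crossentropy",
                  metrics=["accuracy"])

    # Dummy batch of 8 tokenised sequences, padded/truncated to MAX_LEN.
    dummy_batch = np.random.randint(0, VOCAB_SIZE, size=(8, MAX_LEN))
    print(model.predict(dummy_batch).shape)  # (8, 1)

The same skeleton extends to the other models the paper surveys, for example by replacing the LSTM layer with Conv1D layers for a CNN baseline or with attention layers for a transformation-style encoder.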