
Sentiment Analysis using Multiple Word Embedding for Words
S. K. Shirgave1, P. M. Gavali2
1Dr. S. K. Shirgave*, Computer Science and Engineering, DKTE Society’s Textile and Engineering Institute, Ichalkaranji, India.
2Mr. P. M. Gavali, Computer Science and Engineering, DKTE Society’s Textile and Engineering Institute, Ichalkaranji, India.

Manuscript received on April 30, 2020. | Revised Manuscript received on May 06, 2020. | Manuscript published on May 30, 2020. | PP: 1689-1693 | Volume-9 Issue-1, May 2020. | Retrieval Number: A2606059120/2020©BEIESP | DOI: 10.35940/ijrte.A2606.059120
© The Authors. Blue Eyes Intelligence Engineering and Sciences Publication (BEIESP). This is an open access article under the CC-BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/)

Abstract: Nowadays, users express their opinions on different websites such as e-commerce and dedicated review sites. Analyzing customers’ opinions and responses is important for decision making, so researchers have worked on analyzing these reviews automatically using classical machine learning approaches such as Support Vector Machines (SVM) and various modern deep neural networks. In these networks, words are represented by vectors called word embeddings. The required word embeddings are either taken from a pre-trained Word2Vec model or learned from the corpus of the main task, but each method has its demerits. Pre-trained word embeddings are learned from a large general corpus, so they are not task specific, while embeddings learned only from the corpus of the main task do not reflect true semantics. To deal with these problems, we have proposed an embedding developer model that produces task-specific word embeddings which also reflect true semantics. Task-specific word embeddings are generated from the given corpus using the embedding layer in Keras, which builds embeddings by considering relationships between words within a window, while true semantics are taken from Word2Vec embeddings. The proposed model combines these two embeddings to generate word embeddings that are both task specific and semantically faithful. Result analysis shows that the proposed system performs better on many benchmark datasets.
Keywords: Deep Learning, Embedding, Pre-trained Model, Sentiment Analysis
Scope of the Article: Deep Learning
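
The abstract describes combining a task-specific embedding learned with the Keras embedding layer and pre-trained Word2Vec embeddings that carry general semantics. The sketch below is a minimal, hedged illustration of one way such a combination could be wired up in Keras; it is not the authors' exact architecture. The vocabulary size, embedding dimensions, concatenation step, and the convolutional classifier head are all illustrative assumptions, and the Word2Vec matrix is a random placeholder standing in for vectors loaded from a pre-trained model.

```python
# Minimal sketch (assumed architecture, not the paper's exact model):
# combine a frozen pre-trained Word2Vec embedding (general semantics)
# with a trainable Keras Embedding layer (task-specific) by concatenation,
# then feed the combined word representations to a simple sentiment classifier.
import numpy as np
from tensorflow.keras import layers, models

vocab_size, seq_len = 10_000, 100   # illustrative values
w2v_dim, task_dim = 300, 100        # illustrative dimensions

# Placeholder for the Word2Vec matrix; in practice it would be filled from
# a pre-trained model (e.g. gensim KeyedVectors) aligned to the vocabulary.
w2v_matrix = np.random.normal(size=(vocab_size, w2v_dim)).astype("float32")

tokens = layers.Input(shape=(seq_len,), dtype="int32")

# "True semantics" branch: fixed pre-trained Word2Vec embeddings.
w2v_emb = layers.Embedding(vocab_size, w2v_dim,
                           weights=[w2v_matrix],
                           trainable=False)(tokens)

# Task-specific branch: embedding layer learned from the review corpus.
task_emb = layers.Embedding(vocab_size, task_dim,
                            trainable=True)(tokens)

# Combine the two representations for each word position.
combined = layers.Concatenate(axis=-1)([w2v_emb, task_emb])

# Simple downstream sentiment classifier (illustrative head).
x = layers.Conv1D(128, 5, activation="relu")(combined)
x = layers.GlobalMaxPooling1D()(x)
output = layers.Dense(1, activation="sigmoid")(x)

model = models.Model(tokens, output)
model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy"])
model.summary()
```

In this sketch the two embeddings are simply concatenated per token; the paper's "embedding developer model" may combine them differently, but the split between a frozen general-purpose branch and a trainable task-specific branch reflects the trade-off the abstract describes.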