An Automated Research for Emotion Recognition and Generation
Vivek Venugopal1, M.R. Stalin John2, Vasanth Kumar. CH3
1Vivek Venugopal, Department of Mechanical Engineering, SRM Institute of Science and Technology, Chennai (Tamil Nadu), India.
2M.R. Stalin John, Department of Mechanical Engineering, SRM Institute of Science and Technology, Chennai (Tamil Nadu), India.
3Vasanth Kumar. CH, Department of Mechanical Engineering, SRM Institute of Science and Technology, Chennai (Tamil Nadu), India.
Manuscript received on 19 July 2019 | Revised Manuscript received on 03 August 2019 | Manuscript Published on 10 August 2019 | PP: 571-576 | Volume-8 Issue-2S3 July 2019 | Retrieval Number: B11050782S319/2019©BEIESP | DOI: 10.35940/ijrte.B1105.0782S319
© The Authors. Blue Eyes Intelligence Engineering and Sciences Publication (BEIESP). This is an open access article under the CC-BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/)
Abstract: Human Robot Interaction (HRI) is an ongoing and challenging research area. The field is developing rapidly yet is still in an early phase with respect to research. Several algorithms and models are being designed that allow robots to perceive, think, and behave like a human being. Scientists are working rigorously on creating an 'Artificial Mind' as an infrastructure, using computational methods specifically delineated to guide a robot. Artificial Intelligence has moved to the next level, termed 'Embodied Artificial Intelligence', in order to make humanoids smarter. This research deals with a practical robot head that recognizes and mirrors the basic emotions of the user, using visual inputs and Facial Action Coding System (FACS) control points on the human face. The work covers the design and development of three major modules. The first module involves the fabrication of the practical robot head to represent the emotions. The second is the Artificial Intelligence segment, which includes a Fuzzy Logic system; in this segment, inputs are taken from the user in the form of intensities of visual signals, and based on the Fuzzy Logic classifications the respective emotion of the user is recognized and fed forward to the practical head. The third module deals with the interfacing mechanism between the Fuzzy Logic module and the mechanical humanoid head, translating the signals generated by the Fuzzy Logic module into mechanical actions on the practical robot head. The proposed research model is expected to have a significant impact on the study of interaction between humans and robots, and on research into critical diseases and syndromes such as Autism Spectrum Disorder (ASD), Alzheimer's disease, and Bell's palsy.
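To make the pipeline described in the abstract concrete, the sketch below shows one plausible way a fuzzy-logic stage could map FACS action-unit intensities to a basic emotion label. It is an illustrative assumption, not the authors' implementation: the choice of action units (AU12, AU4), the membership breakpoints, and the rule base are all hypothetical placeholders.

```python
# Minimal sketch (illustrative, not the authors' code): fuzzy classification of
# two hypothetical FACS action-unit intensities into a basic emotion label.
# AU choices, membership breakpoints, and rules are assumptions for demonstration.

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def classify(au12_lip_corner, au04_brow_lower):
    """Map normalised AU intensities (0..1) to an emotion via simple fuzzy rules."""
    # Fuzzify the inputs into 'low' and 'high' fuzzy sets.
    lip_high  = tri(au12_lip_corner, 0.4, 1.0, 1.6)   # smile present
    lip_low   = tri(au12_lip_corner, -0.6, 0.0, 0.6)
    brow_high = tri(au04_brow_lower, 0.4, 1.0, 1.6)   # frown present
    brow_low  = tri(au04_brow_lower, -0.6, 0.0, 0.6)

    # Rule base: fuzzy AND = min; one firing strength per emotion.
    strengths = {
        "happy":   min(lip_high, brow_low),
        "angry":   min(lip_low,  brow_high),
        "neutral": min(lip_low,  brow_low),
    }
    # Defuzzify by selecting the most strongly fired emotion; the label would
    # then be forwarded to the interfacing module driving the robot head.
    return max(strengths, key=strengths.get), strengths

if __name__ == "__main__":
    label, detail = classify(au12_lip_corner=0.8, au04_brow_lower=0.1)
    print(label, detail)   # prints 'happy' with the rule firing strengths
```

In a full system of the kind the abstract outlines, the recognized label would be translated by the interfacing module into servo commands that reproduce the expression on the mechanical head.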
Keywords: Artificial Intelligence; Human Robot Interaction (HRI); Artificial Mind; Fuzzy Logic; Facial Action Coding System (FACS).
Scope of the Article: Robotics Engineering