
Robot Manipulator Programming Interface based on Augmented Reality
Alexander Schwandt1, Arkady Yuschenko2

1Alexander Schwandt, Robotic Center of Education and Research, Bauman Moscow State Technical University, Izmaylovskaya, Moscow, Russian Federation.
2Arkady Yuschenko, Professor, Robotic Center of Education and Research, Bauman Moscow State Technical University, Izmaylovskaya, Moscow, Russian Federation.
Manuscript received on 12 October 2019 | Revised Manuscript received on 21 October 2019 | Manuscript Published on 02 November 2019 | PP: 819-823 | Volume-8 Issue-2S11 September 2019 | Retrieval Number: B11330982S1119/2019©BEIESP | DOI: 10.35940/ijrte.B1133.0982S1119
© The Authors. Blue Eyes Intelligence Engineering and Sciences Publication (BEIESP). This is an open access article under the CC-BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/)

Abstract: The cost of integrating industrial robots into small and medium-sized enterprises (SMEs) is nowadays one of the main obstacles to making automated solutions profitable. The standard human-robot interface (HRI) implemented in industrial robots is complicated and time-consuming to use, even for trained technicians. A well-known trend in robotic manipulator development, in the field of intelligent HRI, aims to make automated solutions with industrial robots cost-effective. An HRI based on augmented reality (AR) is presented to shorten the programming phase during commissioning of industrial robots on the production line. The system proposed in this article was implemented on a portable computer together with an AR-Stylus, using safety goggles fitted with a high-resolution camera and a wearable display. The presented hardware setup was used in an experiment to show how the HRI's basic functions, namely intuitive robot programming by demonstration (PbD) and visualisation and simulation of the industrial robot program, could be implemented. A high degree of immersion of the programmer in the AR environment was achieved by configuring a camera to track the 6D position of the AR-Stylus and by projecting, in real time, a 3D robot gripper model that virtually substitutes the AR-Stylus. In contrast to HRIs based on the conventional PbD method, this paper covers a new type of PbD called "virtual robot programming by demonstration" (vPbD). The accuracy of the HRI system in determining the position of the AR-Stylus was analysed experimentally. In a sequential robot program, apart from visualising the endpoints and the trajectories in 3D, a real-scale 3D model projection of the robot gripper is aligned with the real-world environment from the observer's perspective. This enables the operator to detect potential collisions of the robot gripper with its surroundings effectively. Conclusions and further developments pertaining to the study are discussed.
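The abstract describes tracking the 6D pose of the AR-Stylus with a camera and recording demonstrated endpoints for the vPbD program, but does not specify an implementation. The following is a minimal sketch of how such stylus tracking and waypoint recording could look, assuming (not stated in the paper) a fiducial ArUco marker attached to the stylus, a calibrated camera, and OpenCV; the marker size, function names, and data layout are illustrative only.

```python
# Minimal sketch: 6D pose tracking of an AR-Stylus via a fiducial marker
# and recording of demonstrated waypoints for a vPbD program.
# Assumptions (not from the paper): OpenCV ArUco marker on the stylus,
# a calibrated camera (camera_matrix, dist_coeffs), 40 mm marker edge.

import cv2
import numpy as np

MARKER_SIZE = 0.04  # marker edge length in metres (assumed)

# 3D corners of the marker in its own frame, ordered for SOLVEPNP_IPPE_SQUARE
_h = MARKER_SIZE / 2.0
MARKER_CORNERS_3D = np.array([
    [-_h,  _h, 0.0],
    [ _h,  _h, 0.0],
    [ _h, -_h, 0.0],
    [-_h, -_h, 0.0],
], dtype=np.float32)

detector = cv2.aruco.ArucoDetector(
    cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50),
    cv2.aruco.DetectorParameters(),
)

def stylus_pose(frame, camera_matrix, dist_coeffs):
    """Return (rvec, tvec) of the stylus marker in camera coordinates, or None."""
    corners, ids, _ = detector.detectMarkers(frame)
    if ids is None:
        return None
    ok, rvec, tvec = cv2.solvePnP(
        MARKER_CORNERS_3D, corners[0], camera_matrix, dist_coeffs,
        flags=cv2.SOLVEPNP_IPPE_SQUARE,
    )
    return (rvec, tvec) if ok else None

waypoints = []  # demonstrated endpoints of the robot program

def record_waypoint(pose):
    """Store a demonstrated pose as orientation matrix + position vector."""
    rvec, tvec = pose
    rot, _ = cv2.Rodrigues(rvec)           # 3x3 rotation matrix
    waypoints.append((rot, tvec.ravel()))  # later mapped to robot targets
```

In such a setup, the recorded camera-frame poses would still have to be transformed into the robot base frame (for example via a hand-eye or workspace calibration) before being converted into executable robot targets.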
Keywords: Robot Manipulator • Human Robot Interface • Augmented Reality • Robot Programming by Demonstration • Collaborative Robotics • Intuitive Programming.
Scope of the Article: Autonomous Robots