Innovations in Robotics Education: Augmented Reality

 

Jairo Andrés Mantilla Villalobos[1]

jairo2218051@correo.uis.edu.co

https://orcid.org/0000-0003-1933-8362

Universidad Industrial de Santander

Colombia

Estefanía Gómez Gamboa

estefaniagomez070@gmail.com

https://orcid.org/0000-0002-2581-1008

Universidad Industrial de Santander

Colombia

William Pinto Hernandez

wpintoh@uis.edu.co

https://orcid.org/0000-0003-0126-6372

Universidad Industrial de Santander

Colombia

 

 

ABSTRACT

Augmented Reality (AR) is a technological tool that provides interactive experiences to the user by overlaying virtual objects on a real environment. This paper presents a tool developed using augmented reality to study and teach robotic manipulators, focused on universities without the economic resources to acquire a serial manipulator. Three programs were developed that allow the student to visualize and verify essential concepts of robotics such as the transformation of coordinate systems, the Denavit-Hartenberg methodology, and the configuration of a robotic manipulator, among others. The application allows the student to interact virtually with different manipulators, control the robot's joints using the keyboard, and visualize, in real time on the screen, distances and parameters with respect to a fixed coordinate system. Because the manipulators are rendered, it is possible to perform path planning and collision analysis on a path given by the user. The application is easily modifiable, allowing the inclusion of new manipulators and the development of didactic strategies for students.

 

Keywords: applications of augmented reality in education; augmented reality (AR); path planning; robotic manipulators


 

Innovaciones en la Enseñanza de la Robótica: Realidad Aumentada

 

RESUMEN

La Realidad Aumentada (RA) es una tecnología que proporciona experiencias interactivas al usuario superponiendo objetos virtuales en un ambiente real. En este trabajo se presenta una herramienta desarrollada usando realidad aumentada para el estudio y enseñanza de manipuladores robóticos, dirigida especialmente a universidades sin los recursos económicos para adquirir un manipulador serial. Fueron desarrollados tres programas que le permiten al estudiante visualizar y rectificar conceptos esenciales de la robótica como: transformación de sistemas coordenados, metodología de Denavit – Hartenberg, configuración de un manipulador robótico, entre otros. El aplicativo permite al estudiante interactuar de manera virtual con diferentes manipuladores, controlando mediante el uso del teclado las articulaciones del robot, y visualizando en pantalla en tiempo real distancias y parámetros respecto a un sistema coordenado fijo. Teniendo en cuenta que los manipuladores se encuentran renderizados, es posible realizar análisis de trayectoria y colisiones en un recorrido dado por el usuario. El aplicativo es fácilmente modificable, lo que permite la inclusión de nuevos manipuladores y el desarrollo de estrategias innovadoras.

 

Palabras clave: aplicaciones de la realidad aumentada en la educación; manipuladores robóticos; planeación de trayectoria; realidad aumentada (RA)

 

 

 

 

Article received: November 15, 2023

Accepted for publication: December 20, 2023


 

INTRODUCTION

Technological advancements have compelled traditional teaching models to incorporate them for educational interaction (Viñas, 2021). Augmented reality (AR) combines physical objects with virtual elements in real time, creating an immersive educational experience (Hernández et al., 2017). Multiple studies have shown that integrating AR into the educational process generates formative experiences, improves student motivation, and provides satisfaction (Rivadulla & Rodríguez, 2020). This modality also fosters users' imagination, creativity, and engagement with the subject (Cabero-Almenara et al., 2018). Additionally, the economic benefits of AR implementation are recognized, as it consumes 90% less energy than traditional methods and, through virtual components, avoids the acquisition of expensive equipment (Alahmari et al., 2019; Núñez, 2014).

The use of AR in robotics can be categorized into route planning, which involves incorporating trajectories with obstacles and preferring the path that consumes less energy (Fang et al., 2012); training, where the operation of the manipulator is indicated to the user through signals, reducing training hours (Makhataeva & Varol, 2020); and determining the orientation of the end effector with respect to an established reference system (Chong et al., 2009).

The application presented here promotes economically viable and accessible educational processes by allowing users to interact with serial manipulators in a physical environment (Akçayır & Akçayır, 2017). It aims to strengthen fundamental concepts in direct kinematics, such as linear transformations, homogeneous transformation matrices, the Denavit-Hartenberg methodology, and types of joints, among others. Its focus is especially relevant for universities and educational institutes facing economic limitations in acquiring commercial manipulators.

METHODOLOGY

The application aims to strengthen concepts related to the operation of serial manipulators; these concepts are elaborated below.

Denavit-Hartenberg Methodology

When converting a three-dimensional object to a virtual space, the use of homogeneous coordinates is appropriate because, when scaled, they maintain their relative position with respect to other points. They are ideal for performing the matrix transformations that represent and manipulate objects in a virtual environment (López-Villagómez et al., 2022). The Denavit-Hartenberg methodology determines the location of the inertial systems of the joints of a serial manipulator by defining the relative transformations between consecutive coordinate systems SC(i−1) and SC(i) through four parameters (Ding & Liu, 2018). The transformations between coordinate systems expressed by these parameters are:

A translation of d units in the direction Zi−1.

A rotation of θ degrees around the axis Zi−1.

A translation of a units in the Xi direction.

A rotation of α degrees around the Xi axis.

Where i refers to the current joint, i ∈ {1, 2, . . . , n}, with n the number of joints.

The transformation between consecutive coordinate systems can be expressed by a matrix with the same general expression (Corke, 2007; Steinparz, 1985). The transformation between the inertial coordinate system and the end effector of a robotic manipulator with n degrees of freedom is obtained as

H0n = A1 · A2 · … · An    (1)

Equation 1 determines the complete transformation, where the matrix H0n contains the relative rotation and translation operations between the coordinate systems, and Ai is the Denavit-Hartenberg matrix for the consecutive coordinate systems of the joints established in the methodology.
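The four parameters above compose into one homogeneous matrix per joint, and Equation 1 chains those matrices from the base to the end effector. The composition can be sketched in Python with NumPy; the two-link parameters in the usage note below are illustrative, not those of the paper's manipulators:

```python
import numpy as np

def dh_matrix(theta, d, a, alpha):
    """Transform between consecutive DH frames SC(i-1) -> SC(i):
    rotate theta about z, translate d along z,
    translate a along x, rotate alpha about x."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_transform(dh_params):
    """Equation 1: H0n as the product of the per-joint DH matrices."""
    H = np.eye(4)
    for theta, d, a, alpha in dh_params:
        H = H @ dh_matrix(theta, d, a, alpha)
    return H
```

For a planar two-link arm with unit link lengths, `forward_transform([(t1, 0, 1, 0), (t2, 0, 1, 0)])` yields the familiar end-effector position (cos t1 + cos(t1 + t2), sin t1 + sin(t1 + t2), 0) in the translation column of the result.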

Homogeneous transformation matrix

A transformation matrix contains information about the rotation, translation, or deformation of a coordinate system with respect to another; in this case, the transformation between the coordinate system of the camera and the one located at the center of the marker (Figure 1).

The homogeneous transformation matrix (Figure 2) is a tensor that divides into two parts: a 3 × 3 array that stores all the rotations, and a column vector that contains the necessary translation between the coordinate systems along each reference axis (Pérez et al., 2014). Throughout the study, homogeneous transformation matrices were obtained in two ways depending on their application: the matrices representing the transformation between the camera and the coordinate system of each marker (Figure 1) were calculated using the Processing software, while the transformation matrices of the robotic manipulators were obtained using the Denavit-Hartenberg methodology.
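The rotation-plus-translation structure described above can be sketched directly; the closed-form inverse shows why the two blocks are convenient. This is a hedged NumPy illustration, not the paper's Processing code:

```python
import numpy as np

def compose(R, t):
    """Build the 4x4 homogeneous matrix from a 3x3 rotation R
    and a translation vector t (the two blocks of Figure 2)."""
    H = np.eye(4)
    H[:3, :3] = R
    H[:3, 3] = t
    return H

def invert(H):
    """Closed-form inverse of a rigid transform:
    R^T in the rotation block, -R^T t in the translation block."""
    R, t = H[:3, :3], H[:3, 3]
    return compose(R.T, -R.T @ t)
```

Because the rotation block is orthonormal, no general matrix inversion is needed; `invert(H) @ H` returns the identity up to floating-point error.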

Figure 1. Coordinate systems and transformation matrices between the camera and augmented reality markers


Source: Authors

 

Figure 2. Homogeneous transformation matrix.


Source: Authors

 

Augmented Reality

Figure 3 displays the essential components required to initiate the application and a general outline of its functionality. First, the camera captures an image of the environment where the activity will take place. The image then undergoes a segmentation and identification process, following the approach proposed by Gupta & Rohil (2017), to recognize predetermined markers. These allow the workspace to be dimensioned based on the methodology described by Hirzer (2008). After the markers are identified, the rendered objects for the selected application are displayed.

The elements shown in Figure 3 are economically accessible for educational purposes and do not represent a significant expense. The camera used in the implementation does not require high resolution, allowing the use of the computer's built-in camera. The user can choose the size of the markers and obstacles, which allows flexibility in their design. This study employed markers of 7 x 7 cm and a USB camera with a resolution of 1080 x 720 px.

Figure 3. Components used in augmented reality. a) Computer. b) Pre-established Kanji marker. c) Obstacle. d) High-definition USB camera, configured at a resolution of 640 x 360.

Source: Authors

 

DEVELOPMENT OF THE APPLICATION

CAD modeling

Three manipulators with different degrees of freedom (DOF) were modeled. The second program includes the SCARA robot and a robot proposed by the authors. The third program implements the STANFORD robot (Figure 4). All components are exported in the 3D object (.obj) format to ensure compatibility with Processing software.

Figure 4. Robotic manipulators used in the application. a) Proposed robot of 4 DOF. b) SCARA robot of 4 DOF. c) STANFORD robot of 6 DOF.

Source: Authors

 

The homogeneous transformation matrix between the coordinate system of the camera and each of the markers is obtained through the NyARToolkit 3.0.7 library (GitHub, 2016). The transformation matrices between the base and the end effector are developed using the Denavit-Hartenberg methodology.

Figure 5 shows the functional diagram and the development of the Denavit-Hartenberg methodology for the proposed 4-DOF manipulator (Figure 4a), presenting the dimensions, coordinate systems, and necessary parameters. The same process is followed for the other manipulator models.

Figure 5. Denavit-Hartenberg methodology for the proposed 4-DOF manipulator: dimensions, coordinate systems, and parameters.

Source: Authors.

 

Equations (2), (3), and (4) present the coordinates of the end effector with respect to the manipulator's base. These coordinates are used to calculate the transformations between the end effector and the camera, including the matrices presented in Figure 1.

Implementation of augmented reality in processing

The programs were developed in version 4.0b8 of the open-source software Processing (Processing Foundation, 2022), which is based on the Java language. The program executes the following functions:

a)       Camera calibration is necessary to extract 3D information and metrics from images. There are different techniques to obtain the intrinsic and extrinsic parameters of the camera (Viala & Salmerón, 2008). The Single Camera Calibrator Toolbox from MATLAB (The MathWorks, Inc, 2013) was implemented to obtain the parameters through a calibration session. For the pinhole camera model (Juarez-Salazar et al., 2020), the following values were obtained: focal distances in pixels fx = 523.34 and fy = 524.64, principal point cx = 304.69, cy = 196.85, radial distortion coefficients k1 = 0.3757 and k2 = 0.1167, and tangential distortion coefficients p1 = −5.271 × 10⁻⁴ and p2 = 2.281 × 10⁻⁴.

b)      Functions related to the camera's operation (link, capture, and rendering) are implemented with the Processing Video 2.0 library (Processing Foundation, 2022).

c)       The NyARToolkit 3.0.7 library (GitHub, 2016) executes AR functions such as marker tracking, camera position estimation, and 3D object rendering.
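The calibration values reported above can be assembled into the standard distorted pinhole projection. The following is a sketch in Python rather than the paper's MATLAB/Processing pipeline; the variable names follow the usual convention for the symbols (focal lengths fx, fy; principal point cx, cy; radial coefficients k1, k2; tangential coefficients p1, p2):

```python
import numpy as np

# Calibrated pinhole parameters reported in the text
fx, fy = 523.34, 524.64       # focal distances in pixels
cx, cy = 304.69, 196.85       # principal point
k1, k2 = 0.3757, 0.1167       # radial distortion coefficients
p1, p2 = -5.271e-4, 2.281e-4  # tangential distortion coefficients

def project(point_cam):
    """Project a 3D point given in the camera frame to pixel coordinates
    using the standard radial/tangential (Brown-Conrady) distortion model."""
    X, Y, Z = point_cam
    x, y = X / Z, Y / Z                      # normalized image coordinates
    r2 = x * x + y * y
    radial = 1 + k1 * r2 + k2 * r2 * r2
    xd = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    yd = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return fx * xd + cx, fy * yd + cy
```

A point on the optical axis, such as (0, 0, 1), projects exactly onto the principal point (cx, cy), which is a quick sanity check for any calibration session.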

Four complementary programs were developed for the learning of serial manipulators. These programs interact with the user in real time through marker detection and keyboard input.

RESULTS

Program 1

Program 1, "Marker-based coordinate system," uses augmented reality (AR) to find the homogeneous transformation matrix between the camera and the markers and calculates the parameters of interest between the markers, considering the camera as an inertial system. The program displays coordinate systems at the center of each marker, with the z-axis perpendicular to the marker and its origin at the center of the figure. The parameters of each marker, measured with respect to the reference system, are shown in real time at the bottom of the screen, and any change in the orientation or position of the markers updates them. Additionally, the program displays the distance between markers.

The user can observe these values in real-time and interact with the markers to see how each parameter changes.
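The marker-to-marker quantities the program displays follow from the two camera-to-marker matrices of Figure 1. A NumPy sketch, under the assumption that each pose arrives as a 4 × 4 homogeneous matrix:

```python
import numpy as np

def relative_transform(H_cam_m1, H_cam_m2):
    """Transform from marker 1's frame to marker 2's frame, given each
    marker's pose as seen from the camera (the matrices of Figure 1)."""
    return np.linalg.inv(H_cam_m1) @ H_cam_m2

def marker_distance(H_cam_m1, H_cam_m2):
    """Euclidean distance between the origins of the two markers."""
    t = relative_transform(H_cam_m1, H_cam_m2)[:3, 3]
    return float(np.linalg.norm(t))
```

Chaining the inverse of one pose with the other cancels the camera frame, so the result no longer depends on where the camera sits, only on the markers' relative placement.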

Figure 6. Program 1, "Marker-based coordinate system"


Source: Authors

 

Program 2

"Serial manipulators": this program aims to reinforce concepts such as types of joints, geometry, workspace, and the Denavit-Hartenberg methodology. The user can observe and operate the SCARA manipulator and the proposed model through the computer keyboard, moving each of the robot's joints. The distances, coordinates, and angles between the manipulators' effectors are computed in real time and displayed at the bottom of the screen, and the D-H matrices were calculated for each manipulator.


 

Figure 7. Program 2, "Serial Manipulators": the program prints the angles and distances of the joints from left to right and the separation between the end effector and the marker. The same data are presented for the other manipulator.


Source: Authors

 

Program 3

The operation of the Stanford robot in Program 3 is similar to that of Program 2, but it differs in the implementation of the "kanji" marker and the Stanford manipulator with 6 degrees of freedom (Figure 4c).

Figure 8. Program 3 ’Stanford Robot’


Source: Authors

 

 

Program 4

"Stanford Robot for Trajectory Analysis" employs the manipulator from Program 3 to allow the user to control the robot joints via the keyboard and record the desired movement. The joint variables for each configuration of the robot are stored frame by frame, and once the trajectory is over, the path traced by the virtual object attached to the effector is projected on the screen. The user can move the camera to obtain new perspectives and determine whether the trajectory was performed without obstacles or unwanted collisions, as shown in Figure 9. Additionally, the program displays the workspace of this manipulator. Notably, any robotic manipulator can be modeled and integrated into the application with some programming modifications.
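The record-then-check workflow can be sketched in a few lines of Python. This is an illustration, not the paper's Processing implementation: the forward-kinematics function `fk` and the spherical obstacle are assumptions introduced for the example.

```python
import numpy as np

def record_frame(trajectory, joint_angles, fk):
    """Store the joint variables and the end-effector position for one frame.
    `fk` maps a joint vector to the effector's (x, y, z) in the base frame."""
    trajectory.append((list(joint_angles), fk(joint_angles)))

def collides(trajectory, obstacle_center, obstacle_radius):
    """Flag a collision if any recorded effector position
    enters a spherical obstacle."""
    for _, pos in trajectory:
        d = np.linalg.norm(np.asarray(pos) - np.asarray(obstacle_center))
        if d < obstacle_radius:
            return True
    return False
```

Because every frame keeps the joint vector alongside the effector position, the same record can be exported later as input for control processes, as the paper notes for its trajectory-analysis program.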

Figure 9. Program 4 ’Stanford Robot for trajectory analysis’. Example of collision-free trajectory: a) Initial configuration of the manipulator, b) Front view of the trajectory, c) Side view, d) Top view, e) Workspace.

Source: Authors

 

 

DISCUSSION

The results obtained in the development of the program are satisfactory: as mentioned before, the applications strengthen the fundamental concepts, and the necessary elements for their implementation are affordable and convenient for educational institutions. In the future, a test will be performed with the students of the robotics course at Universidad Industrial de Santander to measure the influence of the application on the assimilation of concepts.

Although the test has not been performed yet, reviewing the results of other authors (Basogain et al., 2007; Yarin & Gamarra, 2022) makes it possible to affirm that augmented reality is particularly enriching in fields that require the development of spatial skills. Additionally, it helps the teacher identify critical areas (Mayol et al., 2020). In future projects, a series of activities with specific goals is planned to evaluate the students' performance in solving a problem in a modality similar to that exposed by these authors.

Compared to a similar work (Mayol et al., 2020), the developed application simulates the trajectory based on a robot located at the educational center. Similar results were obtained in the software development, although that work stands out for recording the trajectory in three dimensions through different cameras and for its implementation in C. For future work, registering the path in three dimensions will be added to evaluate the student's success, along with a fixed-obstacle guide to assess the student's accuracy.

Finally, a study implemented mixed reality (Fang et al., 2012), in which the student controls the manipulator through AR glasses for trajectory planning while a robot in the facility simultaneously performs the same path. Although this modality allows testing the user's training, it will not be implemented in future work because it makes the presence of the manipulator mandatory and does not show a significant advantage for the learning experience.

In conclusion, reviewing the work of other authors gives a sense of the project's greater future scope; however, the test with students to verify how the application impacts their learning experience is a priority. It is satisfying to have achieved the goal of creating a low-cost application, since it operates on freely available software, the resources required are accessible, and its computational cost is moderate.

 

CONCLUSIONS

Through the various programs presented, the application promotes student learning of augmented reality applications, coordinate system transformation, robotic manipulators, types of joints, degrees of freedom, workspace, and trajectory planning.

The ability to instantly update the parameters measured by the camera, the configuration of the manipulators, and the coordinates of their effectors makes AR an interactive technology for learning. Visually representing robotic manipulators in a physical environment yields relevant advantages for trajectory analysis, since it allows direct verification that no collisions or unwanted paths are generated along the course.

The software development is flexible, allowing the user to incorporate any robotic manipulator with few changes in the programming. The trajectory-analysis program exports the data of each joint for each stored position, data that can serve as input for the development of control processes.

BIBLIOGRAPHIC REFERENCES

Alahmari, M., Issa, T., Issa, T., & Nau, S. Z. (2019). Faculty awareness of the economic and environmental benefits of augmented reality for sustainability in Saudi Arabian universities. Journal of Cleaner Production, 226, 259-269. [DOI: 10.1016/j.jclepro.2019.04.09]

Akçayır, M., & Akçayır, G. (2017). Advantages and challenges associated with augmented reality for education: A systematic review of the literature. Educational Research Review, 20, 1-11. [DOI: 10.1016/j.edurev.2016.11.002]

Basogain, X., Olabe, M., Espinosa, K., Rouèche, C., & Olabe, J. C. (2007). Realidad Aumentada en la Educación: una tecnología emergente. Escuela Superior de Ingeniería de Bilbao, EHU. [Enlace: http://bit.ly/2hpZokY]

Cabero-Almenara, J., Vázquez-Cano, E., & López Meneses, E. (2018). Uso de la realidad aumentada como recurso didáctico en la enseñanza universitaria. Formación universitaria, 11(1), 25-34. [DOI: 10.4067/S071850062018000100025]

Chong, J. W. S., Ong, S. K., Nee, A. Y., & Youcef-Youmi, K. B. (2009). Robot programming using augmented reality: An interactive method for planning collision-free paths. Robotics and Computer-Integrated Manufacturing, 25(3), 689-701. [DOI: 10.1016/j.rcim.2008.05.002]

Corke, P. I. (2007). A simple and systematic approach to assigning Denavit–Hartenberg parameters. IEEE transactions on robotics, 23(3), 590-594. [DOI: 10.1109/TRO.2007.896765]

Ding, F., & Liu, C. (2018). Applying coordinate fixed Denavit–Hartenberg method to solve the workspace of drilling robot arm. International journal of advanced robotic systems, 15(4). [DOI: 10.1177/1729881418793283]

Fang, H. C., Ong, S. K., & Nee, A. Y. C. (2012). Robot path and endeffector orientation planning using augmented reality. *Procedia CIRP, 3*, 191-196. [DOI:10.1016/j.procir.2012.07.034]

Fang, H. C., Ong, S. K., & Nee, A. Y. C. (2012). Interactive robot trajectory planning and simulation using augmented reality. Robotics and Computer-Integrated Manufacturing, 28(2), 227-237. [DOI: 10.1016/j.rcim.2011.09.003]

GitHub. (2016). NyARToolkit for Processing. [Enlace: https://github.com/nyatla/NyARToolkit-forProcessing/blob/master/README.EN.md]

Gupta, N., & Rohil, M. K. (2017). Exploring possible applications of augmented reality in education. In 2017 4th International Conference on Signal Processing and Integrated Networks (SPIN), 437-441. [DOI: 10.1109/SPIN.2017.8049989]

Hernández, C. A. V., Martínez, A. R., & Ceballos, N. D. M. (2017). Caracterización de marcadores de realidad aumentada para su uso en robótica. Revista Politécnica, 13(25), 87-102. [DOI: 10.33571/rpolitec.v13n25a7]

Hirzer, M. (2008). Marker detection for augmented reality applications. Seminar/Project Image Analysis, Graz. [Enlace: https://www.researchgate.net/profile/MartinHirzer/publication/321253407_Marker_Detection_for_Augmented_Reality_Applications/links/5a1700050f7e9be37f957b25/Marker-Detection-for-Augmented-Reality-Applications.pdf]

Juarez-Salazar, R., Zheng, J., & Diaz-Ramirez, V. H. (2020). Distorted pinhole camera modeling and calibration. Applied Optics, 59(36), 11310-11318. [DOI: 10.1364/AO.412159]

López-Villagómez, J. M., Rodríguez-Doñate, C., & Mata-Chávez, R. I. (2022). Análisis de la exactitud para un robot de tres grados de libertad. Jovenes en la ciencia, (18), 1-7. [Enlace: https://www.jovenesenlaciencia.ugto.mx/index.php/jovenesenlaciencia/article/view/3858]

Makhataeva, Z., & Varol, H. A. (2020). Augmented reality for robotics: A review. Robotics, 9(2), 21. [DOI: 10.3390/robotics9020021]

Mayol Céspedes, I., Leyva Regalón, J. A., & Leyva Reyes, J. A. (2020). Software de Realidad Aumentada para la enseñanza-aprendizaje de la asignatura Informática en la Ingeniería Mecánica. Enseñanza y Aprendizaje de Ingeniería de Computadores, 67(10). [DOI: 10.30827/Digibug.64783]

Netscape Communications & Foundation Mozilla. (2021). JavaScript ECMAScript 202. [Enlace: https://www.javascript.com]

Núñez, Y. J. D. (2014). Factibilidad tecnológica de aplicar realidad aumentada en la carrera ingeniería en ciencias informáticas. 3c TIC: cuadernos de desarrollo aplicados a las TIC, 3(4), 228-239. [Enlace: https://dialnet.unirioja.es/servlet/articulo?codigo=4903918]

Rivadulla López, J. C., & Rodríguez Correa, M. (2020). La incorporación de la realidad aumentada en las clases de ciencias. Contextos educativos: revista de educación, 25, 237-255. [DOI: 10.18172/con.3865]

Pérez Cisneros, M. A., Cuevas, E., Zaldívar Navarro, D. (2014). Fundamentos de robótica y mecatrónica con Matlab y Simulink, 1st ed. Ra-Ma S.A. Editorial y Publicaciones.

Processing Foundation. (2022). Processing 4.0. [Enlace: https://processing.org]

Processing Foundation. (2022). Video. [Enlace: https://processing.org/reference/libraries/video/index.html]

Quintero, C. P., Li, S., Pan, M. K., Chan, W. P., Van der Loos, H. M., & Croft, E. (2018). Robot programming through augmented trajectories in augmented reality. In IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 1838-1844.

Steinparz, F. X. (1985). Co-ordinate transformation and robot control with Denavit–Hartenberg matrices. Journal of microcomputer applications, 303-316. [DOI: 10.1016/0745-7138(85)90031-4]

The MathWorks, Inc. (2013). Camera Calibrator. [Enlace: https://la.mathworks.com/help/vision/ref/cameracalibratorapp.html]

Viala, C. R., & Salmerón, A. J. S. (2008). Procedimiento completo para el calibrado de cámaras utilizando una plantilla plana. Revista Iberoamericana de Automática e Informática Industrial RIAI, 5(1), 93-101. [DOI: 10.1016/S1697-7912(08)70126-2]

Viñas, M. (2021). Retos y posibilidades de la educación híbrida en tiempos de pandemia. Plurentes, 11. [DOI: 10.24215/18536212e027]

Yarin Achachagua, Y.H., & Gamarra Chinchay, H.E. (2022). La realidad aumentada y su efecto en la habilidad espacial de estudiantes de ingeniería mecánica. Revista de educación a distancia, 22(70). [Enlace: http://dx.doi.org/10.6018/red.509931]



[1] Autor principal

Correspondencia: jairo2218051@correo.uis.edu.co