Human-Robot Interaction Using Open BCI Through Visual Simulation Using Python Programming

Authors

  • Anantha Kamal Nallamothu Mechanical Engineering Department, School of Industrial and Information Engineering, Politecnico Di Milano, Milan, Italy
  • Seshu Kishan Nallamothu Mechanical Engineering Department, School of Industrial and Information Engineering, Politecnico Di Milano, Milan, Italy

Keywords:

Human-robot interaction (HRI), brain-computer interfaces (BCIs), brain signals

Abstract

To improve the efficiency of robots that interact with humans in the real world, HRI (human-robot interaction) researchers may work as roboticists, designing cutting-edge robotic systems with real-world applications. Motivated by the aim of restoring independence to severely injured persons and by curiosity about further extending human control over external systems, researchers from many domains are participating in this challenging new endeavor. BCI research and development has evolved explosively over the last two decades. Brain-computer interfaces (BCIs) enable their users to communicate with and control external devices using brain signals rather than the brain's normal output channels of peripheral nerves and muscles. In this study, we evaluate the present state and future possibilities of BCI technology and investigate emerging issues that are projected to have a substantial influence on the field.
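A typical OpenBCI-style pipeline of the kind described above acquires EEG samples, extracts spectral features (e.g., alpha- and beta-band power), and maps them to a control command for an external device. The following is a minimal sketch of that feature-extraction-and-decision step, using NumPy and a synthetic signal in place of a live OpenBCI stream; the function names, the 250 Hz sampling rate, and the simple alpha-vs-beta rule are illustrative assumptions, not the authors' actual implementation.

```python
import numpy as np

def band_power(signal, fs, low, high):
    """Average spectral power of `signal` in the [low, high] Hz band."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= low) & (freqs <= high)
    return psd[mask].mean()

# Synthetic single-channel "EEG": a 10 Hz alpha rhythm plus noise,
# standing in for data an OpenBCI board would stream at ~250 Hz.
fs = 250
t = np.arange(0, 4, 1.0 / fs)
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 10 * t) + 0.2 * rng.standard_normal(t.size)

alpha = band_power(eeg, fs, 8, 13)   # alpha band (8-13 Hz)
beta = band_power(eeg, fs, 13, 30)   # beta band (13-30 Hz)

# Toy control rule (hypothetical): dominant alpha -> "relax" command,
# otherwise "focus"; a real system would feed this to the robot/simulation.
command = "relax" if alpha > beta else "focus"
print(command)
```

In a real setup, the synthetic array would be replaced by samples read from the acquisition board, and the decoded command would drive the Python-based visual simulation.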

References

[1] M. Ciccarelli et al., "A system to improve the physical ergonomics in Human-Robot Collaboration," Procedia Comput. Sci., vol. 200, pp. 689-698, 2022, https://doi.org/10.1016/j.procs.2022.01.267

[2] A. Chiurco et al., "Real-time Detection of Worker's Emotions for Advanced Human-Robot Interaction during Collaborative Tasks in Smart Factories," Procedia Comput. Sci., vol. 200, pp. 1875-1884, 2022, https://doi.org/10.1016/j.procs.2022.01.388

[3] B. Bai, Softw. Impacts, p. 100305, 2022, https://doi.org/10.1016/j.simpa.2022.100305

[4] P. Segura, O. Lobato-Calleros, A. Ramírez-Serrano, and I. Soria, "Human-robot collaborative systems: Structural components for current manufacturing applications," Adv. Ind. Manuf. Eng., vol. 3, p. 100060, 2021, https://doi.org/10.1016/j.aime.2021.100060

[5] E. Rosen, T. Groechel, M. E. Walker, C. T. Chang, and J. Z. Forde, "Virtual, augmented, and mixed reality for human-robot interaction (VAM-HRI)," ACM/IEEE Int. Conf. Human-Robot Interact., pp. 721-723, 2021, https://doi.org/10.1145/3434074.3444879

[6] H. S. Choi et al., "On the use of simulation in robotics: Opportunities, challenges, and suggestions for moving forward," Proc. Natl. Acad. Sci. U. S. A., vol. 118, no. 1, pp. 1-9, 2021, https://doi.org/10.1073/pnas.1907856118

[7] M. Val-Calvo, J. R. Alvarez-Sanchez, J. M. Ferrandez-Vicente, and E. Fernandez, "Affective Robot Story-Telling Human-Robot Interaction: Exploratory Real-Time Emotion Estimation Analysis Using Facial Expressions and Physiological Signals," IEEE Access, vol. 8, pp. 134051-134066, 2020, https://doi.org/10.1109/ACCESS.2020.3007109

[8] Z. Makhataeva and H. A. Varol, "Augmented reality for robotics: A review," Robotics, vol. 9, no. 2, 2020, https://doi.org/10.3390/robotics9020021

[9] W. Zhao, J. P. Queralta, and T. Westerlund, "Sim-to-Real Transfer in Deep Reinforcement Learning for Robotics: A Survey," 2020 IEEE Symp. Ser. Comput. Intell. SSCI 2020, pp. 737-744, 2020, https://doi.org/10.1109/SSCI47803.2020.9308468

[10] W. F. Shih, K. Naruse, and S. H. Wu, "Implement human-robot interaction via robot's emotion model," Proc. - 2017 IEEE 8th Int. Conf. Aware. Sci. Technol. iCAST 2017, vol. 2018-January, no. iCAST, pp. 580-585, 2017, https://doi.org/10.1109/ICAwST.2017.8256522

[11] L. Fichera, F. Messina, G. Pappalardo, and C. Santoro, "A Python framework for programming autonomous robots using a declarative approach," Sci. Comput. Program., vol. 139, pp. 36-55, 2017, https://doi.org/10.1016/j.scico.2017.01.003

[12] Z. Wang, E. Giannopoulos, M. Slater, A. Peer, and M. Buss, "Handshake: Realistic human-robot interaction in haptic enhanced virtual reality," Presence Teleoperators Virtual Environ., vol. 20, no. 4, pp. 371-392, 2011, https://doi.org/10.1162/PRES_a_00061

Published

2022-04-16

How to Cite

[1]
Anantha Kamal Nallamothu and Seshu Kishan Nallamothu. 2022. Human-Robot Interaction Using Open BCI Through Visual Simulation Using Python Programming. AG Volumes. (Apr. 2022), 45–55.