
Muhammad Ilhamdi Rusydi
Andre Paskah Gultom
Adam Jordan
Rahmad Novan Nurhadi
Darwison Darwison

Abstract

This study presents an assistive control system for a four-degree-of-freedom (4-DoF) robotic manipulator that integrates image-based spatial perception with electrooculography (EOG)-based human–machine interaction for three-dimensional object retrieval. The system is motivated by the need for intuitive, non-contact assistive technologies to support individuals with severe motor impairments, such as tetraplegia, in performing basic manipulation tasks. The proposed framework employs an orthogonal dual-camera vision configuration to achieve explicit 3D target localization, where planar object positions on the XY plane and depth along the Z axis are estimated using focal length–based geometric modeling. User commands are generated through an EOG interface, in which eye movements and voluntary blinks are classified using a K-Nearest Neighbor (KNN) algorithm to control manipulator motion. Compared to conventional assistive robotic systems that rely on depth sensors or high-degree-of-freedom manipulators, the proposed approach utilizes asymmetric monocular viewpoints and a minimal 4-DoF architecture to reduce system complexity. Experimental results demonstrate high performance, achieving average localization accuracies of 99.52% on the XY plane and 95.88% along the Z axis, as well as an EOG classification accuracy of 94.38%. Manipulation experiments confirmed reliable operation with a 100% task success rate, while task completion time and positional error increased gradually with target distance. These findings validate the feasibility of the proposed system as a low-complexity, high-accuracy assistive robotic solution for rehabilitation and human–machine interaction applications.
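The abstract's "focal length–based geometric modeling" for depth is not detailed on this page; a minimal sketch, assuming the standard pinhole-camera relation Z = f·W/w with a known real object width (the function name, parameters, and numbers below are illustrative, not taken from the paper):

```python
def estimate_depth(focal_length_px: float, real_width_mm: float,
                   pixel_width: float) -> float:
    """Pinhole-camera depth estimate along the optical axis.

    Z = f * W / w, where f is the focal length in pixels, W is the
    object's known physical width, and w is its apparent width in pixels.
    """
    return focal_length_px * real_width_mm / pixel_width

# Example: with an 800 px focal length, a 50 mm object appearing
# 100 px wide is estimated to lie 400 mm from the camera.
z = estimate_depth(800.0, 50.0, 100.0)  # 400.0 mm
```

In a dual-camera setup like the one described, one camera would supply the XY position on its image plane while the orthogonal view resolves Z with a relation of this form.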


How to Cite
Rusydi, M. I., Gultom, A. P., Jordan, A., Nurhadi, R. N., & Darwison, D. (2026). Electrooculography Based Control of a Robotic Manipulator with Dual Cameras for Object Retrieval. International Journal of Basic and Applied Science, 14(4), 137–151. https://doi.org/10.35335/ijobas.v14i4.798