Gaze Controlled Robotic Arm for Block Printing

International Journal of Intelligent Robotics and Applications (Springer)

The proposed method aims to bridge the mobility gap for individuals with Severe Speech and Motor Impairment (SSMI) by deploying a safe, intelligent robotic arm that users control with their eye gaze.

Abstract

Recent advances in assistive robotic systems address the challenges faced by individuals with severe speech and motor impairment (SSMI) caused by neuro-motor disabilities such as Amyotrophic Lateral Sclerosis (ALS) and Cerebral Palsy, which often hinder natural interaction and object manipulation and force dependence on caretakers for basic tasks. For rehabilitation, robotic manipulators let individuals with motor impairments practice grasping, holding, and manipulating objects, thereby promoting motor skill improvement. Block printing is employed to foster a sense of independence and to support rehabilitation through vocational training. However, common difficulties in block and fabric printing, such as misaligned prints, inconsistent dyeing, and print holders that are hard to grasp, make the task physically demanding and monotonous, inhibiting natural creativity. The proposed system integrates a video see-through, eye-gaze-controlled graphical user interface for operating a robotic arm for block printing, together with an Adaptive Stamp Localization Algorithm (ASLA) that determines stamp printing locations. Safety is crucial, particularly in educational settings where students with SSMI may operate robot manipulators. To protect the end user, the system generates optimal obstacle-avoiding trajectories using various path planning algorithms; the Dynamic A* algorithm is chosen for its ability to maintain a minimum safe distance between the robot and obstacles, averaging 6.084 cm, while still prioritizing the shortest path to the printing location. A study with 11 participants (5 individuals with SSMI and 6 able-bodied participants) demonstrated the system's safety and usability across all users, with tasks completed within 2 minutes on average. With practice, users became more familiar with the system, leading to a 29.41% decrease in task completion time.

Overview of the Proposed System


The proposed assistive technology system features a video see-through interface for controlling a robotic arm with eye gaze during stamp printing tasks. The UI supports stamp design selection, choice of printing location, and real-time feedback on already-stamped areas by leveraging a novel Adaptive Stamp Localization Algorithm (ASLA), enhancing user autonomy. Path planning algorithms, combined with upper-limb segmentation using instance segmentation models, generate obstacle-free trajectories to the printing locations, mitigating interference from the user's hand and forearm.
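As a rough illustration of the planning step (not the paper's implementation), the sketch below treats cells covered by a hypothetical instance-segmentation mask of the forearm as obstacles in an occupancy grid, inflates them by a safety margin, and searches for a collision-free path with a basic A*. Grid sizes, coordinates, and helper names are all illustrative assumptions.

```python
# Minimal sketch: occupancy-grid A* that avoids cells covered by a
# segmentation mask (e.g., the user's hand and forearm), with a safety margin.
import heapq

def inflate_obstacles(obstacles, rows, cols, margin=1):
    """Grow each obstacle cell by `margin` cells to keep a safety clearance."""
    inflated = set()
    for (r, c) in obstacles:
        for dr in range(-margin, margin + 1):
            for dc in range(-margin, margin + 1):
                rr, cc = r + dr, c + dc
                if 0 <= rr < rows and 0 <= cc < cols:
                    inflated.add((rr, cc))
    return inflated

def astar(start, goal, obstacles, rows, cols):
    """4-connected A* over an occupancy grid; returns a list of cells or None."""
    def h(cell):  # Manhattan-distance heuristic
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    open_set = [(h(start), 0, start, None)]
    came_from, g_cost = {}, {start: 0}
    while open_set:
        _, g, current, parent = heapq.heappop(open_set)
        if current in came_from:       # already expanded with a better cost
            continue
        came_from[current] = parent
        if current == goal:            # reconstruct path back to the start
            path = []
            while current is not None:
                path.append(current)
                current = came_from[current]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (current[0] + dr, current[1] + dc)
            if not (0 <= nxt[0] < rows and 0 <= nxt[1] < cols):
                continue
            if nxt in obstacles:
                continue
            if g + 1 < g_cost.get(nxt, float("inf")):
                g_cost[nxt] = g + 1
                heapq.heappush(open_set, (g + 1 + h(nxt), g + 1, nxt, current))
    return None

if __name__ == "__main__":
    rows, cols = 20, 20
    # Hypothetical obstacle cells, e.g., rasterized from a forearm segmentation mask.
    forearm_mask = {(r, 10) for r in range(5, 15)}
    blocked = inflate_obstacles(forearm_mask, rows, cols, margin=1)
    path = astar((0, 0), (19, 19), blocked, rows, cols)
    print("Path length:", len(path) if path else "no path found")
```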

Adaptive Stamp Localization Algorithm

When a stamp is detected within the bounding box of a button representing one of the five designated printing regions, the button's bounding box turns red, serving as a visual cue that the region has already been stamped. Additionally, the system streamlines interaction by computing the Euclidean distance between the clicked button and the nearby enabled printing locations: if an already-printed location is chosen, the nearest enabled printing location is automatically selected as the goal location for the robot's trajectory planning, enhancing user efficiency.
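The sketch below illustrates this nearest-enabled-location fallback under stated assumptions; the button names, pixel coordinates, and helper function are hypothetical and only meant to show the Euclidean-distance selection described above, not the system's actual code.

```python
# Minimal sketch of the nearest-enabled-location fallback: if the clicked
# button corresponds to an already-printed region, the closest still-enabled
# printing location becomes the goal for the robot's trajectory planning.
import math

# Hypothetical button centres (pixels) for the five designated printing regions.
PRINT_LOCATIONS = {
    "top_left": (120, 90),
    "top_right": (520, 90),
    "center": (320, 200),
    "bottom_left": (120, 390),
    "bottom_right": (520, 390),
}

def select_goal_location(clicked, already_printed):
    """Return the clicked location if it is still enabled, otherwise the
    nearest enabled location by Euclidean distance between button centres."""
    enabled = {name: pos for name, pos in PRINT_LOCATIONS.items()
               if name not in already_printed}
    if not enabled:
        return None  # every region has already been stamped
    if clicked in enabled:
        return clicked
    cx, cy = PRINT_LOCATIONS[clicked]
    return min(enabled, key=lambda name: math.hypot(enabled[name][0] - cx,
                                                    enabled[name][1] - cy))

# Example: the user selects "center", which is already stamped, so the nearest
# enabled region ("top_left" in this layout) becomes the planning goal.
print(select_goal_location("center", already_printed={"center", "top_right"}))
```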