BlindPilot: A Robotic Local Navigation System that Leads Blind People to a Landmark Object
Seita Kayukawa, Tatsuya Ishihara, Hironobu Takagi, Shigeo Morishima, and Chieko Asakawa
We propose an assistive robot, BlindPilot, which directly guides blind users to landmark objects using an intuitive handle.
BlindPilot uses an RGB-D camera to detect the position of an empty chair and uses a LiDAR to build a 2D map of the surrounding area. BlindPilot then generates a path to the chair and moves along the generated path.
Our user study showed that BlindPilot enabled users to approach a chair faster with a greater feeling of security and less effort compared to a sound-based local navigation system.
30-sec Preview Video
Main Video [1:44]
Abstract
Blind people face various local navigation challenges in their daily lives, such as finding an empty seat in a crowded station, walking toward it, and stopping and sitting down at the correct spot. Although voice navigation is a commonly used solution, it requires users to carefully follow frequent navigational sounds over short distances. We therefore present an assistive robot, BlindPilot, which guides blind users to landmark objects using an intuitive handle. BlindPilot employs an RGB-D camera to detect the positions of target objects and uses LiDAR to build a 2D map of the surrounding area. On the basis of these sensing results, BlindPilot generates a path to the object and guides the user to it safely. To evaluate our system, we also implemented a sound-based navigation system as a baseline and asked six blind participants to approach an empty chair using each system. We observed that BlindPilot enabled users to approach the chair faster, with a greater feeling of security and less effort, than the baseline system.
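As a rough illustration of the sense-plan-guide pipeline described above, the sketch below outlines one guidance cycle in Python. It is not the authors' implementation: the sensor and robot interfaces (rgbd_camera.detect_empty_chair, lidar.build_2d_map, robot.move_to, plan_path) and the 0.5 m standoff distance are assumptions made purely for illustration.

```python
# Hypothetical sketch of a BlindPilot-style guidance cycle.
# All sensor/robot interfaces below are assumed, not the authors' code.

import math
from dataclasses import dataclass


@dataclass
class Pose2D:
    x: float      # meters, in the map frame
    y: float      # meters, in the map frame
    theta: float  # heading, radians


def approach_pose(chair: Pose2D, standoff_m: float = 0.5) -> Pose2D:
    """Goal pose a short distance in front of the chair, facing it,
    so the user stops at the correct spot to sit down."""
    return Pose2D(
        x=chair.x + standoff_m * math.cos(chair.theta),
        y=chair.y + standoff_m * math.sin(chair.theta),
        theta=chair.theta + math.pi,  # turn back to face the chair
    )


def guide_to_chair(rgbd_camera, lidar, robot, plan_path) -> None:
    """One cycle: detect the chair, map the area, plan a path, and lead."""
    # 1. Detect an empty chair with the RGB-D camera and obtain its 2D
    #    pose in the map frame (detector API is an assumption).
    chair = rgbd_camera.detect_empty_chair()

    # 2. Build a 2D occupancy map of the surroundings from LiDAR scans.
    grid = lidar.build_2d_map()

    # 3. Plan a collision-free path from the robot's current pose to a
    #    goal just in front of the chair.
    goal = approach_pose(chair)
    path = plan_path(grid, start=robot.current_pose(), goal=goal)

    # 4. Lead the user, who holds the robot's handle, along the path,
    #    then stop at the goal so they can reach the chair.
    for waypoint in path:
        robot.move_to(waypoint)
    robot.stop()
```

In a real system these steps would typically run continuously so that the map and path can be updated as pedestrians and obstacles move around the robot and the user.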
Publication
Seita Kayukawa, Tatsuya Ishihara, Hironobu Takagi, Shigeo Morishima, and Chieko Asakawa. 2020. BlindPilot: A Robotic Local Navigation System that Leads Blind People to a Landmark Object. In Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems (CHI EA 2020).
Authors
Seita Kayukawa (Waseda University)
Tatsuya Ishihara (IBM Research)
Hironobu Takagi (IBM Research)
Shigeo Morishima (Waseda Research Institute for Science and Engineering)
Chieko Asakawa (IBM Research)
Acknowledgements
This work was supported by JST ACCEL (JPMJAC1602) and JST-Mirai Program (JPMJMI19B2).