Guiding Blind Pedestrians in Public Spaces by Understanding Walking Behavior of Nearby Pedestrians
Seita Kayukawa, Tatsuya Ishihara, Hironobu Takagi, Shigeo Morishima, and Chieko Asakawa
We present a guiding system to help blind people walk in public spaces while making their walking seamless with nearby pedestrians.
The system has two modes:
  • the "on-path" mode, which helps users avoid collisions without leaving their path by adapting their walking speed
  • the "off-path" mode, which navigates users along an alternative path around pedestrians standing in the way
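The split between the two modes can be pictured with a small dispatch sketch. This is only an illustration under assumptions: `choose_mode`, the standing-speed threshold, and the idea of classifying "blocking" pedestrians by speed are ours, not the paper's actual planner.

```python
# Illustrative mode selection (names and threshold are assumptions,
# not the system's actual implementation).
STANDING_SPEED = 0.2  # m/s: below this, treat a pedestrian as standing


def choose_mode(blocking_pedestrian_speeds: list[float]) -> str:
    """Pick 'on-path' (keep the path, adapt walking speed) unless a
    pedestrian is standing on the planned path, in which case detour
    'off-path' around them."""
    if any(s < STANDING_SPEED for s in blocking_pedestrian_speeds):
        return "off-path"  # navigate an alternative path around them
    return "on-path"       # keep the path, adjust walking speed
```

With no pedestrians on the path, or only moving ones, the sketch stays on-path; a single standing pedestrian triggers the detour.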
The system senses the behavior of surrounding pedestrians and predicts risks of collisions.
We implemented tactile and audio interfaces to evaluate the effectiveness of each modality for collision avoidance.
The tactile interface guides users with a newly developed directional lever that indicates the direction to walk.
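A directional lever like the one described above boils down to mapping a heading correction onto a bounded mechanical deflection. The sketch below is a hypothetical illustration: `lever_angle`, the ±60° range, and the use of degrees are our assumptions, not the device's specification.

```python
def lever_angle(user_heading: float, target_heading: float,
                max_deflection: float = 60.0) -> float:
    """Map the heading correction (degrees) to a lever deflection,
    clamped to an assumed mechanical range of +/- max_deflection.

    Positive values mean 'turn right', negative 'turn left'."""
    # Wrap the difference into [-180, 180) so 350° -> 10° reads as +20°,
    # not -340°.
    diff = (target_heading - user_heading + 180.0) % 360.0 - 180.0
    return max(-max_deflection, min(max_deflection, diff))
```

The wraparound step matters: without it, crossing north (0°/360°) would tell the user to spin almost a full circle instead of nudging a few degrees.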
Main Video
Abstract
We present a guiding system to help blind people walk in public spaces while making their walking seamless with nearby pedestrians. Blind users carry a rolling suitcase-shaped system that has two RGBD Cameras, an inertial measurement unit (IMU) sensor, and light detection and ranging (LiDAR) sensor. The system senses the behavior of surrounding pedestrians, predicts risks of collisions, and alerts users to help them avoid collisions. It has two modes: the "on-path" mode that helps users avoid collisions without changing their path by adapting their walking speed; and the "off-path" mode that navigates an alternative path to go around pedestrians standing in the way. Auditory and tactile modalities have been commonly used for non-visual navigation systems, so we implemented two interfaces to evaluate the effectiveness of each modality for collision avoidance. A user study with 14 blind participants in public spaces revealed that participants could successfully avoid collisions with both modalities. We detail the characteristics of each modality.
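One simple way to picture collision-risk prediction from sensed pedestrian behavior is a closest-point-of-approach check under linear extrapolation of each pedestrian's velocity. This is a minimal sketch, not the system's actual model: the `Agent` class, the 5-second horizon, and the 0.8 m radius are illustrative assumptions.

```python
from dataclasses import dataclass


@dataclass
class Agent:
    x: float   # position (m)
    y: float
    vx: float  # velocity (m/s)
    vy: float


def time_to_closest_approach(user: Agent, ped: Agent) -> tuple[float, float]:
    """Return (time, distance) of closest approach, assuming both agents
    keep their current velocity (linear extrapolation)."""
    rx, ry = ped.x - user.x, ped.y - user.y      # relative position
    vx, vy = ped.vx - user.vx, ped.vy - user.vy  # relative velocity
    v2 = vx * vx + vy * vy
    if v2 < 1e-9:
        t = 0.0  # same velocity: the gap never changes
    else:
        # Minimize |r + v*t|; clamp to the future (t >= 0).
        t = max(0.0, -(rx * vx + ry * vy) / v2)
    dx, dy = rx + vx * t, ry + vy * t
    return t, (dx * dx + dy * dy) ** 0.5


def collision_risk(user: Agent, ped: Agent,
                   horizon: float = 5.0, radius: float = 0.8) -> bool:
    """Flag a risk if the predicted closest approach falls within
    `horizon` seconds and closer than `radius` metres (both assumed)."""
    t, d = time_to_closest_approach(user, ped)
    return t <= horizon and d < radius
```

For example, a user walking at 1 m/s toward a pedestrian 5 m ahead who approaches at 1 m/s reaches closest approach after 2.5 s at distance 0, so the sketch flags a risk; a pedestrian walking parallel 5 m to the side does not.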
Publications
Seita Kayukawa, Tatsuya Ishihara, Hironobu Takagi, Shigeo Morishima, and Chieko Asakawa. 2020. Guiding Blind Pedestrians in Public Spaces by Understanding Walking Behavior of Nearby Pedestrians. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (IMWUT), 4, 3, Article 85 (September 2020), 22 pages.
IMWUT (UbiComp 2021) DOI Paper BibTeX

Seita Kayukawa, Tatsuya Ishihara, Hironobu Takagi, Shigeo Morishima, and Chieko Asakawa. A Walking Support System for Blind People in Public Spaces Based on Pedestrian Motion Analysis and Collision Prediction. Interaction 2020. (Refereed, oral presentation; in Japanese)
Interaction 2020 Talk Slide Paper
Authors
Seita Kayukawa (Waseda University)
Tatsuya Ishihara (IBM Research)
Hironobu Takagi (IBM Research)
Shigeo Morishima (Waseda Research Institute for Science and Engineering)
Chieko Asakawa (IBM Research)
Related Project
BBeep: A Sonic Collision Avoidance System for Blind Travellers and Nearby Pedestrians
Seita Kayukawa, Keita Higuchi, João Guerreiro, Shigeo Morishima, Yoichi Sato, Kris Kitani, and Chieko Asakawa
CHI 2019
Acknowledgements
We would like to thank all participants who took part in our user study, as well as the anonymous reviewers for their helpful comments. This work was supported by JSPS KAKENHI Grant Number JP20J23018, a Grant-in-Aid for Young Scientists (Early Bird, Waseda Research Institute for Science and Engineering), and JST-Mirai Program Grant Number JPMJMI19B2.