A bio-inspired model for robust navigation assistive devices
Abstract
This paper presents a new implementation, evaluated in a real-world environment, of a bio-inspired predictive navigation model for mobility control, suited in particular to assisting visually impaired people and autonomous mobile systems. The model relies on interactions between formal models of three types of neurons identified in the mammalian brain and involved in navigation tasks, namely place cells, grid cells, and head direction cells, to build a topological model of the environment in the form of a decentralized navigation graph. Previously tested in virtual environments, the model showed a high tolerance to motion drift, making it possible to map large environments without drift correction, as well as robustness to environment changes. The implementation presented here is based on a stereoscopic camera and is evaluated on its ability to map an unknown real environment and to guide a person or an autonomous mobile robot through it. The evaluation results confirm the effectiveness of the proposed bio-inspired navigation model for building a path map and for localizing and guiding a person along this path. The model's predictions remain robust to environment changes and estimate traveled distances with an error rate below 3% over test paths of up to 100 m. Tests performed on a robotic platform also demonstrated the relevance of the navigation data produced by this model for guiding an autonomous system. These results open the way toward efficient wearable assistive devices for the independent navigation of visually impaired people.
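To make the idea of a decentralized navigation graph concrete, the sketch below is a minimal toy illustration, not the paper's implementation: place cells are modeled as graph nodes linked by edges carrying a traveled distance and a head direction, from which a path length can be accumulated. All class and method names (`PlaceCell`, `NavigationGraph`, `path_length`) are hypothetical and chosen for illustration only.

```python
import math

class PlaceCell:
    """Hypothetical artificial place cell: a node anchored at a pose estimate."""
    def __init__(self, cell_id, x, y):
        self.cell_id = cell_id
        self.x = x
        self.y = y
        self.edges = {}  # neighbor id -> (distance, heading in radians)

class NavigationGraph:
    """Toy topological map: place cells linked by distance + head-direction edges."""
    def __init__(self):
        self.cells = {}

    def add_cell(self, cell_id, x, y):
        self.cells[cell_id] = PlaceCell(cell_id, x, y)

    def link(self, a, b):
        # Store the relative displacement between two place cells as an edge,
        # in both directions (reverse heading is offset by pi).
        ca, cb = self.cells[a], self.cells[b]
        dx, dy = cb.x - ca.x, cb.y - ca.y
        dist = math.hypot(dx, dy)
        heading = math.atan2(dy, dx)
        ca.edges[b] = (dist, heading)
        cb.edges[a] = (dist, (heading + math.pi) % (2 * math.pi))

    def path_length(self, path):
        """Traveled distance along a sequence of place-cell ids."""
        return sum(self.cells[a].edges[b][0] for a, b in zip(path, path[1:]))

g = NavigationGraph()
g.add_cell("A", 0.0, 0.0)
g.add_cell("B", 3.0, 4.0)   # 5 m from A
g.add_cell("C", 3.0, 10.0)  # 6 m from B
g.link("A", "B")
g.link("B", "C")
print(g.path_length(["A", "B", "C"]))  # 11.0
```

Because each edge stores only relative metric information between neighboring place cells, such a graph degrades gracefully under odometry drift: errors stay local to individual edges rather than accumulating in a single global frame, which is the property the abstract attributes to the full model.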
Keywords
Bio-inspired navigation; Artificial place cells; Artificial grid cells; Artificial head direction cells; Visual localization; Navigation assistive devices for visually impaired mobility
Domains
Computer Science [cs]