Groupe Audio Acoustique

Wiki of the audio and acoustics research group at LIMSI

NAVIG

Project presentation

The ANR-NAVIG project aims to increase the autonomy of visually impaired people in a primary and particularly problematic activity: navigation. It aims to provide blind people with an augmented audio rendering that assists them in their daily life without preventing them from perceiving their normal audio environment.

Through a participatory design method, we expect to enable visually impaired users to travel to a desired destination reliably and safely, without interfering with their normal travel behaviour. Furthermore, the device will make it possible to locate and grasp objects without having to pre-equip them with electronic components. The AA team's objective in the NAVIG project is to develop a binaural synthesis engine that augments reality with auditory information, helping users locate visual targets and reach a destination while avoiding obstacles.

Prototype of the NAVIG system.

3D auditory perception

Auditory 3D information is rendered using a binaural synthesis engine developed at LIMSI, based on real-time convolution with head-related impulse responses (HRIRs). To improve localization performance with non-individualized HRTFs and to reduce the up/down and front/rear confusions inherent to binaural technology, we developed a game that, exploiting the plasticity of the auditory system, allows users to adapt to HRTFs that are not their own.
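The core rendering step described above can be sketched as follows. This is an illustrative minimal sketch in Python with NumPy, not the actual LIMSI engine: a mono source signal is convolved with the left- and right-ear HRIRs of the target direction. The dummy HRIRs below (a pure interaural delay and level difference) stand in for measured responses.

```python
import numpy as np

def binaural_render(mono, hrir_left, hrir_right):
    """Render a mono signal to binaural stereo by convolving it with
    the left- and right-ear HRIRs of the desired source direction.
    (Illustrative sketch; the NAVIG engine performs this in real time.)"""
    left = np.convolve(mono, hrir_left)
    right = np.convolve(mono, hrir_right)
    return np.stack([left, right], axis=0)

# Toy example: a short noise burst and dummy HRIRs. A pure interaural
# delay and level difference stand in for measured responses.
rng = np.random.default_rng(0)
mono = rng.standard_normal(256)
hrir_left = np.zeros(32)
hrir_left[0] = 1.0       # near ear: no delay, full level
hrir_right = np.zeros(32)
hrir_right[8] = 0.6      # far ear: delayed and attenuated
stereo = binaural_render(mono, hrir_left, hrir_right)
print(stereo.shape)  # (2, 287)
```

With measured HRIR sets, the same convolution (typically block-wise, in the frequency domain) produces the spatialised signal delivered to each ear.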

Auditory guidance

The objective of auditory guidance is to convey visual information in the form of audio information. We envisage various types of information depending on the type of guidance.

  • Near field guidance: Near field guidance is intended to convey the position, size, and shape of an object. To better guide the grasping task, the guidance sound must also convey information about obstacles along the path between the hand and the object. We previously examined the precision of hand-reaching movements towards nearby real acoustic sources through a localization accuracy task. Results showed that localization accuracy varies with the source stimulus, azimuth, and distance. Taking these results into account, we are now preparing a grasping task experiment with virtual sounds and different stimuli to optimize localization accuracy.
  • Far field guidance: Guidance during navigation should let the user receive indications about the upcoming trajectory, landmarks near the route, and all the information needed to build a good mental representation of the space. Tests are underway on spatialised text-to-speech and sonification metaphors.
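As one illustration of a distance sonification metaphor (a hypothetical mapping for explanation only, not the one under test in NAVIG), closer targets could be signalled by faster beeps:

```python
import math

def distance_to_pulse_rate(d, d_max=10.0, rate_min=1.0, rate_max=10.0):
    """Map source distance (metres) to a beep repetition rate (beeps/s).
    Hypothetical metaphor: nearer targets beep faster, on a log scale."""
    d = min(max(d, 0.1), d_max)  # clamp to the usable range
    # interpolate log-linearly between rate_max (near) and rate_min (far)
    t = math.log(d / 0.1) / math.log(d_max / 0.1)
    return rate_max - t * (rate_max - rate_min)

print(round(distance_to_pulse_rate(0.1), 2))   # 10.0 beeps/s at closest
print(round(distance_to_pulse_rate(10.0), 2))  # 1.0 beep/s at d_max
```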

Ergonomics and Sound Design

  • Restitution sound choice: To avoid being perceived as unpleasant, the sound design will be based on style sheets that let the user select among different types of guidance sounds. We seek to avoid the approach of existing systems based on text-to-speech and sound tags, which too often causes cognitive overload. First meetings with users show that while some prefer to be guided with electronic sounds (easily distinguishable from environmental sounds), others prefer natural sounds (considered less unpleasant). This tends to demonstrate the usefulness of style sheets.
  • Headphone choice: To keep the system's sounds from masking real environmental sounds, we are studying binaural spatialisation quality with bone-conduction and air-tube headphones.
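The style-sheet idea above can be pictured as a simple mapping from guidance events to per-style sound files. Everything here (event names, style names, file paths) is hypothetical and for illustration only:

```python
# Hypothetical sketch of a sound "style sheet": the user picks a style,
# and each guidance event resolves to a sound from that family.
# All names and paths are illustrative, not part of the actual system.
STYLE_SHEETS = {
    "electronic": {
        "target_beacon": "sounds/electronic/beacon.wav",
        "turn_left": "sounds/electronic/left.wav",
        "obstacle": "sounds/electronic/alert.wav",
    },
    "natural": {
        "target_beacon": "sounds/natural/birdsong.wav",
        "turn_left": "sounds/natural/water_left.wav",
        "obstacle": "sounds/natural/branch_snap.wav",
    },
}

def sound_for(event, style="natural"):
    """Resolve a guidance event to the sound chosen by the user's style."""
    return STYLE_SHEETS[style][event]

print(sound_for("obstacle", style="electronic"))  # sounds/electronic/alert.wav
```

Swapping styles changes every guidance sound at once, which is the point of the style-sheet approach: the guidance logic stays fixed while the sound palette follows user preference.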

Publications

Journal publications

  1. Parseihian, G. and Katz, B.F.G. "Morphocons: A new sonification concept based on morphological earcons" J. Audio Eng. Soc., Volume 60, Issue 6, 2012.
  2. Parseihian, G. and Katz, B.F.G. "Rapid Head-Related Transfer Function adaptation using a virtual auditory environment" J. Acoust. Soc. Am., Volume 131, Issue 4, 2012.
  3. Katz, B.F.G. and Parseihian, G. "Perceptually based head-related transfer function database optimization" J. Acoust. Soc. Am., Volume 131, Issue 2 pp. EL99-EL105, 2012.
  4. Katz, B.F.G. and Kammoun, S. and Parseihian, G. and Gutierrez, O. and Brilhault, A. and Auvray, M. and Truillet, P. and Denis, M. and Thorpe, S. and Jouffrais, C. "NAVIG: augmented reality guidance system for the visually impaired. Combining object localization, GNSS, and spatial audio" Virtual Reality, Volume 16, Issue 2, 2012.
  5. Katz, B.F.G. and Dramas, F. and Parseihian, G. and Gutierrez, O. and Kammoun, S. and Brilhault, A. and Brunet, L. and Gallay, M. and Oriola, B. and Auvray, M. and Truillet, P. and Denis, M. and Thorpe, S. and Jouffrais, C. "NAVIG: Guidance system for the visually impaired using virtual augmented reality" J. Technology and Disability, Volume 24, Issue 2, 2012.
  6. Kammoun, S. and Parseihian, G. and Gutierrez, O. and Brilhault, A. and Serpa, A. and Raynal, M. and Oriola, B. and Macé, M. and Auvray, M. and Denis, M. and Thorpe, S. and Truillet, P. and Katz, B.F.G. and Jouffrais, C. "Navigation and space perception assistance for the visually impaired: The NAVIG project" IRBM, Volume 33, Issue 2, 2012.

International conferences with proceedings

  1. Parseihian, G. and Conan, S. and Katz, B.F.G. "Sound effect metaphors for near field distance sonification", Proceedings of the 18th international conference on Auditory display (ICAD 2012), Atlanta, USA, June 18-22, 2012.
  2. Caramiaux, B. and Fdili Alaoui, S. and Bouchara, T. and Parseihian, G. and Rébillat, M. "Gestural auditory and visual interactive platform", Proceedings of the 14th International Conference on Digital Audio Effects (DAFx-11), Paris, France, September 19-23, 2011.
  3. Parseihian, G. and Brilhault, A. and Dramas, F. "NAVIG: An object localization system for the blind". Workshop Pervasive 2010: Multimodal Location Based Techniques for Extreme Navigation, Helsinki, May 17, 2010.
  4. Katz, B.F.G. and Truillet, P. and Thorpe, S. and Jouffrais, C. "NAVIG: Navigation Assisted by Artificial Vision and GNSS". Workshop Pervasive 2010: Multimodal Location Based Techniques for Extreme Navigation, Helsinki, May 17, 2010.
  5. Dramas, F. and Oriola, B. and Katz, B.F.G. and Thorpe, S. and Jouffrais, C. "Designing an assistive device for the blind based on object localization and augmented auditory reality". ACM Conference on Computers and Accessibility (ASSETS 2008), Halifax, Canada, October 13-15, 2008.

International conferences without proceedings

  1. Parseihian, G. and Katz, B.F.G. "Rapid auditory system adaptation using a virtual auditory environment". International Multisensory Research Forum, Fukuoka, Japan, October 17-20, 2011.
  2. Parseihian, G. and Katz, B.F.G. "Recalibrating the auditory system through audio-kinesthesic training". European Workshop on Imagery & Cognition, Helsinki, Finland, June 16-19, 2010.
  3. Gallay, M. and Denis, M. and Parseihian, G. and Auvray, M. "Egocentric and allocentric reference frames in a virtual auditory environment: differences in navigation skills between blind and sighted individuals". European Workshop on Imagery & Cognition, Helsinki, Finland, June 16-19, 2010.
  4. Dramas, F. and Katz, B.F.G. and Jouffrais, C. "Auditory-guided reaching movements in the peripersonal frontal space". Acoustics, Paris, Vol. 123, Acoustical Society of America, p. 3723, 2008.

en/projets/navig/start.txt · Last modified: 2012/08/06 15:13 (external edit)