This project investigates to what extent an audio-haptic navigation system can improve first-person navigation in virtual environments. The system uses sound to convey direction and game-controller vibration to convey distance, guiding the user toward specific destinations in the virtual environment.
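The distance-to-vibration mapping described above could, for example, be sketched as a simple falloff function. The linear falloff, the `max_distance` cutoff, and the function name below are illustrative assumptions, not the project's actual implementation:

```python
def vibration_intensity(distance, max_distance=30.0):
    """Map distance to a target to a rumble strength in [0, 1].

    Closer targets produce stronger vibration; beyond max_distance
    the controller is silent. A linear falloff is assumed here for
    illustration only.
    """
    if distance >= max_distance:
        return 0.0
    return 1.0 - (distance / max_distance)
```

For instance, at the target `vibration_intensity(0.0)` returns full strength (1.0), while halfway to the cutoff it returns half strength.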
The audio-haptic navigation system and the virtual environment were designed around human wayfinding theories and tested against an intrusive map system: to consult the map, participants had to stop moving and pull it out to get a top-down overview of the environment.
Testing showed that the audio-haptic navigation system reduced the total time needed to complete several collection tasks in the virtual environment, and also reduced use of the intrusive visual map.