Research Article Open Access

Hybrid Auditory Based Interaction Framework for Driver Assistance System

Dalbir Singh

Abstract

Problem statement: The rapid development of the Driver Assistance System (DAS) provides drivers with radically enhanced information and functionality. Current DAS require complex human-machine interaction that is distracting and may increase the risk of road accidents. The interaction between the driver and the DAS should aid the driving process without compromising safety or ease of vehicle operation. The speech-based interaction mechanisms employed are not sufficiently robust to deal with the distraction and noise present in the vehicle's interior environment. Approach: Suitable hybrid earcon/auditory icon design principles for DAS were therefore developed. These interfaces were investigated in driving simulators in order to test their durability and robustness, and several evaluation parameters were applied. This ensured that the driving-related information from the DAS was delivered to the driver without affecting the overall driving process. Results: This study produced auditory design principles for information mapping (visual into non-speech interaction) together with a presentation framework. It outlines a representation architecture that enables concurrent auditory driving-related information to be transmitted from four different sources in the vehicle's interior environment, and a set of hybrid design principles that integrate auditory icons with earcons to map and present real-time driving-related data from a visual to a non-speech auditory interface. Conclusion/Recommendations: The major contribution of this research is its holistic approach, considering the entire DAS (safety, navigation and entertainment subsystems). It proposes a hybrid representation strategy based on the driver's cognitive availability and cognitive workload, a significant finding that can aid future DAS auditory interaction design.
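The hybrid representation strategy described above, which chooses between an auditory icon and a hierarchical earcon according to the driver's current cognitive workload, can be illustrated with a minimal sketch. All names, sound files, the workload scale and the threshold below are hypothetical illustrations introduced here, not the paper's actual implementation:

```python
# Hypothetical sketch of a hybrid auditory cue selector for a DAS.
# Event names, the 0.0-1.0 workload scale, and the 0.7 threshold are
# illustrative assumptions, not values from the study.

from dataclasses import dataclass


@dataclass
class DrivingEvent:
    source: str    # one of the four in-vehicle sources, e.g. "safety"
    name: str      # e.g. "collision_warning", "low_fuel"
    urgency: float  # 0.0 (informational) .. 1.0 (critical)


# Auditory icons: natural, easily recognised sounds for familiar events.
AUDITORY_ICONS = {
    "collision_warning": "skid.wav",
    "low_fuel": "fuel_splash.wav",
}

# Earcons: abstract structured tones organised hierarchically per subsystem.
EARCON_MOTIFS = {
    "safety": "motif_A",
    "navigation": "motif_B",
    "entertainment": "motif_C",
}


def select_cue(event: DrivingEvent, workload: float) -> str:
    """Pick an auditory representation given driver cognitive workload (0..1)."""
    # Under high workload, prefer a familiar auditory icon when one exists:
    # it is recognised directly and requires no learned interpretation.
    if workload > 0.7 and event.name in AUDITORY_ICONS:
        return AUDITORY_ICONS[event.name]
    # Otherwise fall back to the subsystem's hierarchical earcon motif.
    return EARCON_MOTIFS.get(event.source, "motif_default")


cue = select_cue(DrivingEvent("safety", "collision_warning", 0.9), workload=0.8)
```

The key design choice mirrored here is that the cue format is not fixed per event: the same event may be rendered differently depending on how much attention the driver can spare.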

Journal of Computer Science
Volume 6 No. 12, 2010, 1499-1504

DOI: https://doi.org/10.3844/jcssp.2010.1499.1504

Submitted On: 17 October 2010
Published On: 22 November 2010

How to Cite: Singh, D. (2010). Hybrid Auditory Based Interaction Framework for Driver Assistance System. Journal of Computer Science, 6(12), 1499-1504. https://doi.org/10.3844/jcssp.2010.1499.1504


Keywords

  • Earcon
  • Hybrid
  • Auditory icon
  • Driver Assistance System (DAS)
  • Synthetic speech
  • Hierarchical architecture
  • Earcon recognition
  • Synchronous
  • Psychoacoustics basis