Audiokinetic, provider of spatial audio and sonification solutions, has become a member of Basemark’s Rocksolid Ecosystem. The partnership positions both companies at the forefront of developing audiovisual immersion solutions. In this article, we go deeper into the concept of audiovisual immersion, shedding light on its purpose and future applications.
What is audiovisual immersion?
According to statistics from the National Highway Traffic Safety Administration (NHTSA), around 8% of all fatal car crashes in the United States in 2018 were distraction-affected. Whether it is someone scrolling on their phone, fumbling with the infotainment screen to find the navigation settings, or having an intense conversation with the kids in the backseat, we all know how easy it is to lose concentration while driving. To address this issue, car manufacturers are looking for solutions that help drivers keep their focus and minimize distractions. The goal is not just to detect potential hazards on the road but also to communicate warnings to the driver in the fastest and most intuitive way.
Audiovisual immersion is seen as an ideal solution, as it combines advanced spatial audio with spatial computing technologies such as Augmented Reality (AR) on windshields and AR head-up displays (AR HUD). This helps direct the driver’s focus to the most critical information, making driving safer and more intuitive.
Before we go deeper into the concept of audiovisual immersion, let’s first clarify what automotive AR and sonification are.
Augmented Reality for automotive
Automotive AR applications are an increasingly common feature in cars today. With automotive AR, real-time vehicle data is transformed into visual representations that are overlaid on a vehicle display, such as the windshield, instrument cluster, or infotainment screen. AR visualizations can increase situational awareness by augmenting the surroundings of the vehicle. Basemark is a leading provider of software solutions for automotive AR application development.
Example of AR visualizations on a wide field-of-view AR HUD.
Sonification

Sonification refers to the process of transforming data into an acoustic signal to convey information. In the automotive context, this means using data from embedded systems, such as ADAS, to generate sound. Sonification is already common in vehicles today: for example, if you unintentionally drift out of your lane while driving, the ADAS activates and plays an alerting sound cue, grabbing your attention to indicate that the vehicle has crossed the lane marking. Audiokinetic is a pioneering provider of automotive audio solutions for designing and interactively rendering such sound alerts, including spatial audio information.
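To make the idea of sonification concrete, here is a minimal sketch of how lane-departure data could be mapped to a sound cue. All names, thresholds, and the event-name convention are hypothetical and for illustration only; a production system would hand such events to an audio engine rather than print them.

```python
# Illustrative sonification sketch (hypothetical, not a real ADAS or audio-engine API):
# map a lane-departure signal to an alert sound event whose volume
# scales with how far the vehicle has drifted.

def sonify_lane_departure(lateral_offset_m: float, threshold_m: float = 0.3):
    """Return (event_name, volume 0..1) for a given drift, or None if in lane.

    lateral_offset_m: signed drift from the lane center; negative = left.
    """
    if abs(lateral_offset_m) < threshold_m:
        return None  # vehicle is within its lane, no cue needed
    # Volume grows with the drift, reaching full scale at 1 m.
    volume = min(1.0, abs(lateral_offset_m) / 1.0)
    side = "left" if lateral_offset_m < 0 else "right"
    return (f"lane_departure_{side}", round(volume, 2))

print(sonify_lane_departure(0.1))   # in lane: no cue
print(sonify_lane_departure(-0.5))  # drifting left: cue at half volume
```

The key design point is that the sound carries information: both the fact of the drift (an event fires at all) and its severity (volume) come directly from the data.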
Although the concept of audiovisual immersion is somewhat new to the automotive sector, the underlying technologies already exist and are employed independently in vehicles today. By combining them, you get a robust support system that engages multiple human senses and helps drivers keep their concentration.
Sound spatialization adds hazard localization information for the driver.
Why audiovisual immersion?
Sonification serves as an excellent complement to AR visualizations. The sound cues generated by sonification catch the driver’s attention and direct their eyes to the windshield. The strength of audio is that the human auditory system is always on and does not depend on a precondition such as light, which the visual sense requires. As the driver hears the sound cue and directs their gaze to the windshield, the AR visualizations overlaid on the HUD immediately inform them of the situation, allowing them to make an informed maneuver.
The main benefits of audiovisual immersion relate to safety and to enhancing the driving experience:
- Improving driver safety: Through audiovisual immersion, the driver is in a constant feedback loop, receiving information from the vehicle in the form of intuitive audio cues and visual AR overlays. This focuses the driver’s attention on the most important information and helps them react faster in different driving situations.
- Easing complex driving situations: Audiovisual immersion can significantly help drivers in complex driving scenarios. For instance, while navigating city streets, it can provide clear guidance for lane changes and upcoming turns. Over time, such technology has the potential to serve as an educational tool, aiding drivers in mastering complex situations and encouraging safer driving habits, ultimately reducing the likelihood of risky situations.
- Build trust between drivers and autonomous vehicles: As vehicles reach higher levels of autonomy, it’s essential to inform drivers about the vehicle’s decisions and actions, fostering trust. A clear alert system for when driver-assist systems take control is crucial to reduce confusion. Using audiovisual immersion, including AR content and spatial audio, helps maintain driver awareness, enhancing trust in the vehicle.
- Personalized and branded driving experiences: Finally, audiovisual immersion allows you to create distinctive experiences that align seamlessly with a brand’s unique attributes. For instance, envision a brand with its own signature sound profile and interior ambiance. With audiovisual immersion, you can extend these brand-specific elements into the driving experience, creating a cohesive and memorable whole.
Although audiovisual immersion holds immense potential, whether for enhancing driver safety, elevating the driving experience, or entertainment, it’s evident that the foremost priority will be to enhance driver safety and improve driver intuition.
Supporting ADAS functions
In the context of ADAS enhancements, audiovisual immersion is a powerful aid that supports the driver in critical driving situations.
- Blind spot monitoring: When a driver activates the turn signal to change lanes or merge, audiovisual immersion can make blind spot monitoring more intuitive. For instance, if another vehicle is driving in the right-hand blind spot, a sound cue is emitted from the right side of the cabin, alerting the driver to its presence. Upon hearing the cue, the driver glances at the windshield, where an AR overlay confirms the vehicle in the blind spot and indicates when it is safe to proceed with the maneuver.
- Collision warning: When a vehicle is close to colliding with another vehicle or object, audiovisual immersion can play a crucial role in alerting the driver. Once the system detects an imminent collision, it can trigger a combination of visual and auditory alerts: an AR overlay, such as a flashing red indicator, appears on the display and directs the driver’s attention to the emerging danger, while an urgent alarm sounds simultaneously. This synchronized audiovisual feedback ensures that the driver is promptly informed of the hazard, allowing quick and critical collision-avoidance actions.
- Protecting pedestrians & cyclists: For urban driving, audiovisual immersion can offer visual and auditory alerts for pedestrians and cyclists near the vehicle, even when they are not within the driver’s direct line of sight, making the driver more aware and alert.
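The directional cues described in the blind-spot example rely on sound spatialization. As a rough sketch of the idea, the following maps a hazard’s bearing relative to the driver to constant-power stereo gains, so the warning is heard from the hazard’s side. This is a simplification of real spatial audio rendering (which uses full 3D positioning), and the helper name is hypothetical.

```python
import math

# Illustrative spatial-audio sketch (hypothetical helper, not Audiokinetic's API):
# convert a hazard bearing into constant-power stereo gains so that a
# blind-spot warning appears to come from the hazard's direction.

def pan_gains(bearing_deg: float):
    """bearing_deg: 0 = straight ahead, +90 = fully right, -90 = fully left.

    Returns (left_gain, right_gain) using a constant-power pan law,
    so perceived loudness stays even across the stereo field.
    """
    bearing = max(-90.0, min(90.0, bearing_deg))  # clamp to the frontal arc
    # Map -90..+90 degrees onto a pan angle of 0..90 degrees.
    angle = math.radians((bearing + 90.0) / 2.0)
    return (math.cos(angle), math.sin(angle))

left, right = pan_gains(90.0)  # hazard in the right blind spot: right channel dominates
```

With constant-power panning, the squared gains always sum to 1, which is why the cue’s overall loudness does not dip as the hazard moves across the sound stage.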
Example of a pedestrian highlight on a wide-angle AR HUD.
Intuitive navigation experiences
Audiovisual immersion also makes navigation more intuitive. Sound localization cues convey the direction of auditory information, letting the driver understand where the next action will occur and directing their attention to the appropriate position on the display. Furthermore, different sound cues can be used for different navigation maneuvers or situations. For instance, when approaching a highway exit, distinct sound variations can indicate the distance to the exit and the right point for executing the maneuver, while AR content simultaneously visualizes the directions.
Points of interest and location-based entertainment
In dense city environments, it can be difficult to manage driving while searching for a specific shop or parking space. Playing back an earcon (the auditory equivalent of a visual icon) combined with an AR visualization, with spatial sound indicating the location of the desired point of interest, can make the task more intuitive and reduce the cognitive load of these complex driving situations. Sound spatialization can equally be applied to informative speech content triggered interactively by location, for example to create audio city-guide applications for tourism (similar to audio guides in museums).
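A location-triggered earcon boils down to a proximity check against registered points of interest. The sketch below uses a flat Euclidean distance on local x/y coordinates for simplicity (real systems would use geographic coordinates and map data); all POI names, positions, and radii are hypothetical.

```python
import math

# Illustrative sketch of location-triggered earcons (auditory icons):
# an earcon is due for playback when the vehicle enters a POI's trigger radius.
# Positions are local x/y coordinates in meters for simplicity.

POIS = [
    {"name": "parking_garage", "pos": (120.0, 40.0),  "radius_m": 50.0},
    {"name": "coffee_shop",    "pos": (300.0, -80.0), "radius_m": 30.0},
]

def earcons_in_range(vehicle_pos):
    """Return the names of earcons whose trigger radius contains the vehicle."""
    vx, vy = vehicle_pos
    hits = []
    for poi in POIS:
        px, py = poi["pos"]
        if math.hypot(px - vx, py - vy) <= poi["radius_m"]:
            hits.append(poi["name"])
    return hits

print(earcons_in_range((100.0, 40.0)))  # within the parking garage's radius
```

In a real system the triggered earcon would then be spatialized toward the POI’s actual direction, tying this back to the sound-localization idea above.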
Enhancing driving experiences with audiovisual immersion
Audiovisual immersion, supported by AR and sonification technologies, offers a promising solution to improve driver safety, reduce distractions, and enhance driving experiences. This innovative approach engages multiple senses, transforming how drivers interact with their vehicles and their surroundings.
Pre-integrated audiovisual solutions speed up the delivery of great HMI user experiences and reduce the risk of developing them in-house. By integrating Audiokinetic’s technology into Basemark’s automotive AR solutions, customers get access to turnkey audiovisual solutions for developing highly intuitive and effective next-generation multi-modal automotive HMI interfaces.
This blog post has been co-created together with Audiokinetic.