SenseSafe - In-Cabin Human Machine Interface - Use Case
A leading Chinese automobile manufacturer that sells passenger vehicles under 6 brands asked Opsis to help create a more familiar, comfortable, and individually personalized emotional atmosphere in its vehicles. The group sold over 1.5 million cars in 2018 and unveiled an electric car in September 2020. Its innovation centre focuses on finding smarter approaches to mobility.
At the same time, we are working in partnership with another company to provide "vehicle-agnostic" driverless software solutions for urban environments, building a cost-effective, fully autonomous, internet-enabled vehicle platform. SenseSafe provides a necessary solution for both companies.
- Incorporate infotainment that meets the rising consumer demand for a more personalized and comfortable ride.
- Integrate a mood-based interactive virtual assistant user interface to answer drivers’ and passengers’ questions, make recommendations, and perform various functions. In the case of driverless vehicles, the objective is to ensure that an alert driver can take over if there is a system failure.
- We are integrating SenseSafe’s AI emotion-sensing technology into the company’s automotive platform to address passengers’ major concerns about safety and trust, and to ensure that drivers remain alert in case the system encounters unforeseen problems. SenseSafe ensures that a human decision-maker is ready to take control whenever needed.
- The integrated platform can also guide the HMI experience. It senses both the driver’s and passengers’ emotions and adjusts the vehicle’s temperature, lighting, and even ambient music for a more pleasant and comfortable experience.
- Our sensing algorithm detects the state of the driver, passengers and cabin, providing a deep understanding of what is happening inside the vehicle. It also recognizes differences in ethnicity to deliver more personalized content recommendations.
- A clear understanding of changes in driver and passenger behavior is coded into the platform, enabling more effective collaboration between the driver and the vehicle and enhancing the driver’s and passengers’ trust in the system.
- By combining human and environmental sensing, it is possible to better understand the impact of spatial and temporal environmental factors on specific human emotions and behaviors. An example might be the difference between driving on a highway with pleasing scenery versus driving in a complex city environment with traffic.
- With annotated data showing how different drivers respond in various contextual settings, we have developed driving profiles that enable an autonomous vehicle to engage in user-centered autonomy in which the platform responds to the users’ needs, habits, and preferences in real time.
- Mood-reactive conversation automatically interacts with Siri through the vehicle’s console, recognizing and responding with preferred music based on the occupant’s changing moods and ethnicity.
- The vehicle’s interior lighting and temperature automatically adjust to the time of day. Music and ambient lighting adapt to the situation and serve to defuse frustration from heavy traffic, concerns over being late, or irritating behavior by other drivers.
- The system verbally advises drowsy drivers to stay alert. Music takes on an energizing beat when the system senses frequent yawning or signs of eye fatigue during long trips.
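
The mood-based cabin adjustments described above can be sketched as a simple rule mapping from a detected emotion to new settings. This is an illustrative sketch only: the emotion labels, setting names, and `adjust_cabin` function are assumptions for demonstration, not SenseSafe's actual interface or label set.

```python
from dataclasses import dataclass

@dataclass
class CabinSettings:
    temperature_c: float
    lighting: str   # e.g. "warm-dim", "neutral", "bright-cool" (illustrative)
    music: str      # e.g. "ambient", "upbeat" (illustrative)

def adjust_cabin(emotion: str, current: CabinSettings) -> CabinSettings:
    """Map a detected occupant emotion to new cabin settings (hypothetical rules)."""
    if emotion == "frustrated":
        # Soften the environment to defuse stress in heavy traffic.
        return CabinSettings(current.temperature_c - 1.0, "warm-dim", "ambient")
    if emotion == "fatigued":
        # Cool and brighten the cabin, switch to energizing music.
        return CabinSettings(current.temperature_c - 2.0, "bright-cool", "upbeat")
    # Calm occupants: leave the cabin unchanged.
    return current

# Example: a frustrated driver triggers a softer ambience.
settings = adjust_cabin("frustrated", CabinSettings(22.0, "neutral", "ambient"))
# settings.lighting → "warm-dim"
```

In a production system the rules would be learned from the annotated driving profiles mentioned above rather than hard-coded.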
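
The drowsiness behavior in the final bullet can be approximated with a PERCLOS-style heuristic (percentage of frames with eyes closed over a rolling window) combined with a yawn count. The class, window size, and thresholds below are illustrative assumptions, not SenseSafe's actual algorithm or parameters.

```python
from collections import deque

class DrowsinessMonitor:
    """Rolling-window drowsiness check based on an eye-closure ratio
    (PERCLOS-style) and yawn frequency. Thresholds are illustrative only."""

    def __init__(self, window: int = 300, perclos_limit: float = 0.15,
                 yawn_limit: int = 3):
        self.eye_closed = deque(maxlen=window)  # one bool per video frame
        self.yawns = deque(maxlen=window)       # one bool per video frame
        self.perclos_limit = perclos_limit
        self.yawn_limit = yawn_limit

    def update(self, eye_closed: bool, yawning: bool) -> bool:
        """Record one frame's observations; return True when the driver
        appears drowsy and an alert (voice prompt, upbeat music) is due."""
        self.eye_closed.append(eye_closed)
        self.yawns.append(yawning)
        perclos = sum(self.eye_closed) / len(self.eye_closed)
        return perclos > self.perclos_limit or sum(self.yawns) >= self.yawn_limit
```

When `update` returns True, the platform would issue the verbal alert and switch the music to an energizing playlist as described above.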