US20240181876
2024-06-06
Performing operations; transporting
B60K35/00
Systems and methods create an emotional human-machine interface (HMI) within a vehicle. The interface uses a display to present emotional insights derived from biometric data collected from occupants. By identifying each occupant and that occupant's emotional state, the system generates a graphical interface whose color schemes reflect those emotions, enhancing the driving experience.
Recent trends show fewer younger individuals obtaining driving licenses, with many opting for public transport or virtual interaction instead. The emotional dimension of driving remains underexplored: many people now prefer to connect through devices rather than engage socially while traveling. Strengthening the emotional connection during travel could rekindle interest in driving among younger demographics.
The emotional HMI can be integrated into vehicle infotainment systems, allowing occupants to interact with shared apps and preferences on a common display. It can also generate context-aware graphical elements like color schemes, which adapt based on environmental factors and biometric data, thereby creating a more personalized experience for each occupant.
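The context-aware color adaptation described above can be sketched as a simple selection rule. This is an illustrative sketch only, not the patented implementation: the emotion labels, RGB values, and the ambient-light threshold are all hypothetical choices made for the example.

```python
# Hypothetical sketch of a context-aware color scheme selector.
# Emotion labels, colors, and the lux threshold are illustrative assumptions.

# Map detected emotional states to base display colors (RGB).
EMOTION_COLORS = {
    "calm": (70, 130, 180),      # steel blue
    "joy": (255, 200, 60),       # warm yellow
    "shock": (220, 60, 60),      # alert red
    "neutral": (160, 160, 160),  # gray
}

def blend(color_a, color_b, weight):
    """Linearly interpolate between two RGB colors."""
    return tuple(round(a + (b - a) * weight) for a, b in zip(color_a, color_b))

def context_aware_scheme(emotion, ambient_lux):
    """Pick a base color for the emotion, then darken it for night driving."""
    base = EMOTION_COLORS.get(emotion, EMOTION_COLORS["neutral"])
    # Below ~50 lux (roughly dusk), shift toward a dark tone to reduce glare.
    if ambient_lux < 50:
        return blend(base, (20, 20, 30), 0.6)
    return base

print(context_aware_scheme("calm", ambient_lux=10))  # → (40, 64, 90)
```

Blending toward a dark tone, rather than swapping palettes outright, keeps the emotional color identity recognizable while adapting to the environmental context the passage describes.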
An emotion engine analyzes biometric data to determine the emotional states of occupants, influencing the graphical interface displayed. For example, if sensors detect a sudden event that causes shock, the interface may change to reflect this emotion through appropriate colors. Additionally, it can track interactions between occupants, displaying shared experiences and statistics to foster connection.
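One way an emotion engine could flag a sudden shock-inducing event is to compare the latest biometric reading against a short rolling baseline. The following sketch assumes heart-rate samples (bpm) as the biometric input; the function name and every threshold are hypothetical, chosen only to illustrate the idea.

```python
# Illustrative emotion-engine rule: infer a coarse emotional state from
# recent heart-rate readings. All thresholds are hypothetical assumptions.

def classify_emotion(heart_rate_samples):
    """Infer a coarse emotional state from a list of recent bpm readings."""
    if len(heart_rate_samples) < 2:
        return "neutral"
    latest = heart_rate_samples[-1]
    baseline = sum(heart_rate_samples[:-1]) / (len(heart_rate_samples) - 1)
    # A sudden spike well above baseline suggests a startle/shock response,
    # which would trigger the corresponding interface color change.
    if latest - baseline > 25:
        return "shock"
    if latest > 100:
        return "excited"
    if latest < 70:
        return "calm"
    return "neutral"

print(classify_emotion([72, 74, 73, 110]))  # spike over baseline → "shock"
```

A real system would fuse multiple biometric channels and smooth over longer windows, but the structure is the same: classify the occupant's state, then drive the graphical interface from that label.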
The systems described are adaptable and can evolve as technology advances, maintaining safety while providing an engaging user experience. Integrating real-time biometric feedback with personalized content offers an opportunity to redefine how individuals experience travel together, making it more interactive and emotionally resonant.