February 11, 2025
The car of the future might listen to every word you say — not to change the radio or update directions, but to assess your emotional state.
Road rage is a global phenomenon. About 60 percent of drivers in China have experienced road rage. In the U.S., nearly 80 percent of drivers have reported feeling significant anger or aggression behind the wheel. And when those feelings lead to aggressive driving, road rage causes a significant percentage of auto accidents.
Recent research explores ways to use the human voice to determine if drivers are too angry to drive. A recent paper published in IEEE Access describes one method that achieved about 95 percent accuracy in laboratory conditions using voice alone, raising the question of whether this technology could be close to real-world application.
Sensors That Track More Than the Road
Cars are increasingly outfitted with an array of sensors that monitor conditions inside the vehicle, under the hood and in the environment. Some vehicles, for example, have heart rate monitors embedded in steering wheels to detect drowsy driving and possible cardiac emergencies. Some have cameras that check to see if drivers are paying attention.
While such sensors can evaluate the physical condition of drivers, determining their emotional state has proven trickier, according to IEEE Member Ning Hu, who notes that mixed emotions are common.
“If the categorization is binary, that is, angry versus not angry, the accuracy is 50% even if the system is guessing,” Hu said.
The Role of AI in Reading Emotion
Using computers to identify and classify human emotions is known as affective computing. Researchers in this field leverage various forms of artificial intelligence trained on data sets, including images, social media text and voice. Industries like financial services and healthcare have relied on affective computing. Banks, for example, may use the technology in call centers to determine if customers are getting angry. Marketing companies use facial analysis in focus groups to classify reactions to advertisements and to understand how audiences might feel about movies.
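The core idea behind voice-based affective computing can be illustrated with a toy sketch: extract simple acoustic features from a waveform and feed them to a classifier. Everything below is a hypothetical illustration, not the method from the IEEE Access paper; real systems learn from large labeled datasets and use far richer features. Here, RMS energy and zero-crossing rate stand in as crude proxies for vocal intensity and pitch, with made-up thresholds.

```python
import math

def rms_energy(signal):
    """Root-mean-square amplitude: a crude proxy for vocal intensity."""
    return math.sqrt(sum(s * s for s in signal) / len(signal))

def zero_crossing_rate(signal):
    """Fraction of adjacent samples that change sign: a crude pitch proxy."""
    crossings = sum(1 for a, b in zip(signal, signal[1:]) if (a < 0) != (b < 0))
    return crossings / (len(signal) - 1)

def classify_high_arousal(signal, energy_threshold=0.5, zcr_threshold=0.05):
    """Toy binary classifier: flag 'high arousal' (possible anger) when both
    features exceed illustrative, hand-picked thresholds."""
    return rms_energy(signal) > energy_threshold and zero_crossing_rate(signal) > zcr_threshold

# Synthetic stand-ins for calm vs. agitated speech (pure tones, not real audio).
SAMPLE_RATE = 8000
calm = [0.3 * math.sin(2 * math.pi * 120 * t / SAMPLE_RATE) for t in range(SAMPLE_RATE)]
agitated = [0.9 * math.sin(2 * math.pi * 260 * t / SAMPLE_RATE) for t in range(SAMPLE_RATE)]

print(classify_high_arousal(calm))      # → False (quiet, low-pitched)
print(classify_high_arousal(agitated))  # → True (loud, higher-pitched)
```

A fixed-threshold rule like this is exactly what breaks down outside the lab, which is why published systems train models instead; the sketch only shows where the signal comes from.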
Various approaches to affective computing have drawbacks and have struggled in real-world conditions. Much of the research into identifying road rage has focused on driving behavior, such as aggressive lane changes and speed. Other efforts have used cameras inside the vehicle to analyze facial expressions, but researchers note that angry people can mask their emotions so that they do not show on the face.
Advancing the technology will likely require a multi-modal system, one that integrates facial expressions, driver behavior and voice.
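One common way to combine modalities is late fusion: each channel produces an independent anger score, and a weighted combination makes the final call. The weights and threshold below are hypothetical, chosen only to illustrate the idea that a masked face can be outvoted by behavior and voice.

```python
def fuse_scores(face, behavior, voice, weights=(0.3, 0.3, 0.4)):
    """Late fusion: weighted average of per-modality anger scores in [0, 1].
    Weights are illustrative, not taken from any published system."""
    return sum(w * s for w, s in zip(weights, (face, behavior, voice)))

def driver_flagged(face, behavior, voice, threshold=0.6):
    """Flag the driver when the fused score crosses a hypothetical threshold."""
    return fuse_scores(face, behavior, voice) >= threshold

# A driver masking facial anger can still be flagged by behavior and voice:
# 0.3*0.1 + 0.3*0.8 + 0.4*0.9 = 0.63, which crosses the 0.6 threshold.
print(driver_flagged(face=0.1, behavior=0.8, voice=0.9))  # → True
```

The design choice here is that no single modality can trigger the flag alone, which mitigates both facial masking and false alarms from any one noisy sensor.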
What Happens When Cars Detect Anger
Imagine a future where AI can determine whether a driver is too angry or otherwise impaired to drive. The question of what the vehicle does with that information remains. There are a couple of possibilities. First, artificial intelligence could disable the car or initiate self-driving features.
“One of the key objectives in developing AI algorithms for autonomous driving is safety,” Hu said. “The robustness and explainability of AI is still in question. AI may utilize all the possible data gathered and decide that the driver is ‘too’ angry, but handing the decision to disable the car to AI means giving up control, which may cause harm to the driver sitting inside. The question remains: what is the level of trust in AI, and what are the legal consequences when AI makes mistakes?”
Another possibility is that the information could be reported to insurance companies.
“The acceptance of customers depends on the benefits gained by giving personal data to such insurance products,” Hu said. “Regulation and ethical considerations should also be taken into account for insurance products in different jurisdictions.”