Researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) and the Institute for Medical Engineering and Science (IMES) have created an app that can detect changes in the tone of a person’s speech.
Installed on a smartwatch, the AI software registers these changes and tells the user what feeling the speaker is trying to convey; for example, whether someone is excited or bragging about something.
The system analyzes not only audio but also text transcriptions and physiological signals to determine the speaker’s underlying emotion, with 83% accuracy according to the researchers.
The science behind the mood-reading app
To develop the system, the researchers studied subjects wearing a Samsung Simband, a wearable device that can register heart rate, blood pressure, body movements, blood flow, body temperature, and changes in the tone and pace of the wearer’s voice.
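For illustration only, the sketch below shows one way a short window of this multimodal data might be represented in code before any analysis. The field names and units are assumptions made for the example, not the researchers’ actual data format.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class SensorWindow:
    """One short window of multimodal data from a wrist-worn device.

    Field names and units are hypothetical; they only mirror the kinds of
    signals described in the article (heart rate, blood pressure, movement,
    blood flow, temperature) plus the recorded audio and its transcription.
    """
    start_time_s: float            # window start, seconds from session start
    heart_rate_bpm: float          # average heart rate in the window
    blood_pressure_mmhg: float     # blood pressure estimate
    skin_temp_c: float             # body/skin temperature in Celsius
    blood_flow_au: float           # blood-flow signal amplitude (arbitrary units)
    accel_magnitude: List[float]   # body-movement samples (acceleration magnitude)
    audio_samples: List[float]     # raw audio for tone and pace analysis
    transcript: str = ""           # text transcription of speech in the window
```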
They asked participants to tell either a ‘happy’ or a ‘sad’ story of their choosing, which later helped the AI software pair these feelings with specific physiological signals.
After recording 31 distinct conversations, the researchers trained two classification algorithms. The first labeled each recording as a whole as happy or sad, and the second broke each conversation into short segments and classified them as ‘positive,’ ‘negative,’ or ‘neutral.’
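As a rough sketch of that two-level setup, the example below trains one classifier on whole-conversation features labeled happy or sad, and a second on short-segment features labeled positive, negative, or neutral. It uses scikit-learn with synthetic placeholder data; the actual features and models used by the MIT team are not specified here.

```python
# Illustrative two-level emotion classification, assuming pre-extracted
# numeric feature vectors. This is a sketch with synthetic data, not the
# researchers' actual pipeline.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# --- Level 1: classify a whole conversation as happy (1) or sad (0) ---
conv_features = rng.normal(size=(31, 20))    # 31 conversations, 20 features each
conv_labels = rng.integers(0, 2, size=31)    # placeholder happy/sad labels
conversation_clf = LogisticRegression(max_iter=1000).fit(conv_features, conv_labels)

# --- Level 2: classify short segments as positive / negative / neutral ---
seg_features = rng.normal(size=(500, 20))    # placeholder segment features
seg_labels = rng.integers(0, 3, size=500)    # 0=negative, 1=neutral, 2=positive
segment_clf = LogisticRegression(max_iter=1000).fit(seg_features, seg_labels)

# Predict on a new conversation and its segments
new_conv = rng.normal(size=(1, 20))
new_segs = rng.normal(size=(10, 20))
print("conversation mood:", "happy" if conversation_clf.predict(new_conv)[0] else "sad")
print("segment moods:", segment_clf.predict(new_segs))
```

In a setup like this, each feature vector would combine audio statistics such as pitch and pace, scores derived from the text transcription, and summaries of the physiological signals from the wearable.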
As a result, the finished prototype could indicate from tone of voice whether a person’s speech was happy, sad, or neutral, helping the wearer recognize the speaker’s genuine emotion.
Possible real-world uses for this technology
“The team’s usage of consumer market devices for collecting physiological data and speech data shows how close we are to having such tools in everyday devices,” said Björn Schuller, head of Complex and Intelligent Systems at the University of Passau, Germany.
He also remarked that technology “could soon feel much more emotionally intelligent,” or even become emotional itself. That outcome is a long shot, but such technology could have positive implications for AI devices such as home assistants.
Tuka Alhanai, a graduate student who worked on the project, said the app is not yet ready for social coaching, which might be its most promising application. However, she added that this is a goal the team is working toward.
The MIT research team also plans to bring the emotion-reading app to devices such as the Apple Watch to test potential commercial uses. Before long, we might see a finished version of this technology in various wearables and, later, in other mobile devices.
Source: MIT News