Affective Computing
In 1995, Rosalind Picard published the paper that introduced the term "Affective Computing" to the world. In it, she argued that for computers and machines to be more effective, they need the capacity to understand and respond to human emotion. This work laid the foundation of the field and led her to start the Affective Computing Group at MIT.
Kismet
In the late 1990s, Cynthia Breazeal created Kismet, a robot designed to respond to human emotions. Kismet had "expressive" eyes, ears, and a mouth that allowed it to demonstrate an emotional intelligence that was new for machines. It was one of the first robots able to perform eye and motion detection, and it could even detect threats in its visual field.
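Kismet's actual vision code is not described here, but the kind of motion detection it performed can be illustrated with simple frame differencing. The Python/OpenCV sketch below shows the generic technique, not Kismet's algorithm; the camera index and sensitivity threshold are placeholder assumptions.

# Minimal frame-differencing motion detector (illustrative only; this is
# the generic technique, not Kismet's actual vision pipeline).
import cv2

cap = cv2.VideoCapture(0)                    # assumption: default webcam
_, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Pixels that changed between consecutive frames indicate motion.
    diff = cv2.absdiff(prev_gray, gray)
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    if cv2.countNonZero(mask) > 5000:        # arbitrary sensitivity threshold
        print("motion detected")
    prev_gray = gray

cap.release()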
Apple's Siri
Although Siri, released in 2011, was not officially marketed as affective computing, many of the elements Picard described were present. Siri was the first voice assistant integrated into phones, and it could respond to spoken requests with human touches like humor. This was a big step toward integrating conversational assistants into everyday devices.
DeepMind’s AI (https://www.wired.com/, https://youtu.be/E_a78HF4Q1c)
In 2016, DeepMind began exploring systems that could understand human emotions. According to the article "The Empathy Engine," this marked the start of its development of an emotionally sensitive AI. As the system has developed into a functioning and emotionally adept machine, its uses have grown with it. For example, a virtual therapist powered by the AI is described as building genuine connections during sessions that allow better communication with humans.
Affectiva's Emotion AI for the Automotive Industry
Affectiva, a company co-founded by Rosalind Picard, released an AI in 2020 that can detect the emotions of drivers. Affectiva uses deep learning and computer vision to track facial expressions and vocal tones. The system is trained on millions of real-world interactions, which makes it very accurate, but it has also raised privacy concerns, with many wondering how such data could be stored and used.
https://d3.harvard.edu/platform-digit/submission/affectiva-building-ai-that-reads-human-emotions/
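Affectiva's models and training data are proprietary, so the Python sketch below is only a guess at the general shape of such a pipeline: a small convolutional network that maps a detected face crop to emotion probabilities. The architecture, emotion labels, and input size are all assumptions for illustration.

# Hypothetical sketch of a deep-learning emotion classifier; nothing here
# reflects Affectiva's actual (proprietary) models.
import torch
import torch.nn as nn

EMOTIONS = ["joy", "anger", "surprise", "sadness", "fear", "neutral"]  # assumed labels

class EmotionCNN(nn.Module):
    def __init__(self, n_classes=len(EMOTIONS)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Linear(32 * 12 * 12, n_classes)

    def forward(self, x):                    # x: (batch, 1, 48, 48) grayscale faces
        h = self.features(x).flatten(1)
        return self.head(h)                  # raw logits, one per emotion

model = EmotionCNN()
face = torch.randn(1, 1, 48, 48)             # stand-in for a detected face crop
probs = torch.softmax(model(face), dim=1)    # normalized emotion probabilities
print(dict(zip(EMOTIONS, probs[0].tolist())))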
Emotional Avatars in the Metaverse
As more people interact online, companies like NVIDIA and Meta have begun to create digital avatars driven by emotion-sensing AI. This technology allows an avatar to mirror its user's emotions in real time, creating a much more immersive form of communication. For example, in a virtual meeting an avatar can convey facial expressions that the other participants would otherwise not be able to see, as in the sketch below.
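A minimal Python sketch of that idea: a detected emotion label drives avatar expression ("blendshape") weights each frame. The label names and weights are invented for illustration; production systems from NVIDIA or Meta are far more granular and typically drive rigs from continuous facial landmarks rather than coarse labels.

# Hypothetical mapping from a detected emotion label to avatar expression
# weights; every name and number here is an assumption for illustration.
BLENDSHAPES = {
    "joy":      {"mouth_smile": 0.9, "brow_raise": 0.3},
    "surprise": {"mouth_open": 0.7, "brow_raise": 0.9},
    "neutral":  {},
}

def update_avatar(detected_emotion: str) -> dict:
    """Return the expression weights the renderer would apply this frame."""
    return BLENDSHAPES.get(detected_emotion, {})

print(update_avatar("joy"))   # {'mouth_smile': 0.9, 'brow_raise': 0.3}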
Emotional AI in Education
This AI would monitor how students engage and respond during online classes. Students would no longer be able to "hide" behind their screens and would receive more effective feedback even when a teacher is not present. If the AI could sense emotions like frustration, it could provide structured help to give those students a better learning experience, along the lines of the sketch below.
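As a rough Python sketch of that feedback loop, assuming some upstream emotion model produces per-student engagement scores, a rule like the one below could route frustrated students to structured help. The signal names and thresholds are assumptions, not any real product's API.

# Hypothetical frustration-to-help feedback rule; field names and
# thresholds are assumptions for illustration only.
from dataclasses import dataclass

@dataclass
class EngagementSignal:
    student_id: str
    frustration: float   # 0.0 (calm) to 1.0 (very frustrated), from an emotion model
    attention: float     # 0.0 (disengaged) to 1.0 (focused)

def respond(signal: EngagementSignal) -> str:
    if signal.frustration > 0.7:
        return f"offer step-by-step hints to {signal.student_id}"
    if signal.attention < 0.3:
        return f"prompt {signal.student_id} with a check-in question"
    return "continue lesson"

print(respond(EngagementSignal("s42", frustration=0.85, attention=0.6)))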
AI with Genuine Emotional Intelligence
In 100 years, AI will be far more advanced than simply understanding emotions. It will be able to experience and express emotions of its own and to form real connections with humans. Companies will begin to make AI pets that express the same emotions as a real pet but come without the cleanup and hassle of an animal. This technology could also disrupt many jobs and the structure of society, because humans will no longer have complete control over their technology.