Machines are infamous for their flat, emotionless existence. They’re often pure, logical beings that work only in the interest of their programmed directives (I’m looking at you, HAL). Even when a machine or AI, like Alexa or Siri, tries to be fun, it still feels devoid of life. However, researchers at the University of Colorado, Boulder are challenging that perception with the invention of EmoNet.
At first, EmoNet sounds like a machine that only blasts MCR or Blink-182. According to CU Boulder, EmoNet studies and categorizes emotions based on images. Using machine learning, in which an AI learns patterns from large sets of examples, EmoNet can distinguish emotions in a matter of milliseconds. You can see the machine doing so in this video, where EmoNet’s capabilities are tested with amusement park footage.
Machine learning is already used to identify objects with great accuracy. So, researchers at CU Boulder wanted to see if they could apply the same technique to emotions. Building on an existing, object-identifying neural network, they programmed EmoNet to identify common emotional responses to briefly flashed images. The researchers showed EmoNet 25,000 images and then asked it to sort them into 20 emotional categories, like amusement or sadness.
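For the curious, the approach described above, reusing a network trained on objects and teaching it a new set of emotion categories, is a classic transfer-learning recipe. Here's a minimal sketch of the idea; the backbone features are faked with random numbers, and all names, sizes, and training details are illustrative stand-ins, not EmoNet's actual architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for the frozen object-recognition backbone: in the real system
# the features would come from a pretrained CNN; here we fake 128-dim features.
N_IMAGES, N_FEATURES, N_EMOTIONS = 500, 128, 20

features = rng.normal(size=(N_IMAGES, N_FEATURES))   # "backbone" outputs
labels = rng.integers(0, N_EMOTIONS, size=N_IMAGES)  # emotion category ids

# A new 20-way linear "emotion head" trained on top of the frozen features.
W = np.zeros((N_FEATURES, N_EMOTIONS))
b = np.zeros(N_EMOTIONS)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)  # stabilize before exponentiating
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Plain gradient descent on the cross-entropy loss.
lr = 0.1
onehot = np.eye(N_EMOTIONS)[labels]
for _ in range(200):
    probs = softmax(features @ W + b)
    grad = probs - onehot                 # gradient of cross-entropy w.r.t. logits
    W -= lr * features.T @ grad / N_IMAGES
    b -= lr * grad.mean(axis=0)

probs = softmax(features @ W + b)
print(probs.shape)  # one 20-way emotion distribution per image
```

The key design point, which the CU Boulder work relies on, is that the object-recognition layers stay fixed; only the small emotion head is learned from the labeled images.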
Overall, EmoNet could accurately sort 11 types of emotions, with some easier to recognize than others. For example, EmoNet could distinguish sexual desire or craving with about 95% accuracy. However, a complicated, nuanced emotion, like awe, proved harder for the AI system to decipher.
Perhaps the most exciting part of this discovery is that EmoNet also helped test whether humans can identify emotions quickly. Brain readings of human subjects who were shown flashes of images confirmed that yes, humans can identify emotions almost immediately. So, our brain is constantly assigning emotions to what we see, even if we aren’t always hyperaware of it. No wonder we’re filled with immediate love every time a puppy video shows up online.
Featured Image: Disney/Pixar