New Brain-inspired Computer Can Tell a Sad Image from a Happy One

University of Colorado Boulder neuroscientists have combined machine learning and neuroscience to create a brain-inspired computer that can tell the difference between sad and happy images.

Recognizing the emotion of images

"Machine learning technology is getting really good at recognizing the content of images -- of deciphering what kind of object it is," said senior author Tor Wager, who worked on the study while a professor of psychology and neuroscience at CU Boulder. "We wanted to ask: Could it do the same with emotions? The answer is yes."

The experiment is an important development for "neural networks," computer systems modeled on the human brain. It also suggests that what we see may shape our emotions more than we think.

"A lot of people assume that humans evaluate their environment in a certain way and emotions follow from specific, ancestrally older brain systems like the limbic system," said lead author Philip Kragel, a postdoctoral research associate at the Institute of Cognitive Science. "We found that the visual cortex itself also plays an important role in the processing and perception of emotion."

For their study, the researchers took a neural network called AlexNet, designed to enable computers to recognize objects, and, drawing on previous research, retooled it to predict how a person would feel when viewing a certain image. The researchers dubbed the new network EmoNet and proceeded to show it 25,000 images.

The computer was then asked to sort them into 20 categories such as craving, sexual desire, horror, awe, and surprise. The program proved better at recognizing some emotions than others.

It could accurately and consistently categorize 11 of the emotion types. Craving and sexual desire, for instance, were categorized with more than 95 percent accuracy.

However, more nuanced emotions such as confusion, awe, and surprise were harder to pinpoint. Even so, EmoNet proved very reliable at rating the intensity of images.

It was also rather good at rating brief movie clips. When asked to categorize them as romantic comedies, action films, or horror movies, it was correct 75 percent of the time.

Same neural network patterns for human and computer

The researchers then recruited 18 human subjects and used a functional magnetic resonance imaging (fMRI) machine to measure their brain activity while they viewed the same 112 images shown to EmoNet. Surprisingly, the activity patterns matched between human and computer.

"We found a correspondence between patterns of brain activity in the occipital lobe and units in EmoNet that code for specific emotions. This means that EmoNet learned to represent emotions in a way that is biologically plausible, even though we did not explicitly train it to do so," said Kragel.

In the end, the researchers believe their work could be applied to improve human-computer interaction and help advance emotion research. For now, however, the findings underscore the importance of paying attention to what you are exposed to.

"What you see and what your surroundings are can make a big difference in your emotional life," added Kragel.

The study is published in the journal Science Advances.


  1. Briar

    In my opinion you are mistaken. Let's discuss it. Write to me in PM, we will communicate.

  2. Chatwin

    I can not take part now in discussion - it is very occupied. I will be free - I will necessarily express the opinion.

  3. Rheged

    I confirm. And with this I have come across.

  4. Raphael

    Also that we would do without your brilliant phrase

  5. Tatilar

    In my opinion, mistakes are made. We need to discuss. Write to me in PM, it talks to you.

  6. Yanis

    I'm sorry, but, in my opinion, they were wrong. I propose to discuss it. Write to me in PM.

Write a message