Emotion in AI

Artificial Intelligence is often written off as unable to understand or feel human emotion, but it is plausible that this is not the case. The goal of this project is to show that, with a chain of LLMs structured analogously to the brain, AI too can experience emotion and sentiment. As a first step toward this goal, we can have the AI hold short conversations with humans and assess its ability to blend in as well as any other human.

The Multi-Faceted Structure of the Human Brain

The human brain consists of many different areas that work in conjunction to perform the tasks of cognitive thought. In the case of conversation, a few parts of the brain are especially relevant.

  1. Broca’s area → Speech production and language processing
  2. The temporal lobe → Processes auditory information and helps with language comprehension
  3. The prefrontal cortex → Decision making, social behavior, and personality expression
  4. The amygdala → Regulating emotions
  5. The hippocampus → Forms, manages, and retrieves memories

We can represent each of these areas with an LLM. Just like the human brain, an LLM should not be expected to have emotions if the portion of the “brain” that handles emotion is missing.

Parts of the AI “Brain”

We can mimic these parts of the human brain by creating an LLM to represent each function. Because the conversation will be held over text, we do not need to interpret auditory or other sensory information. A sketch of how these modules could be chained together follows the list.

  1. Reasoning → Plan out thoughts, draft responses, and interpret language.
  2. Emotion → Assess the conversation and output multiple values:
    1. Message Sentiment → What is the sentiment of the message in the overall context of the conversation?
    2. System Sentiment → How do we feel about this message? What are our emotions overall?
  3. Memory → Assess the response and conversation history to remember things about the person.
  4. Criticizer → A secondary critical thinker that assesses drafts, acting almost like a subconscious.
  5. JSON Encoder → Format the final result into JSON format to be interpreted by the Python program.
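
The flow through these modules can be wired up as a simple chain of LLM calls. Below is a minimal sketch in Python, assuming an OpenAI-style chat completion client; the model name, the prompts, and the "reply"/"emotion"/"memory" JSON keys are illustrative assumptions, not the final design.

```python
# Minimal sketch of the LLM "brain" chain.
# Assumptions: OpenAI Python SDK, OPENAI_API_KEY set in the environment,
# placeholder model name and prompts.
import json
from openai import OpenAI

client = OpenAI()

def ask(system_prompt: str, user_content: str) -> str:
    """Run one 'brain region' as a single chat completion call."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_content},
        ],
    )
    return response.choices[0].message.content

def respond(history: list[str], incoming: str, memory: str) -> dict:
    """Chain: Emotion -> Reasoning -> Criticizer -> Memory -> JSON Encoder."""
    convo = "\n".join(history + [f"Them: {incoming}"])

    # Emotion: message sentiment plus overall system sentiment.
    emotion = ask(
        "Describe the sentiment of the latest message in the context of the "
        "conversation, and how 'we' feel overall. Two short sentences.",
        convo,
    )

    # Reasoning: plan and draft a reply, informed by emotion and memory.
    draft = ask(
        "Draft a short, natural conversational reply.\n"
        f"Current emotional state: {emotion}\nKnown facts about them: {memory}",
        convo,
    )

    # Criticizer: a second critical pass over the draft, like a subconscious.
    final = ask(
        "Critique the draft reply for tone and coherence, then output only "
        "the improved reply.",
        f"Conversation:\n{convo}\n\nDraft reply: {draft}",
    )

    # Memory: update what we remember about the person.
    new_memory = ask(
        "Update our notes about this person based on the conversation. "
        "Output the revised notes only.",
        f"Existing notes: {memory}\n\nConversation:\n{convo}\nUs: {final}",
    )

    # JSON Encoder: package everything for the surrounding Python program.
    encoded = ask(
        "Return a JSON object with keys 'reply', 'emotion', and 'memory', "
        "built from the fields below. Output JSON only.",
        json.dumps({"reply": final, "emotion": emotion, "memory": new_memory}),
    )
    return json.loads(encoded)
```

Each call stands in for one "brain region", and the surrounding Python program only ever consumes the final JSON output, so individual modules can be swapped or re-prompted without changing the rest of the chain.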