Emotion in AI
Artificial Intelligence is often written off as incapable of knowing or feeling human emotion, but it is plausible that this is not the case. The goal of this project is to show that, with a chain of LLMs analogous to the brain, AI can also experience emotion and sentiment. Initially, we can work toward a portion of this goal by having the AI participate in short conversations with humans and assessing its ability to blend in as well as any other human.
The human brain consists of many different areas that work in conjunction to perform the tasks of cognitive thought. In the case of conversation, a few parts of the brain are especially relevant.
We can represent each of these areas with an LLM. Just as with the human brain, the system should not be expected to have emotions if the portion of the "brain" that handles emotion is missing. Because the conversation will be held through text, we do not need to interpret auditory or other sensory input.
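The chain described above can be sketched in code. This is only a minimal illustration of the idea, not the project's implementation: the module names (an "emotion" module feeding a "language" module) and the `call_llm` stub are assumptions standing in for real brain-region analogues and real LLM API calls.

```python
def call_llm(system_prompt: str, message: str) -> str:
    # Placeholder for a real LLM API call; here it just echoes its
    # inputs so the chain's structure can be run and inspected.
    return f"[{system_prompt}] {message}"

def emotion_module(message: str) -> str:
    # Hypothetical analogue of the brain's emotional processing:
    # assess the sentiment of the incoming message.
    return call_llm("Assess the emotional tone of this message", message)

def language_module(message: str, emotional_context: str) -> str:
    # Hypothetical analogue of language production: compose the reply,
    # conditioned on the emotion module's assessment.
    prompt = ("Reply conversationally, given this emotional context: "
              + emotional_context)
    return call_llm(prompt, message)

def converse(user_message: str) -> str:
    # Chain the modules: the emotion module's output flows into the
    # language module, mirroring brain areas working in conjunction.
    emotional_context = emotion_module(user_message)
    return language_module(user_message, emotional_context)

print(converse("I just got some great news!"))
```

The key design point is that emotion is a distinct stage in the chain: removing `emotion_module` leaves a system that can still converse but, as argued above, should not be expected to exhibit emotion.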