New AI framework mirrors human physiology to understand emotional experiences

Emotions are a fundamental part of human psychology, a complex process that has long distinguished us from machines. Even advanced artificial intelligence (AI) lacks the capacity to feel. However, researchers are now exploring whether the formation of emotions can be computationally modeled, giving machines a deeper, more human-like understanding of emotional states.

In this vein, Assistant Professor Chie Hieida from the Nara Institute of Science and Technology (NAIST), in collaboration with Assistant Professor Kazuki Miyazawa and then-master's student Kazuki Tsurumaki from Osaka University, both in Japan, explored computational approaches to modeling the formation of emotions. In a recent study, the team built a computational model that aims to explain how humans may form the concept of emotion. The study was made available online on July 3, 2025, and was published in Volume 16, Issue 4 of the journal IEEE Transactions on Affective Computing on December 3, 2025.

This model is based on the theory of constructed emotion, which proposes that emotions are not innate reactions but are built in the moment by the brain. Emotions arise from integrating internal bodily signals (interoception, like heart rate) with external sensory information (exteroception, like sight and sound), allowing the brain to create a concept, not just a reflex.

"Although there are theoretical frameworks addressing how emotions emerge as concepts through information processing, the computational processes underlying this formation remain underexplored," says Dr. Hieida.

To model this process, the research team used multilayered multimodal latent Dirichlet allocation (mMLDA), a probabilistic generative model designed to discover hidden statistical patterns and categories by analyzing how different types of data co-occur, without being pre-programmed with emotional labels.
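
To give a concrete, if simplified, sense of how such a model can discover categories from co-occurring data, the sketch below flattens the approach into a single-layer latent Dirichlet allocation over modality-tagged tokens, fitted with the gensim library on synthetic data. The token scheme, variable names, and number of categories are illustrative assumptions, and the flattening drops the multilayered structure of the actual mMLDA, so this is not the authors' implementation.

```python
# Illustrative sketch only: a single-layer LDA over modality-tagged tokens,
# standing in for the study's multilayered multimodal LDA (mMLDA).
# Requires: pip install gensim
import random

from gensim import corpora, models

random.seed(0)

# Hypothetical discretized observations per modality (not the study's features).
VISION = ["vision:dark_scene", "vision:bright_scene", "vision:face", "vision:animal"]
PHYSIO = ["physio:hr_high", "physio:hr_low", "physio:hr_mid"]
WORDS = ["word:scared", "word:happy", "word:sad", "word:calm"]

def synthetic_trial():
    """One 'document' = one participant viewing one image: a bag of
    co-occurring tokens from three modalities, with no emotion label."""
    # Loosely correlate the modalities so LDA has co-occurrence structure to find.
    kind = random.choice(["fear-like", "joy-like", "sad-like"])
    if kind == "fear-like":
        pool = ["vision:dark_scene", "physio:hr_high", "word:scared"]
    elif kind == "joy-like":
        pool = ["vision:bright_scene", "physio:hr_low", "word:happy"]
    else:
        pool = ["vision:face", "physio:hr_mid", "word:sad"]
    # Add a few noise tokens drawn from the full vocabularies.
    return pool + random.choices(VISION + PHYSIO + WORDS, k=3)

docs = [synthetic_trial() for _ in range(300)]

dictionary = corpora.Dictionary(docs)
corpus = [dictionary.doc2bow(doc) for doc in docs]

# Fit LDA; the three latent topics play the role of emergent "emotion concepts".
lda = models.LdaModel(corpus, num_topics=3, id2word=dictionary,
                      passes=10, random_state=0)

for topic_id, terms in lda.show_topics(num_topics=3, num_words=4, formatted=False):
    print(topic_id, [(w, round(p, 2)) for w, p in terms])
```

On this toy data, the recovered topics group visual, physiological, and verbal tokens that tend to occur together, which is the kind of cross-modal clustering the study relies on; the real mMLDA additionally ties per-modality categories together through a higher latent layer.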

The developed model was trained using unlabeled data collected from human participants who viewed emotion-evoking images and videos. The system was not informed about which data corresponded to emotions such as fear, joy, or sadness. Instead, it was allowed to identify patterns on its own.

Twenty-nine participants viewed 60 images from the International Affective Picture System, a set widely used in psychological research. While the participants viewed the images, the researchers recorded physiological responses such as heart rate using wearable sensors and collected verbal descriptions of the experience. Together, these data captured how people interpret emotions: what they see, how their bodies respond, and how they describe what they feel.
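
The article does not describe how these raw recordings were converted into model inputs. As a rough illustration of one plausible preprocessing step, the snippet below bins a simulated heart-rate trace and tokenizes a short verbal description into modality-tagged tokens like those in the earlier sketch; the thresholds, token names, and tokenization are assumptions, not the study's pipeline.

```python
# Hypothetical preprocessing: turn one trial's raw signals into discrete tokens.
import numpy as np

def heart_rate_token(hr_bpm: np.ndarray) -> str:
    """Bin mean heart rate into a coarse physiological token (thresholds are made up)."""
    mean_hr = float(np.mean(hr_bpm))
    if mean_hr < 70:
        return "physio:hr_low"
    elif mean_hr < 90:
        return "physio:hr_mid"
    return "physio:hr_high"

def word_tokens(description: str) -> list[str]:
    """Lowercase, strip simple punctuation, and tag each word as a language token."""
    words = description.lower().replace(".", "").replace(",", "").split()
    return [f"word:{w}" for w in words]

# One simulated trial: a few seconds of heart-rate samples plus a description.
hr_trace = np.array([92.0, 95.0, 97.5, 99.0])          # bpm while viewing an image
description = "That picture made me feel scared."

trial_tokens = [heart_rate_token(hr_trace)] + word_tokens(description)
print(trial_tokens)
# e.g. ['physio:hr_high', 'word:that', 'word:picture', ..., 'word:scared']
```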

When the trained model's emotion concepts were compared with participants' self-reported emotional evaluations, the agreement rate was about 75%. This was significantly higher than would be expected by chance, suggesting that the model formed emotion categories that closely matched how people actually experience emotions.
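
The article does not specify how agreement with self-reports was scored or tested. One common way to make such a comparison is sketched below on synthetic data: align the unsupervised categories with the reported labels via a best-matching assignment, then test the agreement rate against chance with a binomial test. The number of categories, the toy data, and the choice of test are assumptions, not the paper's protocol.

```python
# Illustrative evaluation: map unsupervised categories to self-reported labels,
# compute agreement, and compare it with chance level.
import numpy as np
from scipy.optimize import linear_sum_assignment
from scipy.stats import binomtest

rng = np.random.default_rng(0)

n_trials, n_classes = 120, 3
# Toy data: self-reported emotion per trial, and model output agreeing ~75% of the time.
reported = rng.integers(0, n_classes, size=n_trials)
noise = rng.integers(0, n_classes, size=n_trials)
model = np.where(rng.random(n_trials) < 0.75, reported, noise)

# Build a confusion matrix and find the category-to-label mapping that maximizes agreement.
confusion = np.zeros((n_classes, n_classes), dtype=int)
for m, r in zip(model, reported):
    confusion[m, r] += 1
rows, cols = linear_sum_assignment(-confusion)   # negate to maximize matched counts
mapping = dict(zip(rows, cols))

agree = int(sum(mapping[m] == r for m, r in zip(model, reported)))
rate = agree / n_trials
p_value = binomtest(agree, n_trials, p=1 / n_classes, alternative="greater").pvalue
print(f"agreement = {rate:.2f}, p vs. chance = {p_value:.3g}")
```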

By modeling emotion formation in a way that mirrors human experience, this research paves the way for more nuanced and responsive AI systems. "Integrating visual, linguistic, and physiological information into interactive robots and emotion-aware AI systems could enable more human-like emotion understanding and context-sensitive responses," says Dr. Hieida.

Moreover, because the model can infer emotional states that people may struggle to express in words, it could be particularly useful in mental health support, healthcare monitoring, and assistive technologies for conditions such as developmental disorders or dementia.

"This research has important implications for both society and industry, as it provides a computational framework that connects emotion theory with empirical validation, addressing the long-standing question of how emotions are formed," concludes Dr. Hieida.

Journal reference:

Tsurumaki, K., et al. (2025). Study of Emotion Concept Formation by Integrating Vision, Physiology, and Word Information Using Multilayered Multimodal Latent Dirichlet Allocation. IEEE Transactions on Affective Computing. doi: 10.1109/TAFFC.2025.3585882. https://ieeexplore.ieee.org/document/11071374
