
Emotion Expression in Robots Through Artificial Consciousness

How can robots effectively express emotions? This paper by Atsushi Ogiso and colleagues presents a system in which artificial consciousness enables a humanoid robot to display emotions through realistic facial expressions, enhancing human-robot communication.

Expression of Emotion in Robots Using a Flow of Artificial Consciousness, authored by Atsushi Ogiso, Shinya Kurokawa, Michio Yamanaka, Yukinobu Imai, and Junichi Takeno, explores the interplay between emotion, consciousness, and expression. Within this artificial consciousness framework, the robot derives Kansei (affective) information from data gathered on the Internet and uses it to generate emotional expressions.


Key Highlights

  • Artificial Consciousness: Mimics human consciousness to process emotion and expression using association and consciousness networks.
  • Kansei Information: Defines emotional responses (e.g., pleasant and unpleasant) based on data extracted from Internet sources.
  • Facial Expression System: Uses servo motors and artificial skin to replicate human-like expressions for emotions such as joy, anger, and sadness.
  • Dynamic Consciousness Flow: Demonstrates shifts in the robot’s emotional state as attention moves along associative links in its consciousness network (a minimal sketch follows this list).
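
To make the flow idea concrete, the sketch below shows how a wandering focus of attention over a Kansei-labeled association network could drive expression changes. The network, its valence labels, and the random associative step are assumptions invented for this example, not the authors' data or update rule.

```python
import random

# Illustrative sketch only: the concepts, their Kansei valence labels, and
# the random associative step are assumptions for this example; they are
# not taken from the paper.

ASSOCIATIONS = {
    "sunshine": ["picnic", "heat"],
    "picnic": ["friends", "sunshine"],
    "heat": ["fatigue", "sunshine"],
    "friends": ["picnic"],
    "fatigue": ["heat"],
}

# Kansei information: +1 = pleasant, -1 = unpleasant (assumed labels).
KANSEI_VALENCE = {
    "sunshine": +1, "picnic": +1, "friends": +1,
    "heat": -1, "fatigue": -1,
}

# Map valence to the facial expression the robot would actuate.
EXPRESSION = {+1: "joy", -1: "sadness"}

def consciousness_flow(start: str, steps: int) -> None:
    """Let attention wander over the association network, printing the
    expression implied by each attended concept's Kansei valence."""
    node = start
    for _ in range(steps):
        expression = EXPRESSION[KANSEI_VALENCE[node]]
        print(f"attending to {node!r} -> expressing {expression}")
        node = random.choice(ASSOCIATIONS[node])  # associative shift of focus

consciousness_flow("sunshine", steps=5)
```

The point of the sketch is that the displayed expression is a by-product of where attention currently sits in the network, not a separately commanded output, which is the sense in which the paper's emotional state "flows."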

Connection to ACM

The Artificial Consciousness Module (ACM) aligns with these principles through:

  • Emotion and Expression Integration: ACM can adopt similar Kansei-based frameworks for emotional learning and expression in virtual or physical agents (a rough sketch follows this list).
  • Dynamic Cognitive States: Insights into the flow of consciousness and emotion-driven responses can enhance ACM’s adaptability in human-centric interactions.
  • Realistic Simulations: Leveraging such frameworks allows ACM to create lifelike interactions in simulated or real-world environments.
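
As a rough illustration of the first point, the sketch below shows one way an agent could fold per-stimulus Kansei valence signals into a persistent emotional state. The class name, smoothing rule, and thresholds are assumptions made for illustration; none of this comes from the ACM codebase or the paper.

```python
# Hypothetical sketch: blend incoming Kansei valence signals into a
# slowly drifting mood, then pick an expression from the current mood.

class EmotionalState:
    def __init__(self, smoothing: float = 0.3):
        self.valence = 0.0          # running mood in [-1, 1]
        self.smoothing = smoothing  # how quickly new signals shift the mood

    def update(self, kansei_valence: float) -> str:
        """Blend one valence signal into the mood; return an expression label."""
        self.valence += self.smoothing * (kansei_valence - self.valence)
        if self.valence > 0.2:
            return "joy"
        if self.valence < -0.2:
            return "sadness"
        return "neutral"

agent = EmotionalState()
for signal in (+1.0, +1.0, -1.0, -1.0, -1.0):
    print(agent.update(signal))
```

With these particular inputs the printed sequence moves from joy through neutral to sadness rather than flipping instantly, mirroring the gradual shifts of emotional state the paper describes.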

For a detailed exploration of the methods and results, see the full paper.