iCub3 Avatar System: Bridging the Gap from Research Labs to Real-World Integration for Humanoid Robots


Over the past four years, the Artificial and Mechanical Intelligence (AMI) lab at the Istituto Italiano di Tecnologia (IIT) in Genoa, Italy, has developed avatar technologies exemplified by the iCub3 system. The system has undergone extensive testing in real-world scenarios, demonstrating its versatility and potential applications.

The iCub3 system has enabled human operators to remotely visit locations up to 300 km away, perform at public events and television appearances, and compete in challenges such as the ANA Avatar XPRIZE.

In a research paper published in Science Robotics, the AMI lab describes the challenges encountered and the solutions devised during the development of the iCub3 avatar system. Their approach underscores the importance of moving beyond laboratory conditions to address real-world variability, a necessary step toward integrating robust humanoid robotic platforms into economic and productive sectors.

The evolution of the iCub3 system into the ergoCub robot reflects a strategic shift toward greater adaptability and acceptance in work environments. Led by researcher Daniele Pucci and a team of approximately 50 researchers, the AMI lab has refined the iCub3 avatar system, which comprises the iCub3 robot and a set of wearable technologies, known as iFeel, designed to track the operator's body motions.

Central to the iCub3 avatar system are its embodiment features: locomotion, manipulation, voice, and facial expressions, augmented by visual, auditory, haptic, weight, and touch feedback to the operator. Developed in collaboration with the Italian National Institute for Insurance against Accidents at Work (INAIL), the system tightly couples the human operator with the humanoid robot.
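The core of such an avatar pipeline is retargeting: tracked operator motions must be mapped onto the robot's joints while respecting its mechanical limits. The sketch below illustrates that idea in minimal form; the joint names, limit values, and function names are hypothetical and do not reflect the actual iCub3 or iFeel interfaces.

```python
# Hypothetical joint names and limits, in radians; illustrative values only.
ROBOT_JOINT_LIMITS = {
    "shoulder_pitch": (-1.6, 1.6),
    "elbow": (0.0, 2.0),
    "wrist_yaw": (-0.8, 0.8),
}

def clamp(value, low, high):
    """Keep a commanded angle inside the robot's mechanical limits."""
    return max(low, min(high, value))

def retarget(operator_pose):
    """Map operator joint angles (from a wearable tracker) to safe robot commands.

    operator_pose: dict of joint name -> angle in radians.
    Returns clamped robot joint commands; joints the robot lacks are dropped.
    """
    commands = {}
    for joint, angle in operator_pose.items():
        if joint in ROBOT_JOINT_LIMITS:
            low, high = ROBOT_JOINT_LIMITS[joint]
            commands[joint] = clamp(angle, low, high)
    return commands

# Example: an elbow angle beyond the limit is clamped,
# and a tracked joint the robot does not have is ignored.
pose = {"elbow": 2.5, "shoulder_pitch": 0.3, "head_roll": 0.1}
print(retarget(pose))  # {'elbow': 2.0, 'shoulder_pitch': 0.3}
```

In a real teleoperation system this mapping runs in a closed loop at high frequency, with the robot's sensor readings streamed back to drive the operator's haptic and visual feedback.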

The system’s real-world applicability was validated through testing in diverse scenarios. In a landmark demonstration in November 2021, a human operator in Genoa remotely controlled the avatar at the Biennale di Venezia in Venice, navigating the art exhibition with precision and caution. The iFeel suit and custom haptic devices provided immersive control and sensory feedback, while a robust communication infrastructure kept the interaction responsive.

Subsequent tests at the We Make Future Show and the ANA Avatar XPRIZE competition further demonstrated the system's adaptability and performance under demanding conditions. From navigating crowded venues to executing complex tasks under time constraints, the iCub3 robot showed its agility and versatility, setting the stage for future applications in collaborative work environments.

These efforts culminated in the development of the ergoCub robot, designed for collaborative tasks in industrial and healthcare settings. Drawing on lessons from real-world deployments, the ergoCub robot prioritizes safety, efficiency, and user acceptance, with the aim of making human-robot collaboration practical in everyday work.