Themis Qi

Chinese robotics start-up AgiBot has unveiled an artificial intelligence large language model that enables robots to acquire skills from videos, the latest advance in embodied AI.

The GenieOperator-1 model studies videos of human hand movements and then teaches robots to perform household chores, including pouring a glass of water and handing it to a person, as well as loading coffee beans into a machine to brew a cup of coffee.
If the robot places the cup outside the plate, the LLM identifies the error and corrects its movement, a demo shows.
AgiBot said the LLM makes it easier to achieve embodied AI and has already been deployed on some of its robots.
Based in Shanghai, AgiBot was founded by Peng Zhihui in 2023 and has manufactured 1,000 robots since launching its first intelligent robot, YuanZheng A1, in August that year.
Peng previously worked as a chip developer at Huawei from 2020, taking home an annual pay packet of 2.01 million yuan (HK$2.15 million).

Embodied AI is among the cutting-edge technologies mentioned in Premier Li Qiang's work report last week, and investment in the sector has soared since Unitree Robotics' robots danced at a Lunar New Year gala organised by state broadcaster CMG at the end of January.

An AI LLM teaches AgiBot’s robot how to make coffee. AGIBOT