Xi’an Jiaotong University Embodied Intelligence Robotics Institute Unveils Humanoid Robot Matrix

Science and Technology Think Tank

On March 29, the Embodied Intelligence Robotics Institute, co-founded by Xi’an Jiaotong University and Youai Zhihui, a leading domestic mobile-manipulation robot company, unveiled the team’s humanoid robot matrix and showcased one of its wheeled humanoid robots, “Xunxiao”. The humanoid robot series announced by the institute comprises seven products in bipedal, wheeled, quadrupedal, and tracked forms, each designed for a different application scenario. Among them, “Xunxiao” is tailored for large, complex indoor environments and is characterized by long endurance and high flexibility. Building on Youai Zhihui’s experience in the semiconductor and energy sectors, it has already been applied in settings such as semiconductor Sub-FAB operations and power distribution room operations.

To adapt to these varied scenarios, the joint Youai Zhihui and Xi’an Jiaotong University Embodied Intelligence Robotics Institute team has built a “one brain, multiple forms” embodied intelligence large model. It uses a hybrid architecture that pairs a multimodal general-purpose base model with embodiment-specific edge embodied models, and preliminary validation in application scenarios has been completed.

The base model combines a VLM (vision-language model) with an MoE (Mixture of Experts) architecture. Trained on multimodal pre-training datasets and the “Data Ocean” corpus of expert-level real-world training data, it performs scene perception and general language understanding for complex tasks, translating intricate instructions into commands that the edge embodied models execute under high-frequency real-time control with online quality assessment. On top of the base model, the team has built an intelligent-agent aggregation platform that integrates industry knowledge graphs and targets general scenarios such as intelligent industrial manufacturing, smart inspection, and daily-life services.

Within the “one brain, multiple forms” edge embodied model architecture, the team combines key technologies such as reinforcement learning, imitation learning, and deep learning to create MAIC, a mobile artificial intelligence understanding system that serves as the embodied “brain” for multimodal interaction, visual semantic perception, and behavioral reasoning and decision-making. MAIC drives the embodied “limbs” through end-to-end control and, drawing on multimodal sensor fusion, develops polymorphic locomotion capabilities tailored to different scenarios. The embodied large model thus links the full chain from professional data to reasoning and decision-making, real-time control, scene feedback, and continuous learning, promoting the large-scale application of embodied intelligence.
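The architecture described above amounts to a two-tier loop: a multimodal base model that plans infrequently from complex instructions, and embodiment-specific edge models that control at high frequency. The Python sketch below is a rough illustration of one way such a “one brain, multiple forms” split could be organized; every class name, method signature, and the control loop itself are assumptions made for illustration and do not reflect the institute’s actual implementation.

```python
# Hypothetical sketch of a "one brain, multiple forms" hybrid architecture.
# All names and signatures are illustrative assumptions, not the institute's APIs.
from __future__ import annotations

from dataclasses import dataclass
from typing import Protocol


@dataclass
class SubTask:
    """A low-level goal the base model hands to an edge embodied model."""
    description: str
    embodiment: str  # e.g. "wheeled", "bipedal", "quadrupedal", "tracked"


class BaseModel:
    """Stand-in for the multimodal (VLM + MoE) base model: turns a complex
    instruction plus scene context into an ordered list of sub-tasks."""

    def plan(self, instruction: str, scene: dict) -> list[SubTask]:
        # A real system would query the large model here; this stub returns
        # the instruction unchanged as a single sub-task.
        return [SubTask(instruction, scene.get("embodiment", "wheeled"))]


class EdgeEmbodiedModel(Protocol):
    """Shared interface for embodiment-specific edge models ("multiple forms")."""

    def step(self, task: SubTask, observation: dict) -> dict: ...


class WheeledEdgeModel:
    """Illustrative high-frequency controller for a wheeled humanoid."""

    def step(self, task: SubTask, observation: dict) -> dict:
        # An end-to-end policy would map fused sensor readings to actuator commands.
        return {"left_wheel": 0.0, "right_wheel": 0.0, "arm_joints": []}


def run_episode(base: BaseModel, edge: EdgeEmbodiedModel,
                instruction: str, scene: dict, steps: int = 100) -> None:
    """Reasoning and decision-making (base) -> real-time control (edge) ->
    scene feedback, looping so execution can be assessed online."""
    for task in base.plan(instruction, scene):
        observation = dict(scene)  # placeholder for fused multimodal sensor data
        for _ in range(steps):
            action = edge.step(task, observation)
            # Apply the action to the robot or simulator, then read back new
            # observations; here we only record the last action.
            observation["last_action"] = action


if __name__ == "__main__":
    run_episode(BaseModel(), WheeledEdgeModel(),
                "inspect the Sub-FAB pipeline", {"embodiment": "wheeled"})
```

The point the sketch mirrors is that every embodiment exposes the same `step` interface, so the same base model (“one brain”) can dispatch sub-tasks to bipedal, wheeled, quadrupedal, or tracked edge models (“multiple forms”) without changing its planning logic.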

Source: Contributed by the Strategic Center

Recommended Reading

[Technology Reference] U.S. Department of Energy: Quantum Information Science Application Roadmap

New ultra-low-temperature, high-specific-energy lithium battery from the Dalian Institute of Chemical Physics, Chinese Academy of Sciences, successfully tested on a hexacopter drone in extreme cold at -36°C

Chalmers University of Technology in Sweden achieves breakthrough in ultra-broadband optical amplification technology
