- Robotics machine-learning company Generalist has released GEN-1, a next-generation physical AI system that it claims achieves "production-grade success rates" on physical tasks that previously demanded human hand dexterity and muscle memory. The system builds on its predecessor, GEN-0, which was presented in November 2025 as a proof of concept that scaling laws hold for robot training. GEN-1 was trained on more than 500,000 hours, amounting to petabytes, of human manipulation data collected with wearable "data hands" devices, markedly improving its grasp of object manipulation. It reportedly reaches 99% success on repetitive fine-motor tasks such as folding boxes, packing phones, and servicing robot vacuums, runs roughly three times faster than GEN-0, and needs only about one hour of adaptation to a specific robot embodiment before deployment. The system can also improvise when disrupted and draw on experience across domains to solve new problems.
GEN-1 achieves production-grade success rates on physical tasks
Trained on petabytes of human manipulation data
99% task success at roughly triple GEN-0's speed
- Conventional complex robotic systems typically rely on pre-programmed motions or narrowly task-specific training, leaving them inflexible in unexpected situations and poor at generalizing. GEN-1's breakthrough is that it not only executes tasks such as placing banknotes into a wallet, folding laundry, or sorting car parts with high precision, but also adjusts its strategy on its own when disrupted, showing human-like problem solving. This ability to "connect different ideas to tackle new problems" marks physical AI's evolution from fixed pipelines toward adaptive intelligence. Although current results focus on structured tasks in industrial and household settings, the system's rapid adaptation (roughly one hour of transfer learning) points to a possible path for deploying robots in unstructured environments.
Improvises adjustments under disruption
Transfers knowledge across tasks
Adapts quickly to new robot platforms
- Robotic machine learning company Generalist has introduced GEN-1, a physical AI system designed to achieve production-level performance across a wide range of manual tasks previously requiring human dexterity. The system demonstrates high success rates in delicate, repetitive operations such as folding boxes, packing phones, and servicing robot vacuums, reaching 99% accuracy at approximately three times the speed of its predecessor, GEN-0. GEN-1 builds on scaling principles similar to those used in large language models, but addresses the unique challenge of limited physical training data by leveraging “data hands”—wearable pincers that record human micro-movements and visual input during manual tasks. Generalist reports collecting over 500,000 hours and petabytes of physical interaction data to train the model. The system adapts to specific robotic platforms in about one hour, enabling rapid deployment. Notably, GEN-1 can recover from disruptions by improvising new actions and synthesizing solutions from diverse experiences, suggesting improved generalization and adaptability in unstructured environments. This advancement signals progress toward autonomous robots capable of handling complex, real-world tasks with minimal reprogramming.
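The "500,000 hours and petabytes" figures are internally consistent, as a rough sanity check shows. A minimal sketch, assuming a hypothetical per-hour footprint of 2-10 GB for one "data hand" session (compressed video plus pose/force telemetry; these rates are my assumptions, not figures from Generalist):

```python
# Back-of-envelope check: does 500,000 hours of wearable-sensor
# capture plausibly reach petabyte scale?
HOURS = 500_000

# Assumed per-hour footprint of one recording session, in GB
# (compressed camera streams + motion/force telemetry).
GB_PER_HOUR_LOW = 2
GB_PER_HOUR_HIGH = 10

# Convert GB to PB using decimal units (1 PB = 1e6 GB).
low_pb = HOURS * GB_PER_HOUR_LOW / 1e6
high_pb = HOURS * GB_PER_HOUR_HIGH / 1e6

print(f"Estimated dataset size: {low_pb:.1f}-{high_pb:.1f} PB")
# → Estimated dataset size: 1.0-5.0 PB
```

Even at the conservative end of these assumed rates, the corpus lands at roughly a petabyte, matching the reported scale.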
Key Takeaways:
GEN-1 achieves 99% success in delicate mechanical tasks at triple GEN-0’s speed
System trained on 500,000+ hours of human movement data via wearable sensors
Adapts to new robotic platforms in about one hour with minimal fine-tuning
Demonstrates improvisation and problem-solving in response to physical disruptions
Source: Original Article