😠💢😵💫Tired of endless data collection & fine-tuning every time you try out a VLA?
Meet RDT2, the first foundation model that deploys zero-shot on any robot arm, handling unseen scenes, objects & instructions.
No collection. No tuning. Just plug and play🚀
Witness a clear sign of embodied superintelligence
- 7B one-step diffusion → 23 Hz inference⚡
- Re-designed UMI (by @chichengcc @SongShuran) and manufactured 100 portable devices
- Trained on 10K hours of UMI data collected in 100 real homes
- Zero-shot skills: pick, place, press, wipe… all open-vocabulary
- Demos: blocks arrows flying at 30 m/s within 500 ms🛡️; the first end-to-end model to play ping-pong 🏓; extinguishes burning incense with a quick shake🥢
Fully open source:
github.com/thu-ml/RDT2
Project page:
rdt-robotics.github.io/rdt2/
Thanks to our awesome collaborators
@bang_guo96535 @D0g4M74794 @EthanNg51931527