CLOT: Closed-Loop Global Motion Tracking for Whole-Body Humanoid Teleoperation

Tengjie Zhu1,2,*, Guanyu Cai1,*, Zhaohui Yang1,*, Guanzhu Ren1, Haohui Xie1, Junsong Wu1, ZiRui Wang2, Jingbo Wang2, Xiaokang Yang1, Yao Mu1,2,†, Yichao Yan1,†
* Equal Contribution  † Corresponding Author
1MoE Key Lab of Artificial Intelligence, AI Institute, Shanghai Jiao Tong University   2Shanghai AI Laboratory


Pipeline


News

  • [2026-02] We release the code and paper for CLOT.

About

This is the official implementation of the paper CLOT: Closed-Loop Global Motion Tracking for Whole-Body Humanoid Teleoperation.

Our paper presents a general-purpose motion tracking strategy built on a global closed-loop framework, together with a large-scale human motion dataset.

This repository includes:

  • Multi-simulator support
    • Support for multiple simulators, including IsaacGym, IsaacSim, and MjLab (with MjLab as the primary simulator).
  • Efficient RL training
    • Support multi-GPU parallel training for large-scale experiments.
  • AMP rewards
    • Implementation of AMP discriminator rewards for motion imitation policies.
  • Transformer backbone
    • Our project supports both MLP and Transformer networks as the policy backbone.
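The AMP discriminator reward mentioned above can be sketched as follows. This is the least-squares-style reward from the original AMP paper (Peng et al., 2021), r = max(0, 1 - 0.25 * (d - 1)^2); the exact formulation used in this repository may differ.

```python
import numpy as np

def amp_style_reward(d):
    """Least-squares AMP-style reward from discriminator output d
    (Peng et al., 2021): r = max(0, 1 - 0.25 * (d - 1)^2).
    The reward is 1 when the discriminator scores the transition as
    'real' (d = 1) and decays to 0 as d moves away from 1."""
    return np.maximum(0.0, 1.0 - 0.25 * np.square(d - 1.0))

print(amp_style_reward(np.array([1.0, 0.0, -1.0])))  # rewards: 1.0, 0.75, 0.0
```

In practice this reward is computed per transition from the discriminator's output and blended with the task (tracking) reward.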

Below are the installation and usage instructions for the code in the mjlab environment.

Install

We provide a lightweight environment setup. We tested the code in the following environment:

  • OS: Ubuntu 22.04
  • GPU: NVIDIA RTX 4090, Driver Version: 575.64.03
conda create -n clot python=3.11
conda activate clot

pip install warp-lang --extra-index-url https://pypi.nvidia.com/
pip install "mujoco-warp @ git+https://github.com/google-deepmind/mujoco_warp@502556df5e44d79d6bdaa64361669602b5a206cf"

git clone https://github.com/zhutengjie/CLOT.git
cd CLOT

pip install -e .
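As a quick sanity check after installation, you can confirm the interpreter matches the Python 3.11 environment created above; a minimal sketch:

```python
import sys

REQUIRED = (3, 11)  # matches the conda env created above

def version_ok(info=None, required=REQUIRED):
    """Return True if the interpreter meets the minimum (major, minor)."""
    info = sys.version_info if info is None else info
    return tuple(info[:2]) >= tuple(required)

if __name__ == "__main__":
    status = "OK" if version_ok() else f"needs >= {REQUIRED[0]}.{REQUIRED[1]}"
    print(f"Python {sys.version.split()[0]}: {status}")
```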

Data

We have currently uploaded about 10 hours of data to Hugging Face, including BVH files (with the coordinate system converted to Z-up and X-forward), as well as motion data retargeted to Adam Pro and G1 (following the ASAP format). We also provide the corresponding checkpoints for Adam Pro and G1 in this repository.

git lfs install
git clone https://huggingface.co/datasets/Zhutengjie/human_motion
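The Z-up, X-forward conversion mentioned above follows a common convention: a Y-up source frame is rotated so that the old Y axis becomes Z while X stays forward. A minimal sketch (the dataset's actual conversion may differ in details such as the source frame's handedness):

```python
import numpy as np

# Rotation taking a Y-up frame to a Z-up frame while keeping X forward:
# new_x = old_x, new_y = -old_z, new_z = old_y.
Y_UP_TO_Z_UP = np.array([
    [1.0, 0.0,  0.0],
    [0.0, 0.0, -1.0],
    [0.0, 1.0,  0.0],
])

def yup_to_zup(points):
    """Rotate an (N, 3) array of points from a Y-up to a Z-up frame."""
    return np.asarray(points) @ Y_UP_TO_Z_UP.T

up = np.array([[0.0, 1.0, 0.0]])  # "up" in the Y-up source frame
print(yup_to_zup(up))             # maps to +Z in the target frame
```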

If you want to visualize the data results:

python visualize.py human_motion/adam_data_30fps_cont_mask.pkl humanoidverse/data/robots/adam_pro/adam_pro.xml
# or
python visualize.py human_motion/g1_data_50fps_cont_mask.pkl humanoidverse/data/robots/g1/g1_23dof_lock_wrist.xml
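Before visualizing, you may want to peek at a motion file's structure. The .pkl schema is not documented here, so the helper below (`inspect_motion_pkl`, a hypothetical utility, not part of the repo) simply reports whatever top-level keys are present:

```python
import pickle

def inspect_motion_pkl(path):
    """Load a motion .pkl and print its top-level structure.
    The schema is dataset-specific; this only reports the keys it finds."""
    with open(path, "rb") as f:
        data = pickle.load(f)
    if isinstance(data, dict):
        for key, value in data.items():
            shape = getattr(value, "shape", None)
            extra = f", shape={shape}" if shape is not None else ""
            print(f"{key}: {type(value).__name__}{extra}")
    else:
        print(type(data).__name__)
    return data

# Example (requires the downloaded dataset):
# inspect_motion_pkl("human_motion/adam_data_30fps_cont_mask.pkl")
```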

Test

If you want to test the checkpoints in the mjlab environment:

# for Adam Pro
python humanoidverse/eval_agent.py +checkpoint=human_motion/adam_result/adam.pt
# for G1
python humanoidverse/eval_agent.py +checkpoint=human_motion/G1_result/g1.pt

If you want to deploy in the MuJoCo environment:

# for Adam Pro
python humanoidverse/urci.py +opt=record +simulator=mujoco +checkpoint=human_motion/adam_result/exported/adam.onnx

# for G1
python humanoidverse/urci.py +opt=record +simulator=mujoco +checkpoint=human_motion/G1_result/exported/g1.onnx

Train

By default, training is conducted on 8 × 48GB RTX 4090 GPUs.

# for Adam Pro
sh train_adam_multi.sh

# for G1
sh train_g1_multi.sh

If you want to change the number of GPUs used for training, please modify ngpu in humanoidverse/config/base/fabric.yaml and nproc_per_node in the corresponding .sh script.

Model Setup

By default, the model is set up with a Transformer architecture and the AMP reward. If you want to change the architecture or disable the AMP reward, please modify +exp in the corresponding .sh script.

# for mlp + amp
+exp=motion_tracking_amp

# for mlp only
+exp=motion_tracking

# for transformer only
+exp=motion_tracking_transformer
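For reference, the three overrides above can be summarized as follows. This table is illustrative only (derived from the commands in this README, not from the config files) and is not read by the code; the default Transformer-with-AMP setup needs no override.

```python
# Illustrative summary of the +exp overrides listed above.
EXP_CONFIGS = {
    "motion_tracking":             {"backbone": "mlp",         "amp_reward": False},
    "motion_tracking_amp":         {"backbone": "mlp",         "amp_reward": True},
    "motion_tracking_transformer": {"backbone": "transformer", "amp_reward": False},
}

def describe(exp):
    cfg = EXP_CONFIGS[exp]
    amp = "on" if cfg["amp_reward"] else "off"
    return f"{cfg['backbone']} backbone, AMP reward {amp}"

print(describe("motion_tracking_amp"))  # mlp backbone, AMP reward on
```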

Citation

If you find our work helpful, please cite:

@article{zhu2026clot,
  title={CLOT: Closed-Loop Global Motion Tracking for Whole-Body Humanoid Teleoperation},
  author={Zhu, Tengjie and Cai, Guanyu and Yang, Zhaohui and Ren, Guanzhu and Xie, Haohui and Wang, ZiRui and Wu, Junsong and Wang, Jingbo and Yang, Xiaokang and Mu, Yao and others},
  journal={arXiv preprint arXiv:2602.15060},
  year={2026}
}

License

This codebase is released under the CC BY-NC 4.0 license. You may not use the material for commercial purposes, e.g., making demos to advertise commercial products.

Acknowledgements

Our code builds upon and references the following excellent works. We sincerely thank the authors for their open-source contributions:

We would like to sincerely thank PNDbotics for providing the robotic platform and comprehensive support related to the robot hardware. We also thank Baidu for providing the GPU resources.

Contact

Feel free to open an issue or discussion if you encounter any problems or have questions about this project.

For collaborations, feedback, or further inquiries, please reach out to:

We welcome contributions and are happy to support the community in building upon this work!
