Zhikai Zhang

I am a first-year Ph.D. student at the Institute for Interdisciplinary Information Sciences (IIIS), Tsinghua University, advised by Prof. Li Yi.

Currently, I do research on humanoid robot learning at GALBOT. We are actively looking for interns and full-time researchers to work on humanoid research with us.

Email  /  Scholar  /  Github

profile photo

News

  • [2025/06] We open-sourced OpenWBT, a cross-embodiment, easy-to-deploy, VR-based humanoid whole-body teleoperation system built on our recent work R2S2. Try it!

Research

Unleashing Humanoid Reaching Potential via Real-world-Ready Skill Space
Zhikai Zhang*, Chao Chen*, Han Xue*, Jilong Wang, Sikai Liang, Zongzhang Zhang, He Wang, Li Yi
arXiv, 2025
project page / arXiv / code (OpenWBT)

We present the Real-world-Ready Skill Space (R2S2), which encompasses and encodes a variety of real-world-ready motor skills for humanoid reaching.

FreeMotion: MoCap-Free Human Motion Synthesis with Multimodal Large Language Models
Zhikai Zhang, Yitang Li, Haofeng Huang, Mingxian Lin, Li Yi
ECCV, 2024
project page / arXiv

Our method explores open-set human motion synthesis using natural language instructions without any motion data.

FreePoint: Unsupervised Point Cloud Instance Segmentation
Zhikai Zhang, Jian Ding, Li Jiang, Dengxin Dai, Guisong Xia
CVPR, 2024
paper / arXiv

Our method segments object instances in point clouds without any manual annotations.

Thanks for watching — whether you're human or robot :)
Template stolen from Jon Barron.
Last updated: May 2025