My research focuses on embodied AI, visuotactile sensing, and dexterous robotic manipulation. I aim to build full-stack intelligent robotic systems, from flexible sensor hardware to learning-based control policies.
Integrated Electronic Skin for Dexterous Robotic Hands
[GitHub]
Yunqi Zhong, supervised by
Prof. Jiankun Wang.
The laboratory is led by Academician
Max Q.-H. Meng.
Shenzhen Key Laboratory of Robotics Perception and Intelligence,
SUSTech
2025.09–present
Focuses on real-time visuotactile perception for dexterous robotic hands.
Key contributions: assembled and optimized a 17-DOF dexterous robotic hand;
built the iontronic electronic-skin pipeline, from flexible sensor fabrication to parallel FDM/CDM readout circuits;
contributed to a project approved under the Guangdong Climbing Program.
DOGlove: Dexterous Hand Teleoperation for RL-Based Assembly Tasks
Yunqi Zhong, supervised by
A/Prof. Chao Zhao
School of Artificial Intelligence,
Jilin University
2026.01–2026.03
Replicated a low-cost, high-performance teleoperation system for dexterous hands
to serve as a data-collection and evaluation testbed for reinforcement learning
and vision-based policy learning. Downstream task: an autonomous LEGO assembly demonstration.
Key contributions: procured components and assembled the full hardware; refined the 3D-printed models;
implemented real-time hand pose tracking and force feedback.