Researcher Profile
Yuxiong He
Memory-efficient distributed training (ZeRO)
Co-author, ZeRO
Co-authored ZeRO: foundational memory optimizations for training very large models.
Topics
Systems
Training
About This Page
This profile is meant to help you get oriented quickly: why this researcher matters, what to read first, and where to explore next.
Last updated: March 20, 2026
Known For
The ideas, systems, and research directions that make this person worth knowing.
01 Memory-efficient distributed training (ZeRO); see the sketch below.
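To make the ZeRO idea concrete, here is a minimal sketch built on PyTorch's ZeroRedundancyOptimizer, which provides ZeRO stage-1 style sharding of optimizer states across data-parallel ranks. The toy model, gloo backend, and hyperparameters are illustrative assumptions, not details from this profile or the paper.

```python
import torch
import torch.distributed as dist
from torch.distributed.optim import ZeroRedundancyOptimizer
from torch.nn.parallel import DistributedDataParallel as DDP

def train_step(rank: int, world_size: int) -> None:
    # Toy CPU process group; assumes MASTER_ADDR/MASTER_PORT are set.
    # Real runs would use the NCCL backend on GPUs.
    dist.init_process_group("gloo", rank=rank, world_size=world_size)

    model = DDP(torch.nn.Linear(1024, 1024))  # stand-in for a large transformer

    # Core ZeRO saving: each rank stores only a 1/world_size shard of the
    # Adam optimizer states (momentum, variance) instead of a full replica.
    optimizer = ZeroRedundancyOptimizer(
        model.parameters(),
        optimizer_class=torch.optim.Adam,
        lr=1e-3,
    )

    x = torch.randn(32, 1024)
    loss = model(x).pow(2).mean()
    loss.backward()
    optimizer.step()  # updates the local shard, then syncs updated params to all ranks
```

DeepSpeed's ZeRO stages 2 and 3 go further, also partitioning gradients and the parameters themselves across data-parallel ranks.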
Start Here
Canonical papers, project pages, or repositories that anchor this profile.
ZeRO: Memory Optimizations Toward Training Trillion Parameter Models
Related Researchers
People worth exploring next because they share topics, labs, or source material with this profile.
Co-authored ZeRO: foundational memory optimizations for training very large models.
Co-authored Megatron-LM: a core reference for scaling transformer training via model parallelism (see the sketch below).
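For the Megatron-LM entry above, here is a minimal single-process sketch of the tensor (model) parallelism idea: split the output dimension of a linear layer across ranks (Megatron's "column-parallel" linear), with each rank computing a slice of the output. The function name, shapes, and the simulated world_size are illustrative assumptions, not Megatron-LM's actual API.

```python
import torch

def column_parallel_linear(x: torch.Tensor, full_weight: torch.Tensor,
                           world_size: int) -> torch.Tensor:
    # Partition the output features across ranks; each shard is computed
    # independently and the slices are concatenated (an all-gather in a
    # real multi-GPU setup).
    shards = full_weight.chunk(world_size, dim=0)
    partial_outputs = [x @ w.t() for w in shards]  # one slice per "rank"
    return torch.cat(partial_outputs, dim=-1)

x = torch.randn(8, 512)       # batch of activations
w = torch.randn(2048, 512)    # full weight; in practice no rank holds all of it
y = column_parallel_linear(x, w, world_size=4)
assert torch.allclose(y, x @ w.t(), atol=1e-5)  # matches the unsharded layer
```

In the real system each shard lives on its own GPU, and Megatron-LM pairs a column-parallel first GEMM with a row-parallel second GEMM so the transformer MLP needs just one all-reduce in the forward pass.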