Researcher Profile
Krzysztof Maziarz
Sparsely-gated mixture-of-experts
Co-author, Sparsely-Gated MoE
Co-authored the sparsely-gated MoE layer paper: a foundational conditional-computation design.
About This Page
This profile is meant to help you get oriented quickly: why this researcher matters, what to read first, and where to explore next.
Last updated
March 20, 2026
Known For
The ideas, systems, and research directions that make this person worth knowing.
01  Sparsely-gated mixture-of-experts
02  Outrageously Large Neural Networks: The Sparsely-Gated Mixture-of-Experts Layer
03  MoE
04  Conditional computation (see the sketch after this list)
05  Scaling
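For orientation, here is a minimal sketch of the core idea behind the sparsely-gated MoE layer: a learned gate keeps only the top-k experts per token, so most experts never run for a given input, which is the conditional-computation part. The function names, shapes, and plain-NumPy framing below are illustrative assumptions, not the paper's implementation; the paper's tunable noise term and load-balancing losses are omitted.

```python
import numpy as np

def top_k_gating(x, w_gate, k=2):
    # Gate logits for every (token, expert) pair.
    logits = x @ w_gate                                   # [tokens, num_experts]
    # Keep only the k largest logits per token; everything else gets -inf.
    topk_idx = np.argsort(logits, axis=-1)[:, -k:]
    sparse = np.full_like(logits, -np.inf)
    np.put_along_axis(sparse, topk_idx,
                      np.take_along_axis(logits, topk_idx, axis=-1), axis=-1)
    # Softmax over the surviving logits; masked entries become exactly 0.
    e = np.exp(sparse - sparse.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)              # sparse gate weights

def moe_layer(x, w_gate, experts, k=2):
    # Only experts with a non-zero gate are evaluated for a given token.
    gates = top_k_gating(x, w_gate, k)
    out = np.zeros_like(x)
    for i, expert in enumerate(experts):
        routed = gates[:, i] > 0                          # tokens sent to expert i
        if routed.any():
            out[routed] += gates[routed, i:i + 1] * expert(x[routed])
    return out

# Toy usage: 8 tokens of width 16 routed across 4 linear "experts".
rng = np.random.default_rng(0)
x = rng.standard_normal((8, 16))
w_gate = rng.standard_normal((16, 4))
experts = [(lambda W: (lambda h: h @ W))(rng.standard_normal((16, 16))) for _ in range(4)]
y = moe_layer(x, w_gate, experts, k=2)                    # same shape as x
```

This only shows the routing and sparse combination; in the paper the gate logits also receive learned Gaussian noise, and auxiliary losses keep expert load balanced.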
Start Here
Canonical papers, project pages, or repositories that anchor this profile.
Outrageously Large Neural Networks: The Sparsely-Gated Mixture-of-Experts Layer (Shazeer et al., ICLR 2017).
Related Researchers
People worth exploring next because they share topics, labs, or source material with this profile.
One of the most important architecture-level thinkers in modern AI, with influence spanning Transformers, efficient scaling, and mixture-of-experts systems.
A foundational figure in modern sequence modeling whose work on the Transformer changed the technical direction of language and multimodal systems.
A foundational Transformer researcher whose work connects the original architecture shift to later efforts on efficiency, scaling, and sequence-modeling infrastructure.
One of the most important optimization researchers of the deep-learning era, especially for work that became default infrastructure across nearly every modern training stack.
A foundational computer-vision researcher whose work on representations and architectures still shapes modern pretraining and perception systems.