Researcher Profile
William Fedus
Co-author, Switch Transformers
Co-authored Switch Transformers: a core reference for practical MoE scaling.
Topics
MoE
Scaling
About This Page
This profile is meant to help you get oriented quickly: why this researcher matters, what to read first, and where to explore next.
Last updated: March 20, 2026
Known For
The ideas, systems, and research directions that make this person worth knowing.
01
Trillion-parameter scaling with sparsity (Switch Transformers)
Start Here
Canonical papers, project pages, or repositories that anchor this profile.
Switch Transformers: Scaling to Trillion Parameter Models with Simple and Efficient Sparsity
Related Researchers
People worth exploring next because they share topics, labs, or source material with this profile.
One of the most important architecture-level thinkers in modern AI, with influence spanning Transformers, efficient scaling, and mixture-of-experts systems.
Foundational not for any single public paper but for shaping the infrastructure, engineering culture, and systems thinking that make frontier-model research possible.
Co-authored GLaM: an influential MoE scaling reference in large language modeling.