
Noam Shazeer

Transformers, Mixture-of-Experts, scaling

One of the most important builders behind the Transformer architecture and Mixture-of-Experts scaling that power modern LLMs.

Highlights

Focus: Transformers, Mixture-of-Experts, scaling
Why it matters: One of the most important builders behind the Transformer architecture and Mixture-of-Experts scaling that power modern LLMs.

Research Areas

Gemini · Transformers · MoE · Scaling