
Researcher Profile


Andrew M. Dai

Gemini (multimodal foundation models)

Research scientist at Google Research

A good researcher to follow for the infrastructure side of frontier language models, especially mixture-of-experts scaling, instruction tuning, and the data systems that make very large models usable.

Organizations

Google

About This Page

This profile is meant to help you get oriented quickly: why this researcher matters, what to read first, and where to explore next.


Known For

The ideas, systems, and research directions that make this person worth knowing.

01

Mixture-of-experts language models

02

Instruction tuning and FLAN

03

Data and scaling work behind Gemini-era systems

04

Gemini (multimodal foundation models)

Start Here

Canonical papers, project pages, or repositories that anchor this profile.

Gemini: A Family of Highly Capable Multimodal Models

Signature Works

Additional papers, projects, or repositories that help flesh out the profile.

Supporting Sources

Additional links that help verify and flesh out this profile.

Related Researchers

People worth exploring next because they share topics, labs, or source material with this profile.

Shared canonical source

Quoc V. Le

Gemini (multimodal foundation models)

4 sources

One of the central Google researchers to follow for the arc from large-scale language modeling to instruction tuning, multilingual systems, and practical model scaling.

Start Here: Quoc V. Le

Shared canonical source

Radu Soricut

Gemini (multimodal foundation models)

4 sources

Important for understanding how multilingual NLP, translation, and multimodal reasoning converge inside production-scale frontier systems rather than remaining separate research tracks.

Start Here: Radu Soricut