
Researcher Profile

Mostofa Patwary

Model-parallel training at scale (Megatron-LM)

Co-author, Megatron-LM

Co-authored Megatron-LM, a core reference for scaling transformer training via model parallelism.
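For orientation, the core idea behind Megatron-LM's model parallelism is to shard individual weight matrices across devices so that each device computes a slice of a layer, with a small number of collectives stitching the slices back together. Below is a minimal NumPy sketch of the tensor-parallel MLP pattern described in the paper; the toy shapes, the two-way split, and the simulated "devices" are illustrative assumptions, not Megatron-LM's actual code.

```python
# A minimal sketch of Megatron-style tensor model parallelism for a
# transformer MLP block. Device shards are simulated with plain NumPy
# arrays; shapes and the two-way split are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_devices = 2          # simulated tensor-parallel world size
d_model, d_ff = 8, 32  # hidden size and MLP width (toy values)

x = rng.standard_normal((4, d_model))      # a batch of activations
W1 = rng.standard_normal((d_model, d_ff))  # first MLP weight
W2 = rng.standard_normal((d_ff, d_model))  # second MLP weight

def gelu(z):
    # tanh approximation of GeLU, common in transformer codebases
    return 0.5 * z * (1 + np.tanh(np.sqrt(2 / np.pi) * (z + 0.044715 * z**3)))

# Reference: the unsharded computation.
y_ref = gelu(x @ W1) @ W2

# Split W1 by columns and W2 by rows, so the elementwise GeLU can be
# applied locally on each device with no communication in between.
W1_shards = np.split(W1, n_devices, axis=1)  # column-parallel
W2_shards = np.split(W2, n_devices, axis=0)  # row-parallel

# Each device computes a partial output from its shard pair...
partials = [gelu(x @ w1) @ w2 for w1, w2 in zip(W1_shards, W2_shards)]

# ...and a single all-reduce (here, a sum) recovers the full result.
y = np.sum(partials, axis=0)

assert np.allclose(y, y_ref)
print("sharded MLP matches the unsharded reference")
```

The column-then-row ordering is the point of the design: because the nonlinearity acts elementwise on each device's local slice, the MLP block needs only one all-reduce in the forward pass rather than a synchronization after every matrix multiply.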

About This Page

This profile is meant to help you get oriented quickly: why this researcher matters, what to read first, and where to explore next.

Known For

The ideas, systems, and research directions that make this person worth knowing.

01

Model-parallel training at scale (Megatron-LM)

02

Systems

03

Training

Start Here

Canonical papers, project pages, or repositories that anchor this profile.

01

Megatron-LM: Training Multi-Billion Parameter Language Models Using Model Parallelism

02

Megatron-LM (GitHub)

Related Researchers

People worth exploring next because they share topics, labs, or source material with this profile.