Researcher Profile
Lukasz Kaiser
Transformers
Deep learning researcher at OpenAI and transformer co-author
One of the clearest people to study if you want the throughline from early neural sequence models to transformers, efficient long-context variants, and modern reasoning systems.
Organizations
OpenAI
About This Page
This profile is meant to help you get oriented quickly: why this researcher matters, what to read first, and where to explore next.
Last reviewed
March 18, 2026
Known For
The ideas, systems, and research directions that make this person worth knowing.
01
Attention Is All You Need
02
Neural sequence-model research before and after transformers
03
Reasoning-model work at OpenAI
04
Transformers
05
Code LLMs
Start Here
Canonical papers, project pages, or repositories that anchor this profile.
Signature Works
Additional papers, projects, or repositories that help flesh out the profile.
Supporting Sources
Additional links that help verify the claims in this profile.
Related Researchers
People worth exploring next because they share topics, labs, or source material with this profile.
A foundational figure in modern sequence modeling whose work on the Transformer changed the technical direction of language and multimodal systems.
A foundational transformer researcher whose work connects the original architecture shift to later efforts on efficiency, scaling, and sequence-modeling infrastructure.
One of the most important architecture-level thinkers in modern AI, with influence spanning Transformers, efficient scaling, and mixture-of-experts systems.
A useful person to follow for the evaluation layer of open models, especially where benchmark infrastructure and RLHF tooling become reusable community assets rather than one-off lab code.
A foundational transformer co-author who is now worth following for a very different reason: he is one of the few people trying to build a serious frontier lab around alternatives to the default scaling path.