Researcher Profile
Yann LeCun
Representation learning, AI systems
Chief AI Scientist at Meta and a professor at New York University
A foundational deep-learning figure whose influence spans convolutional networks, representation learning, and long-running arguments about what capable AI systems should optimize for next.
About This Page
This profile is meant to help you get oriented quickly: why this researcher matters, what to read first, and where to explore next.
Last reviewed
March 18, 2026
Known For
The ideas, systems, and research directions that make this person worth knowing.
01
Convolutional networks
02
Representation learning and self-supervision
03
Field-shaping arguments about future AI architectures
04
A Path Towards Autonomous Machine Intelligence
Start Here
Canonical papers, project pages, or repositories that anchor this profile.
Related Researchers
People worth exploring next because they share topics, labs, or source material with this profile.
A foundational computer-vision researcher whose work on representations and architectures still shapes modern pretraining and perception systems.
One of the cleaner bridge figures between the vision-transformer era and the open-weight LLaMA era: his public paper trail runs from influential self-supervised vision work into the first LLaMA release, Llama 2, and Code Llama.
He sits on both sides of a major shift in open models: he appears on Meta's Llama 2 paper and then on Mistral 7B and Mixtral, which makes him part of the early handoff from the first LLaMA wave into Mistral's open-weight model line.
His work cuts across two important threads in modern language models: early retrieval-augmented generation systems like Atlas and the later LLaMA open-weight model line.
Present across multiple major generations of the LLaMA family, he serves as a stable thread through Meta's open-model program rather than a one-paper author.