Important for the practical representation-learning line behind fastText, multilingual embeddings, and later open-weight model work at Meta.
Researcher Profile
Armand Joulin
Open-weight foundation models (LLaMA)
Representation-learning researcher at Meta
A strong bridge figure between the older fastText and self-supervision era and the newer open-weight LLaMA wave at Meta.
Organizations
Labs
About This Page
This profile is meant to help you get oriented quickly: why this researcher matters, what to read first, and where to explore next.
Last reviewed
March 18, 2026
Official And External Links
Known For
The ideas, systems, and research directions that make this person worth knowing.
01
fastText and practical NLP baselines
02
Self-supervised vision learning
03
Open-weight foundation models at Meta (LLaMA)
04
LLaMA: Open and Efficient Foundation Language Models
Start Here
Canonical papers, project pages, or repositories that anchor this profile.
Signature Works
Additional papers, projects, or repositories that help flesh out the profile.
Related Researchers
People worth exploring next because they share topics, labs, or source material with this profile.
One of the clearest bridge figures between the vision-transformer era and the open-weight LLaMA era: his public paper trail runs from influential self-supervised vision work into the first LLaMA release, Llama 2, and Code Llama.
He sits on both sides of a major shift in open models: he appears on Meta's Llama 2 paper and then on Mistral 7B and Mixtral, which places him in the early handoff from the first LLaMA wave into Mistral's open-weight model line.
His work cuts across two important threads in modern language models: early retrieval-augmented generation systems such as Atlas and the later LLaMA open-weight model line.
He appears across multiple major generations of the LLaMA family, which makes his profile a stable thread through Meta's open-model program rather than a one-paper author stub.
Important for the open-weight frontier-model story because her paper trail runs through both the original LLaMA releases and the early Mistral efficiency push.