Researcher Profile
Florian Bressand
Mixture-of-experts LLMs
Co-author, Mixtral
Co-authored Mixtral of Experts: a key MoE reference in the open-weights frontier.
Labs
Mistral AI
About This Page
This profile is meant to help you get oriented quickly: why this researcher matters, what to read first, and where to explore next.
Last updated
March 20, 2026
Best First Clicks
Official and External Links
Known For
The ideas, systems, and research directions that make this person worth knowing; a short sketch of the core routing idea follows the list.
01 Mixture-of-experts LLMs
02 Mixtral of Experts
03 Mixtral
04 MoE
05 Large Language Models
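For context on the mixture-of-experts entries above: the Mixtral of Experts paper describes a sparse MoE layer in which a router scores 8 expert feed-forward networks per token and only the top 2 are evaluated, with their outputs mixed by softmax-renormalized gate weights. The sketch below illustrates that routing pattern in minimal PyTorch; the class name, dimensions, and the SiLU expert MLPs are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopKMoE(nn.Module):
    """Minimal sparse mixture-of-experts layer in the style described by
    the Mixtral of Experts paper: a linear router scores all experts per
    token, only the top-k (Mixtral uses k=2 of 8) are evaluated, and
    their outputs are mixed with softmax-renormalized gate weights."""

    def __init__(self, dim: int, hidden: int, n_experts: int = 8, k: int = 2):
        super().__init__()
        self.k = k
        self.router = nn.Linear(dim, n_experts, bias=False)
        # Illustrative two-layer SiLU MLPs standing in for the expert FFNs.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, hidden), nn.SiLU(), nn.Linear(hidden, dim))
            for _ in range(n_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, dim). Score every expert, keep the top-k per token.
        logits = self.router(x)                      # (tokens, n_experts)
        weights, idx = logits.topk(self.k, dim=-1)   # (tokens, k)
        weights = F.softmax(weights, dim=-1)         # renormalize over the chosen experts
        out = torch.zeros_like(x)
        for slot in range(self.k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, slot] == e             # tokens routed to expert e in this slot
                if mask.any():
                    out[mask] += weights[mask, slot, None] * expert(x[mask])
        return out

# Usage: layer = TopKMoE(dim=32, hidden=64); y = layer(torch.randn(5, 32))
```

The point of the design is that parameter count grows with the number of experts while per-token compute only grows with k, which is why Mixtral can match much denser models at a fraction of the active-parameter cost.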
Start Here
Canonical papers, project pages, or repositories that anchor this profile.
Signature Works
Additional papers, projects, or repositories that help flesh out the profile.
Related Researchers
People worth exploring next because they share topics, labs, or source material with this profile.
A strong person to know for the Mistral line of open-weight models, especially if you care about the arc from compact, performant base models into mixture-of-experts, multimodal systems, and reasoning models.
Useful because his work connects earlier privacy and representation-learning research to some of Mistral's most important open-weight model releases.
A useful person to follow if you care about the bridge between embodied-agent research and modern open-weight language-model systems, rather than treating those worlds as separate.