Researcher Profile
Yuval Globerson
Hybrid Transformer–Mamba language models (Jamba)
Algorithm developer at AI21 Labs
This page turns one of the thinner Jamba-1.5 coauthor profiles into a fuller AI21 systems profile, with a concrete role, a paper trail, and a clearer place in the hybrid-model program: the pre- and post-training work that turns a hybrid architecture into a usable model.
Organizations
Labs
Topics
About This Page
This profile is meant to help you get oriented quickly: why this researcher matters, what to read first, and where to explore next.
Last reviewed
March 18, 2026
Known For
The ideas, systems, and research directions that make this person worth knowing.
01
Algorithm development at AI21 Labs
02
Jamba-1.5 and hybrid-model systems
03
Enterprise-oriented model engineering
04
Hybrid Transformer–Mamba language models (Jamba)
Start Here
Canonical papers, project pages, or repositories that anchor this profile.
Jamba: A Hybrid Transformer-Mamba Language Model
Signature Works
Additional papers, projects, or repositories that help flesh out the profile.
Jamba-1.5: Hybrid Transformer-Mamba Models at Scale
Related Researchers
People worth exploring next because they share topics, labs, or source material with this profile.
Useful because it turns one of the anonymous-looking Jamba authors into an actual person page, which makes the hybrid-model line easier to understand than treating it as a single monolithic team output.
Useful because it captures one of the less-visible people behind AI21’s training stack, where hybrid-model quality depends as much on pre- and post-training choices as on the architectural headline.
A useful page because evaluation work is easy to flatten into leaderboard noise, and her profile highlights the people inside AI21 responsible for turning Jamba performance claims into something measurable.
A valuable systems page because hybrid-model launches depend on much more than modeling alone, and his contributions point directly at the serving and infrastructure work needed to make Jamba usable in practice.
A better long-tail AI21 page because it makes the data side of Jamba visible, instead of leaving the impression that hybrid-model progress came only from architecture and not from the people shaping the data pipeline underneath it.