
Researcher Profile

Tomer Asida

Hybrid Transformer–Mamba language models (Jamba)

Researcher behind AI21's Jamba hybrid-model work

This page earns its place because Asida's name appears directly on the original Jamba paper, giving readers a concrete entry point into the team behind AI21's hybrid architecture.

Organizations

AI21 Labs

About This Page

This profile is meant to help you get oriented quickly: why this researcher matters, what to read first, and where to explore next.

Known For

The ideas, systems, and research directions that make this person worth knowing.

01

Contributions to the original Jamba model

02

Hybrid Transformer–Mamba research at AI21 Labs (see the sketch after this list)

03

Public model-release work in the Jamba line
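To make the "hybrid Transformer–Mamba" label concrete, here is a minimal, illustrative sketch of the layer-interleaving idea behind Jamba-style models. This is not AI21's implementation: it assumes PyTorch, replaces a real Mamba layer with a simplified gated linear-recurrence stand-in (`SimpleSSMBlock` is a hypothetical name), and uses an assumed 1-attention-per-4-layers ratio chosen for brevity rather than the ratio the Jamba papers actually use.

```python
# Illustrative sketch of hybrid layer interleaving, NOT AI21's implementation.
# Assumptions: PyTorch, a toy SSM stand-in, and an arbitrary 1:4 layer ratio.
import torch
import torch.nn as nn


class SimpleSSMBlock(nn.Module):
    """Stand-in for a Mamba block: a gated, fixed-decay linear recurrence.

    A real Mamba layer uses input-dependent (selective) state-space dynamics;
    this toy scan only mimics the linear-time, constant-state mixing pattern.
    """

    def __init__(self, d_model: int):
        super().__init__()
        self.in_proj = nn.Linear(d_model, d_model)
        self.gate = nn.Linear(d_model, d_model)
        self.out_proj = nn.Linear(d_model, d_model)
        self.decay = nn.Parameter(torch.full((d_model,), 0.9))

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (B, T, D)
        u = self.in_proj(x)
        h = torch.zeros_like(u[:, 0])
        states = []
        for t in range(u.size(1)):  # sequential scan, linear in T
            h = self.decay * h + u[:, t]
            states.append(h)
        s = torch.stack(states, dim=1)
        return self.out_proj(s * torch.sigmoid(self.gate(x)))


class AttentionBlock(nn.Module):
    """Causal self-attention: full pairwise mixing, quadratic in T."""

    def __init__(self, d_model: int, n_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        t = x.size(1)
        # Upper-triangular -inf mask blocks attention to future positions.
        mask = torch.triu(torch.full((t, t), float("-inf")), diagonal=1)
        out, _ = self.attn(x, x, x, attn_mask=mask)
        return out


class HybridStack(nn.Module):
    """Interleaves SSM-style layers with occasional attention layers.

    Every `ratio`-th layer is attention; the rest are linear-time blocks.
    The 1:4 ratio here is an illustrative assumption, not Jamba's.
    """

    def __init__(self, d_model: int, n_layers: int = 8, ratio: int = 4):
        super().__init__()
        self.layers = nn.ModuleList(
            [AttentionBlock(d_model) if (i + 1) % ratio == 0
             else SimpleSSMBlock(d_model)
             for i in range(n_layers)]
        )
        self.norms = nn.ModuleList(
            [nn.LayerNorm(d_model) for _ in range(n_layers)]
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        for norm, layer in zip(self.norms, self.layers):
            x = x + layer(norm(x))  # pre-norm residual around each block
        return x


if __name__ == "__main__":
    model = HybridStack(d_model=64)
    tokens = torch.randn(2, 16, 64)  # (batch, seq_len, d_model)
    print(model(tokens).shape)       # torch.Size([2, 16, 64])
```

The design point the sketch illustrates is the trade the hybrid makes: most layers mix the sequence in linear time with constant state, while the occasional attention layer restores full pairwise token interaction.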

Start Here

Canonical papers, project pages, or repositories that anchor this profile.

Jamba: A Hybrid Transformer-Mamba Language Model

Jamba-1.5: Hybrid Transformer-Mamba Models at Scale

Supporting Sources

Additional links that help verify and flesh out this profile.

Related Researchers

People worth exploring next because they share topics, labs, or source material with this profile.

Shared canonical source

Alan Arazi

Hybrid Transformer–Mamba language models (Jamba)

4 sources

A valuable page in this cluster because his public role description is unusually specific: post-training, steerability, and AI-generated evaluation data are exactly the kinds of practical problems a researcher page should make discoverable.


Shared canonical source

Dor Muhlgay

Hybrid Transformer–Mamba language models (Jamba)

5 sources

A strong long-tail researcher page because his public profile explicitly highlights factual knowledge and grounding, far more useful signals than a generic AI21/Jamba placeholder.
