Researcher Profile

Yikang Shen

Researcher working on efficient sequence models and multimodal RLHF

Canonical source: Linear transformers via the delta rule

Useful because his work links two strands that are usually discussed separately: efficient sequence-model architectures on one side, and multimodal alignment work on the other.

Organizations

MIT-IBM Watson AI Lab

About This Page

This profile is meant to help you get oriented quickly: why this researcher matters, what to read first, and where to explore next.

Known For

The ideas, systems, and research directions that make this person worth knowing.

01 Gated linear attention and Delta-rule models (see the sketch after this list)

02 Multimodal RLHF and hallucination reduction

03 Research at the boundary of systems efficiency and alignment

04 Linear transformers via the delta rule

05 Parallelizing Linear Transformers with the Delta Rule over Sequence Length

06 DeltaNet
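
The list above names the delta rule several times, so a minimal sketch may help orientation. This is an illustration, not code from any of the works listed here; the function name deltanet_recurrent, the NumPy framing, and the assumption of unit-norm keys are my own. The delta rule updates a fast-weight state by a prediction error rather than by a plain outer-product accumulation:

import numpy as np

def deltanet_recurrent(q, k, v, beta):
    # Sequential delta-rule linear attention (illustrative sketch).
    # q, k, v: (T, d) arrays of queries, keys, values; keys assumed unit-norm.
    # beta: (T,) per-token write strengths in [0, 1].
    T, d = q.shape
    S = np.zeros((d, d))      # fast-weight state mapping keys to values
    out = np.zeros((T, d))
    for t in range(T):
        pred = S @ k[t]       # value currently stored under key k_t
        err = v[t] - pred     # prediction error: the "delta" in the delta rule
        S = S + beta[t] * np.outer(err, k[t])  # rank-1 corrective write
        out[t] = S @ q[t]     # read the updated state with the query
    return out

The loop is linear in sequence length but inherently sequential; the "Parallelizing Linear Transformers with the Delta Rule over Sequence Length" entry above is, as its title says, about reformulating this recurrence so it can be computed in parallel over the sequence. Gated variants additionally decay S before each write.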

Start Here

Canonical papers, project pages, or repositories that anchor this profile.

Signature Works

Additional papers, projects, or repositories that help flesh out the profile.

Supporting Sources

Additional links that help verify and flesh out this profile.

Related Researchers

People worth exploring next because they share topics, labs, or source material with this profile.

Songlin Yang

Shared canonical source: Linear transformers via the delta rule (3 sources)

A high-signal researcher for the post-attention design space, especially if you care about the line of work trying to make linear-attention and Delta-rule models actually competitive in real language-model systems.

Yoon Kim

Shared canonical source: Linear transformers via the delta rule (4 sources)

A useful researcher for tracing the line from classic neural NLP to today's efficient large-model work, with papers that span early sentence models, character-aware language modeling, and current sequence-model efficiency research.
