Researcher Profile

Yu Zhang

Linear transformers via the delta rule

Researcher at Soochow University working on efficient linear-time sequence modeling

Worth surfacing because he is lead author of the Gated Slot Attention paper, one of the clearer attempts to push the RWKV-adjacent efficient-sequence line toward stronger memory and retrieval behavior rather than stopping at architectural novelty.
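As a rough orientation to what Gated Slot Attention computes, here is a minimal NumPy sketch of one recurrent step, based on one reading of the paper's recurrent form: two slot memories for keys and values decay under a learned per-slot gate, and the output is softmax attention over the slots. The function name `gsa_step`, the shapes, and the single-head setting are illustrative assumptions, not the paper's implementation (which uses chunkwise-parallel kernels and multiple heads).

```python
import numpy as np

def gsa_step(K_mem, V_mem, q, k, v, alpha):
    """One recurrent step of gated slot attention (hedged sketch).

    K_mem, V_mem: (m, d) slot memories for keys and values
    q, k, v:      (d,)   current token's query / key / value
    alpha:        (m,)   per-slot forget gate, each entry in (0, 1)
    """
    # Gated update: each slot decays by alpha and writes the new
    # token in proportion to (1 - alpha).
    K_mem = alpha[:, None] * K_mem + np.outer(1.0 - alpha, k)
    V_mem = alpha[:, None] * V_mem + np.outer(1.0 - alpha, v)
    # Softmax attention over the m slots, then read out values.
    logits = K_mem @ q                    # (m,)
    w = np.exp(logits - logits.max())     # numerically stable softmax
    w /= w.sum()
    o = V_mem.T @ w                       # (d,) output for this token
    return K_mem, V_mem, o
```

The softmax over a fixed number of slots is the point of the design: it keeps state size bounded (linear time in sequence length) while retaining a softmax-shaped, context-dependent read, which is what the memory-and-retrieval framing above refers to.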

Organizations

Soochow University

About This Page

This profile is meant to help you get oriented quickly: why this researcher matters, what to read first, and where to explore next.

Known For

The ideas, systems, and research directions that make this person worth knowing.

1. Gated Slot Attention
2. Linear-time sequence modeling
3. Bridging academic sequence-model research with practical FLA (flash-linear-attention) tooling
4. Linear transformers via the delta rule (see the sketch after this list)
5. Parallelizing Linear Transformers with the Delta Rule over Sequence Length
6. DeltaNet
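Three of the items above (the delta rule, the parallelization paper, and DeltaNet) center on one recurrence, so a minimal NumPy sketch of its naive sequential form may help. The state S is a fast-weight matrix updated online by the classic delta rule; the papers' contribution is a chunkwise algorithm that parallelizes this scan over sequence length, which the sketch below deliberately does not implement. The function name and shapes are illustrative assumptions.

```python
import numpy as np

def delta_rule_scan(q, k, v, beta):
    """Naive sequential form of the delta-rule linear-transformer update.

    q, k, v: (T, d) per-token queries, keys, values
    beta:    (T,)   per-token write strengths in [0, 1]

    The state S is a d x d fast-weight matrix. Each step nudges the
    value associated with k_t toward v_t:
        S_t = S_{t-1} - beta_t * (S_{t-1} k_t - v_t) k_t^T
        o_t = S_t q_t
    """
    T, d = q.shape
    S = np.zeros((d, d))
    out = np.empty_like(q)
    for t in range(T):
        # Prediction error: what the memory currently returns for k_t
        # versus the value v_t it should return.
        err = S @ k[t] - v[t]
        # Rank-1 delta-rule correction of the fast weights.
        S = S - beta[t] * np.outer(err, k[t])
        out[t] = S @ q[t]
    return out
```

The write strength is what distinguishes this from vanilla linear attention's purely additive update: with a unit-norm key, beta_t = 1 fully replaces the stored value for that key, beta_t = 0 leaves the memory untouched, and intermediate values interpolate, which is why the delta rule supports targeted overwriting rather than unbounded accumulation.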

Start Here

Canonical papers, project pages, or repositories that anchor this profile.

Supporting Sources

Additional links that help verify and flesh out this profile.

Related Researchers

People worth exploring next because they share topics, labs, or source material with this profile.

Shared canonical source

Songlin Yang

Linear transformers via the delta rule

3 sources

A high-signal researcher in the post-attention design space, especially if you care about the line of work trying to make linear-attention and delta-rule models competitive in real language-model systems.

Start Here: Songlin Yang

Shared canonical source

Yoon Kim

Linear transformers via the delta rule

4 sources

A useful researcher for tracing the line from classic neural NLP to today's efficient large-model work, with papers spanning early sentence-classification models, character-aware language modeling, and current sequence-model efficiency research.

Start Here: Yoon Kim