
Researcher Profile


Peng Zhou

RWKV and efficient sequence modeling

Researcher at LuxiTech working on the RWKV model family and linear-time sequence models

Peng Zhou recurs across the original RWKV paper, Eagle and Finch, and Gated Slot Attention, which makes him one of the clearer repeat contributors to this line of sequence-model research rather than a one-off coauthor.

Organizations

LuxiTech

About This Page

This profile is meant to help you get oriented quickly: why this researcher matters, what to read first, and where to explore next.

Known For

The ideas, systems, and research directions that make this person worth knowing.

01

Repeated contributions across the RWKV family

02

Linear-time sequence modeling and recurrent alternatives to Transformers

03

Practical model work spanning RWKV and Gated Slot Attention

Start Here

Canonical papers, project pages, or repositories that anchor this profile.

RWKV: Reinventing RNNs for the Transformer Era

RWKV (project)
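The linear-time sequence modeling highlighted in this profile refers to replacing quadratic self-attention with a recurrent state that is updated once per token. As a rough illustration only (not Peng Zhou's or RWKV's actual formulation, which uses per-channel decays, receptance gating, and a bonus term for the current token), a minimal decaying weighted-average recurrence in that spirit might look like:

```python
import numpy as np

def linear_recurrent_mix(keys, values, decay):
    """O(T) sequential mix over a sequence.

    Each output is a decayed running weighted average of past
    values, with weights exp(key) -- a simplified sketch of the
    kind of recurrence used by RWKV-style models.

    keys, values: arrays of shape (T, d); decay: scalar in (0, 1).
    """
    T, d = values.shape
    num = np.zeros(d)  # decayed running sum of exp(k_t) * v_t
    den = np.zeros(d)  # decayed running sum of exp(k_t)
    out = np.empty((T, d))
    for t in range(T):
        w = np.exp(keys[t])
        num = decay * num + w * values[t]
        den = decay * den + w
        out[t] = num / (den + 1e-8)  # normalized mix up to time t
    return out
```

The point of the construction is cost: the state (`num`, `den`) is constant-size in the sequence length, so generating T tokens is O(T·d) time and O(d) memory, versus O(T²·d) for full self-attention.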

Supporting Sources

Additional links that help verify and flesh out this profile.

Related Researchers

People worth exploring next because they share topics, labs, or source material with this profile.