Researcher Profile
Ruichong Zhang
RWKV and efficient sequence modeling
Researcher at Tsinghua University contributing to multiple RWKV releases
A strong long-tail RWKV page: he appears on the original RWKV paper, on Eagle/Finch, and on RWKV-7, which places him in the smaller recurring contributor set that carried the architecture through several major revisions rather than only appearing at launch.
Organizations
Tsinghua University
About This Page
This profile is meant to help you get oriented quickly: why this researcher matters, what to read first, and where to explore next.
Known For
The ideas, systems, and research directions that make this person worth knowing.
01 Repeated authorship across RWKV releases
02 Matrix-valued and dynamic-state RWKV variants (see the sketch after this list)
03 Open sequence-model work around the RWKV project
04 RWKV and efficient sequence modeling
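The "matrix-valued and dynamic-state" item refers to how the RWKV recurrent state grew across the releases he contributed to: the original paper tracks per-channel scalar running averages, while Eagle/Finch and RWKV-7 keep a matrix-valued state, with RWKV-7 letting the decay vary per timestep. The sketch below is a minimal illustration under simplified assumptions, not the project's code: real RWKV layers add bonus terms, gating, normalization, and token-shift mixing, all omitted here, and the variable names and shapes are illustrative only.

```python
# Minimal sketch (NOT the RWKV reference implementation) contrasting the
# two state shapes. Update rules are simplified for illustration.
import numpy as np

def vector_state_step(a, b, k, v, w):
    """Original-RWKV-style per-channel state (simplified WKV).

    a, b : running weighted numerator / denominator, shape (d,)
    k, v : key and value for this timestep, shape (d,)
    w    : per-channel decay > 0, shape (d,)
    """
    a = np.exp(-w) * a + np.exp(k) * v   # decay old evidence, add new
    b = np.exp(-w) * b + np.exp(k)       # matching normalizer
    return a, b, a / (b + 1e-9)          # per-channel weighted average

def matrix_state_step(S, k, v, w_t):
    """Eagle/Finch- and RWKV-7-style matrix state (simplified).

    S   : state matrix, shape (d, d), one outer product added per step
    k,v : key and value, shape (d,)
    w_t : per-timestep decay in (0, 1), shape (d,) -- the "dynamic" part,
          since it can differ at every step instead of being fixed
    """
    S = S * w_t[:, None] + np.outer(k, v)  # decay rows, add rank-1 update
    return S

def matrix_readout(S, r):
    """Read the matrix state with a receptance vector r, shape (d,)."""
    return r @ S
```

The practical difference the list points at is capacity: the vector state stores d scalars per layer, while the matrix state stores a d by d associative memory whose decay can change each step, which is roughly what "dynamic-state" means in the RWKV-7 context.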
Start Here
Canonical papers, project pages, or repositories that anchor this profile.
RWKV: Reinventing RNNs for the Transformer Era
RWKV (project)
Supporting Sources
Additional links that help verify and flesh out this profile.
Related Researchers
People worth exploring next because they share topics, labs, or source material with this profile.
Worth tracking if you care about alternatives to the standard transformer playbook, especially the line of work trying to keep strong language-model performance while making inference and memory use much cheaper.
A good RWKV page: he appears on the original paper, Eagle/Finch, and RWKV-7 before moving into a broader PhD research program, which gives the profile real continuity rather than a one-off coauthor credit.
Important in the long tail because he is another contributor whose work spans both the RWKV sequence-model thread and the Polish PLLuM effort, which makes his page more informative than a generic single-paper profile.
A good page to surface because it connects two otherwise separate maps: the open RWKV sequence-model line and the newer Polish-language model ecosystem around PLLuM.
A strong page to keep because he links the early RWKV work to the later Wrocław-centered PLLuM effort, which makes him one of the clearer continuity threads between open sequence models and Polish-language LLM development.