A useful RWKV page: Bo Peng appears on the original paper, Eagle/Finch, and RWKV-7, making him one of the smaller set of contributors who stayed with the architecture as it evolved rather than appearing only at launch.
Researcher Profile
Editor reviewed
Bo Peng
RWKV and efficient sequence modeling
Creator of the RWKV architecture
Worth tracking if you care about alternatives to the standard transformer playbook, especially the line of work trying to keep strong language-model performance while making inference and memory use much cheaper.
Organizations
About This Page
This profile is meant to help you get oriented quickly: why this researcher matters, what to read first, and where to explore next.
Last reviewed
March 18, 2026
Known For
The ideas, systems, and research directions that make this person worth knowing.
01 RWKV and efficient sequence modeling
02 Open-source alternatives to standard transformer inference
03 Long-context and constant-memory design tradeoffs
04 RWKV: Reinventing RNNs for the Transformer Era
05 RWKV (project)
06 RWKV
Start Here
Canonical papers, project pages, or repositories that anchor this profile.
Signature Works
Additional papers, projects, or repositories that help flesh out the profile.
Supporting Sources
Additional links that help verify and flesh out this profile.
Related Researchers
People worth exploring next because they share topics, labs, or source material with this profile.
A good RWKV page: he appears on the original paper, Eagle/Finch, and RWKV-7, giving the profile real continuity rather than a one-off coauthor credit before he moved into a broader PhD research program.
Worth surfacing because he appears on both the original RWKV paper and RWKV-7, which makes him one of the contributors who span the early release and the later Goose architecture rather than disappearing after launch.
A strong long-tail RWKV page: he is credited on the original paper, Eagle/Finch, and RWKV-7, placing him in the smaller recurring contributor set that carried the architecture through several major revisions.
A distinctive page because his work bridges open sequence-model experimentation with applied machine learning for molecules, proteins, and structural biology. He appears on multiple RWKV-family papers, including the hybrid GoldFinch branch, rather than only the first release.
A strong open-model and data-centric page because his work sits close to the infrastructure that made OLMo and Dolma useful to the broader research community rather than just another benchmark-driven model release.