
Researcher Profile

Editor reviewed

Rui-Jie Zhu

RWKV and efficient sequence modeling

PhD student at the University of California, Santa Cruz working on efficient language models and spiking neural networks

Probably the strongest page in this batch: he spans the original RWKV paper, Eagle/Finch-adjacent work, and later efficient-language-model papers such as SpikeGPT and Gated Slot Attention, rather than ending at a single coauthor credit.

Organizations

University of California, Santa Cruz

About This Page

This profile is meant to help you get oriented quickly: why this researcher matters, what to read first, and where to explore next.

Known For

The ideas, systems, and research directions that make this person worth knowing.

01

RWKV and Eagle/Finch sequence-model work

02

SpikeGPT

03

Efficient language modeling and spiking neural networks

04

RWKV and efficient sequence modeling

05

RWKV: Reinventing RNNs for the Transformer Era

06

RWKV (project)

Start Here

Canonical papers, project pages, or repositories that anchor this profile.

Signature Works

Additional papers, projects, or repositories that help flesh out the profile.

Related Researchers

People worth exploring next because they share topics, labs, or source material with this profile.

Shared canonical source

Eric Alcaide

RWKV and efficient sequence modeling

5 sources

A distinctive page because his work bridges open-sequence-model experimentation with applied machine learning for molecules, proteins, and structural biology. He also appears on multiple RWKV-family papers, including the hybrid GoldFinch branch, rather than only the first release.

Start Here: Eric Alcaide

Shared canonical source

Alon Albalak

RWKV and efficient sequence modeling

5 sources

A strong open-model and data-centric page because his work sits close to the infrastructure that made OLMo and Dolma useful to the broader research community, rather than being just another benchmark-driven model release.

Start Here: Alon Albalak

Shared canonical source

Huanqi Cao

RWKV and efficient sequence modeling

4 sources

Useful because it turns an otherwise thin RWKV byline into a real systems profile: after the original paper, his public work moves toward large-scale pretraining infrastructure, pipeline parallelism, and systems support for frontier-scale models.