Researcher Profile

Qihang Zhao

RWKV and efficient sequence modeling

Researcher contributing to RWKV, Eagle/Finch, and RWKV-inspired efficient language models

This page is useful because his work connects the main RWKV sequence-model line with the RWKV-inspired SpikeGPT branch, making it more informative than a single coauthor record.

Organizations

LuxiTech Co. Ltd.
University of Science and Technology of China

About This Page

This profile is meant to help you get oriented quickly: why this researcher matters, what to read first, and where to explore next.

Known For

The ideas, systems, and research directions that make this person worth knowing.

01. Original RWKV authorship
02. Eagle and Finch
03. SpikeGPT and efficient language modeling inspired by RWKV
04. RWKV and efficient sequence modeling
05. RWKV: Reinventing RNNs for the Transformer Era
06. RWKV (project)

Start Here

Canonical papers, project pages, or repositories that anchor this profile.

Supporting Sources

Additional links that help verify and flesh out this profile.

Related Researchers

People worth exploring next because they share topics, labs, or source material with this profile.

Shared canonical source

Rui-Jie Zhu

RWKV and efficient sequence modeling

4 sources

Probably the strongest page in this batch: he spans the original RWKV paper, Eagle/Finch-adjacent work, and later efficient-language-model papers such as SpikeGPT and Gated Slot Attention, rather than ending at a single coauthor credit.

Start Here: Rui-Jie Zhu