Worth tracking if you care about alternatives to the standard transformer playbook, especially the line of work trying to keep strong language-model performance while making inference and memory use much cheaper.
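To make that efficiency claim concrete, here is a minimal sketch of why RNN-style models like RWKV decode cheaply: the whole history is folded into a fixed-size state, while a transformer's KV cache grows with every token. This is an illustrative linear recurrence, not the actual RWKV time-mixing formulation; all function names and parameters below are hypothetical.

```python
import numpy as np

def recurrent_decode_step(state, x, decay):
    """One decode step for a simplified RWKV-style linear recurrence.

    The entire past is summarized in `state`, whose size is fixed, so each
    new token costs O(d) time and memory no matter how long the sequence is.
    (Illustrative only; real RWKV uses a more elaborate formulation.)
    """
    state = decay * state + x        # fold the new token into the running state
    return state, np.tanh(state)    # emit an output from the fixed-size state

def attention_decode_step(kv_cache, x):
    """One decode step for a toy attention layer.

    Every past token must be kept, so memory and per-token compute grow
    linearly with the number of tokens generated so far.
    """
    kv_cache.append(x)                        # cache grows with sequence length
    keys = np.stack(kv_cache)                 # (t, d)
    scores = keys @ x / np.sqrt(x.size)       # attend over the whole history
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return kv_cache, weights @ keys

d = 8
rng = np.random.default_rng(0)
state, cache = np.zeros(d), []
for t in range(1000):
    x = rng.standard_normal(d)
    state, _ = recurrent_decode_step(state, x, decay=0.9)  # state stays size d
    cache, _ = attention_decode_step(cache, x)             # cache now holds t+1 vectors
print(f"recurrent state: {state.size} floats; kv cache: {len(cache)} x {d} floats")
```

After 1,000 tokens the recurrent state is still 8 floats while the toy cache holds 1,000 vectors, which is the gap the efficient-sequence-modeling line of work is built around.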
Researcher Profile
Editor reviewed
Krishna Sri Ipsit Mantri
RWKV and efficient sequence modeling
Doctoral researcher at the University of Bonn and the Lamarr Institute working on graph representation learning
A strong page to keep because it connects the original RWKV paper to a later, much clearer research identity in graph representation learning, latent-space geometry, and multi-task adaptation.
Organizations
University of Bonn
Lamarr Institute
About This Page
This profile is meant to help you get oriented quickly: why this researcher matters, what to read first, and where to explore next.
Last reviewed
March 18, 2026
Known For
The ideas, systems, and research directions that make this person worth knowing.
01
Original RWKV authorship
02
Graph representation learning
03
Geometry-aware adaptation and graph methods
04
RWKV and efficient sequence modeling
Start Here
Canonical papers, project pages, or repositories that anchor this profile.
RWKV: Reinventing RNNs for the Transformer Era
Signature Works
Additional papers, projects, or repositories that help flesh out the profile.
RWKV (project)
Supporting Sources
Additional links that help verify and flesh out this profile.
Related Researchers
People worth exploring next because they share topics, labs, or source material with this profile.
A distinctive page because his work bridges open-sequence-model experimentation with applied machine learning for molecules, proteins, and structural biology. He appears on multiple RWKV-family papers, including the hybrid GoldFinch branch, not only the first release.
Co-authored RWKV: Reinventing RNNs for the Transformer Era.
A strong open-model and data-centric page because his work sits close to the infrastructure that made OLMo and Dolma genuinely useful to the broader research community, not just another benchmark-driven model release.
Co-authored RWKV: Reinventing RNNs for the Transformer Era.
Useful because it turns an otherwise thin RWKV byline into a real systems profile: after the original paper, his public work tracks toward large-scale pretraining infrastructure, pipeline parallelism, and systems support for frontier-scale models.
Co-authored RWKV: Reinventing RNNs for the Transformer Era.