Researcher Profile

Jacob Devlin

Pretraining and representation learning for NLP

Lead author, BERT

A central figure in the pretraining era of NLP, especially if you want to understand how BERT reshaped the field and how that line of work extended into broader document understanding and large-scale language systems.

Organizations

Google

About This Page

This profile is meant to help you get oriented quickly: why this researcher matters, what to read first, and where to explore next.

Known For

The ideas, systems, and research directions that make this person worth knowing.

01. BERT and bidirectional pretraining
02. Language representation learning at Google scale
03. Document understanding and retrieval-oriented NLP systems

Start Here

Canonical papers, project pages, or repositories that anchor this profile.

BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding (NAACL 2019)

Related Researchers

People worth exploring next because they share topics, labs, or source material with this profile.

Shared canonical source

Kenton Lee

NLP systems and evaluation

4 sources

Worth following for practical language systems: his work sits at the intersection of pretraining, retrieval, and question answering, exactly where production-grade NLP systems either prove robust or fall apart.

Shared topic

Niki Parmar

Transformers and sequence modeling

3 sources

A co-author of the original Transformer paper ("Attention Is All You Need") whose work still matters because it connects that architecture shift to later efforts on efficiency, scaling, and sequence-modeling infrastructure.