Attention Is All You Need

Vaswani et al. · NeurIPS 2017

Completed 2024

★★★★★

The original Transformer paper. Replaces recurrence entirely with self-attention, so training parallelizes across sequence positions. The architecture behind GPT, BERT, and everything since.
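
A minimal NumPy sketch of the paper's core operation, scaled dot-product attention: Attention(Q, K, V) = softmax(QKᵀ/√d_k)V. The function follows the paper's formula; the toy shapes and random input are illustrative assumptions, not from the paper.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # (seq_q, seq_k) similarity scores
    # Numerically stable row-wise softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V  # each output row is a weighted sum of value rows

# Toy usage (hypothetical shapes): 4 tokens, dimension 8.
# Self-attention is the Q = K = V case, as used throughout the paper.
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))
out = scaled_dot_product_attention(x, x, x)
print(out.shape)  # (4, 8)
```

Because every token attends to every other token in a single matrix multiply, there is no sequential dependency between positions, which is what makes training parallelizable compared to RNNs.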