Language Models are Few-Shot Learners (GPT-3)


About this listen

In this third episode of AI Papers Explained, we explore GPT-3: Language Models are Few-Shot Learners, the landmark 2020 paper from OpenAI.
Discover how scaling up model size and training data produced new emergent capabilities and marked the beginning of the large language model era.
We connect this milestone to the foundations laid by Attention Is All You Need and BERT, showing how GPT-3 propelled research into the age of general-purpose AI.

🎙️ Source: Brown et al., OpenAI, 2020 — Language Models are Few-Shot Learners (arXiv:2005.14165)
