Language Models are Few-Shot Learners (GPT-3)
About this listen
In this third episode of AI Papers Explained, we explore GPT-3: Language Models are Few-Shot Learners, the landmark paper by OpenAI (2020).
Discover how scaling up model size and training data led to new emergent capabilities and marked the beginning of the large language model era.
We connect this milestone to the foundations laid by Attention Is All You Need and BERT, showing how GPT-3 shifted research toward the era of general-purpose AI.
🎙️ Source: Brown et al., OpenAI, 2020 — Language Models are Few-Shot Learners (arXiv:2005.14165)