Transformers: Attention Is All You Need (The Birth of Transformers)
About this listen
In this first episode of AI Papers Explained, we explore one of the most influential research papers in the history of deep learning: Attention Is All You Need (Vaswani et al., 2017).
You’ll learn why the Transformer architecture replaced RNNs and LSTMs, how self-attention works, and how this paper paved the way for models like BERT, GPT, and T5.
🎙️ Hosted by Anass El Basraoui, a data scientist and AI researcher.
Topics covered:
- Scaled dot-product attention (see the sketch after this list)
- Multi-head attention
- Encoder–decoder structure
- Positional encoding
- The legacy of the Transformer
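For listeners who want a preview before pressing play, here is a minimal NumPy sketch of the core operation the episode covers: scaled dot-product attention, which the paper defines as Attention(Q, K, V) = softmax(QKᵀ / √d_k) · V. The function name and toy matrix sizes below are illustrative, not from the paper:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V  (Vaswani et al., 2017)
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # query-key similarity, scaled by sqrt(d_k)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # numerically stable row-wise softmax
    return weights @ V                              # attention-weighted sum of value vectors

# Toy example: 3 tokens with d_k = d_v = 4 (dimensions chosen arbitrarily)
rng = np.random.default_rng(0)
Q, K, V = (rng.normal(size=(3, 4)) for _ in range(3))
print(scaled_dot_product_attention(Q, K, V))
```

Multi-head attention, also covered in the episode, simply runs several of these attention operations in parallel on learned projections of Q, K, and V and concatenates the results.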