Mamba: Linear-Time Sequence Modeling with Selective State Spaces
About this listen
In this episode of AI Papers Explained, we explore Mamba: Linear-Time Sequence Modeling with Selective State Spaces, a 2023 paper by Albert Gu and Tri Dao that rethinks how AI handles long sequences. Unlike Transformers, which compare every token to every other, Mamba processes information linearly and selectively, remembering only what matters.
This marks a shift toward faster, more efficient architectures and a possible glimpse into the post-Transformer era.
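To make the "linear and selective" idea concrete, here is a minimal single-channel NumPy sketch of the selective state-space recurrence the paper is built on: a fixed-size hidden state is updated as h_t = Ā_t h_{t-1} + B̄_t x_t and read out as y_t = C_t h_t, with the step size and the B/C projections made functions of the current input. This is an illustrative toy, not the paper's hardware-aware implementation; the names (selective_scan, w_B, w_C, w_dt, b_dt) are our own.

```python
import numpy as np

def softplus(z):
    # Numerically stable log(1 + exp(z))
    return np.logaddexp(0.0, z)

def selective_scan(x, A, w_B, w_C, w_dt, b_dt):
    """Toy single-channel selective state-space recurrence (illustrative only).

    x          : (L,) input sequence (one channel)
    A          : (N,) negative decay rates for the hidden state
    w_B, w_C   : (N,) projections that make the write and read directions
                 depend on the current input -- the "selective" part
    w_dt, b_dt : scalars controlling the input-dependent step size

    Cost is O(L * N): each token updates a fixed-size state, instead of
    comparing against all previous tokens (O(L^2) in a Transformer).
    """
    h = np.zeros(A.shape[0])             # fixed-size memory of the past
    y = np.empty_like(x)
    for t, xt in enumerate(x):
        dt = softplus(w_dt * xt + b_dt)  # how strongly this token updates memory
        A_bar = np.exp(dt * A)           # discretized state decay in (0, 1)
        B_t = w_B * xt                   # input-dependent write vector
        h = A_bar * h + dt * B_t * xt    # remember selectively
        y[t] = np.dot(w_C * xt, h)       # input-dependent readout
    return y

# Example: a 1000-token sequence summarized by a 16-dimensional state.
rng = np.random.default_rng(0)
x = rng.standard_normal(1000)
A = -np.exp(rng.standard_normal(16))     # negative rates keep the state stable
y = selective_scan(x, A, rng.standard_normal(16),
                   rng.standard_normal(16), 0.5, -1.0)
print(y.shape)                           # (1000,)
```

Note that the state h never grows with sequence length; every token touches only that fixed-size summary, which is where the linear-time claim comes from.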