SLMs vs LLMs: Building Faster, Cheaper, and More Private AI Systems

Do you really need a trillion-parameter model to solve enterprise problems?

In this episode, we unpack why Small Language Models (SLMs) are gaining momentum in enterprise AI. We explore how techniques like knowledge distillation and quantization let smaller models deliver competitive performance while significantly cutting cost, latency, and energy consumption.
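To give a flavour of one of the techniques mentioned above, here is a minimal, purely illustrative sketch of symmetric per-tensor 8-bit weight quantization in NumPy. The function names and the 4x4 weight matrix are hypothetical examples, not code from the episode; real systems (e.g. production inference runtimes) use more sophisticated per-channel and calibration-based schemes.

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Map float32 weights to int8 using a single (per-tensor) scale."""
    scale = np.abs(weights).max() / 127.0  # largest magnitude maps to +/-127
    q = np.round(weights / scale).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Approximate reconstruction of the original float weights."""
    return q.astype(np.float32) * scale

# Hypothetical weight tensor standing in for one layer of a model
w = np.random.randn(4, 4).astype(np.float32)
q, s = quantize_int8(w)
w_hat = dequantize(q, s)

# int8 storage is 4x smaller than float32, and the rounding error
# per weight is bounded by half a quantization step (s / 2)
max_err = np.abs(w - w_hat).max()
```

Shrinking each weight from 32 bits to 8 is one reason quantized SLMs fit on-device and serve requests with lower latency and energy cost.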

We also discuss why SLMs are a natural fit for agentic AI, enabling multi-step reasoning, on-device and on-prem deployments, and stronger data privacy in regulated environments. The takeaway: the future of AI isn’t just about bigger models, but smarter architectures built for real-world production.
