
Ep05: "Deploy Local LLMs in the Cloud (100% Data Privacy)"
About this listen
Welcome to Episode 05 of "Tech Beats Unplugged"!
This time we tried something completely crazy: we're letting the AI hosts take over! That's right, we're flipping the script and giving the AI the mic to guide us through the fascinating world of local LLMs. But that's not all, because this episode is actually inspired by my recent talk at Oracle CloudWorld in Vegas. The topic? You guessed it: local LLMs in the cloud.
We're so excited to share our latest Tech Beats show with you!
We hope you'll enjoy it!
Topics discussed:
- (00:00) Introduction
- (01:00) Why OpenAI Might Not Be Your BFF
- (02:40) Local/Open LLMs to the Rescue!
- (03:38) What's Quantization?
- (04:30) Where to Find These Open LLMs
- (05:02) Inference Engines (Ollama)
- (05:50) What's a Modelfile? (a minimal sketch follows this list)
- (06:40) What About Deploying Local AI to the Cloud? (OKE/managed Kubernetes; a deployment sketch also follows this list)
- (07:30) From Zero to Cloud Deployment Hero
- (08:28) What's Next (LLM ethics benchmarks)
- (09:55) Outro
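For anyone who wants a concrete reference point for the Modelfile and Ollama segments, here is a minimal Python sketch of the workflow: define a custom model with a Modelfile, build and run it with the Ollama CLI, then query it over Ollama's local REST API. The model name, base model, system prompt, and temperature are illustrative assumptions, not the exact settings from the episode or the Ollama_lab repo.

```python
# Minimal sketch: build a custom Ollama model from a Modelfile, then query it
# over Ollama's local REST API (default port 11434). Everything runs locally,
# so prompts and responses never leave your machine.
import json
import pathlib

import requests

# 1. A Modelfile layers your own settings on top of a base model.
#    (Base model, temperature, and system prompt below are assumed examples.)
MODELFILE = """\
FROM llama3
PARAMETER temperature 0.2
SYSTEM You are a concise assistant that answers questions about local LLM deployments.
"""
pathlib.Path("Modelfile").write_text(MODELFILE)

# 2. Build and try the model with the Ollama CLI (run in a shell):
#      ollama create techbeats-demo -f Modelfile
#      ollama run techbeats-demo

# 3. Query the same model programmatically through the local API.
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "techbeats-demo",
        "prompt": "In one sentence, what is quantization?",
        "stream": False,
    },
    timeout=120,
)
resp.raise_for_status()
print(json.loads(resp.text)["response"])
```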
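The cloud deployment segment boils down to running that same Ollama container on a managed Kubernetes cluster such as OKE. As a rough illustration only, here is a sketch using the official Kubernetes Python client; the namespace, image tag, resource sizing, and ClusterIP service are assumptions, and this is not the manifest from the Ollama_lab repo or the episode's lab.

```python
# Minimal sketch: create an Ollama Deployment and ClusterIP Service on a
# Kubernetes cluster (e.g. OKE) using the official Python client. Values such
# as namespace, image tag, and resource requests are illustrative assumptions.
from kubernetes import client, config

config.load_kube_config()  # uses your local kubeconfig, e.g. one generated for an OKE cluster

container = client.V1Container(
    name="ollama",
    image="ollama/ollama:latest",  # public Ollama image (assumed tag)
    ports=[client.V1ContainerPort(container_port=11434)],
    resources=client.V1ResourceRequirements(
        requests={"cpu": "2", "memory": "8Gi"},  # illustrative sizing for a small quantized model
    ),
)

deployment = client.V1Deployment(
    metadata=client.V1ObjectMeta(name="ollama"),
    spec=client.V1DeploymentSpec(
        replicas=1,
        selector=client.V1LabelSelector(match_labels={"app": "ollama"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "ollama"}),
            spec=client.V1PodSpec(containers=[container]),
        ),
    ),
)

service = client.V1Service(
    metadata=client.V1ObjectMeta(name="ollama"),
    spec=client.V1ServiceSpec(
        selector={"app": "ollama"},
        ports=[client.V1ServicePort(port=11434, target_port=11434)],
        type="ClusterIP",  # keep the inference endpoint private to the cluster
    ),
)

client.AppsV1Api().create_namespaced_deployment(namespace="default", body=deployment)
client.CoreV1Api().create_namespaced_service(namespace="default", body=service)
print("Ollama deployment and service created")
```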
Show Notes
- My local LLM Git repo: Ollama_lab
- HELM leaderboard for model safety: Stanford HELM model leaderboard
- My talk at Oracle CloudWorld 2024: OCW2024LLM