
Scaling AI Safety Through Mentorship w/ Dr. Ryan Kidd

About this listen

What does it actually take to build a successful AI safety organization? I'm joined by Dr. Ryan Kidd, who has co-led MATS from a small pilot program to one of the field's premier talent pipelines. In this episode, he reveals the low-hanging fruit in AI safety field-building that most people are missing: the amplifier archetype.

I pushed Ryan on some hard questions, from balancing funder priorities with research independence to building a robust selection process for both mentors and participants. Whether you're considering a career pivot into AI safety or already working in the field, this conversation offers practical advice on how to actually make an impact.

Chapters

(00:00) - Intro
(08:16) - Building MATS Post-FTX & Summer of Love
(13:09) - Balancing Funder Priorities and Research Independence
(19:44) - The MATS Selection Process
(33:15) - Talent Archetypes in AI Safety
(50:22) - Comparative Advantage and Career Capital in AI Safety
(01:04:35) - Building the AI Safety Ecosystem
(01:15:28) - What Makes a Great AI Safety Amplifier
(01:21:44) - Lightning Round Questions
(01:30:30) - Final Thoughts & Outro

Links

MATS

Ryan's Writing
- LessWrong post - Talent needs of technical AI safety teams
- LessWrong post - AI safety undervalues founders
- LessWrong comment - Comment permalink with 2025 MATS program details
- LessWrong post - Talk: AI Safety Fieldbuilding at MATS
- LessWrong post - MATS Mentor Selection
- LessWrong post - Why I funded PIBBSS
- EA Forum post - How MATS addresses mass movement building concerns

FTX Funding of AI Safety
- LessWrong blogpost - An Overview of the AI Safety Funding Situation
- Fortune article - Why Sam Bankman-Fried's FTX debacle is roiling A.I. research
- NY Times article - FTX probes $6.5M in payments to AI safety group amid clawback crusade
- Cointelegraph article - FTX probes $6.5M in payments to AI safety group amid clawback crusade
- FTX Future Fund article - Future Fund June 2022 Update (archive)
- Tracxn page - Anthropic Funding and Investors

Training & Support Programs
- Catalyze Impact
- Seldon Lab
- SPAR
- BlueDot Impact
- YCombinator
- Pivotal
- Athena
- Astra Fellowship
- Horizon Fellowship
- BASE Fellowship
- LASR Labs
- Entrepreneur First

Funding Organizations
- Coefficient Giving (previously Open Philanthropy)
- LTFF
- Longview Philanthropy
- Renaissance Philanthropy

Coworking Spaces
- LISA
- Mox
- Lighthaven
- FAR Labs
- Constellation
- Collider
- NET Office
- BAISH

Research Organizations & Startups
- Atla AI
- Apollo Research
- Timaeus
- RAND CAST
- CHAI

Other Sources
- AXRP website - The AI X-risk Research Podcast
- LessWrong blogpost - Shard Theory: An Overview