Episodes

  • Ep 8 - How Airy Powers Real-Time AI Agents with Data Streaming and Confluent
    May 13 2025

    Agentic AI is transforming how modern systems interact with data, and Airy is at the forefront with its real-time approach.

    In this episode, Steffen Hoellinger, Co-founder and CEO of Airy, discusses the transformative impact of agentic AI for enterprises and why real-time data streaming is a crucial component. From Apache Flink® and Apache Kafka® to the importance of focusing on core business challenges, Steffen breaks down how Airy builds intelligent systems that react to data as it happens.

    You’ll learn:

    • How Airy’s team uses Confluent to simplify their data infrastructure
    • What agentic AI means beyond the buzzwords and why it’s a leap past basic natural language processing (NLP)
    • Why real-time interaction is essential for creating next-generation software experiences
    • How business users can create and test AI-driven patterns without writing any code

    If you’re building AI-powered apps—or planning to—this one’s for you.

    About the Guest:
    Steffen Hoellinger is the Co-founder and CEO of Airy, where he leads the development of open-source data infrastructure that connects real-time event streaming technologies like Apache Kafka® and Apache Flink® with large language models. Airy helps enterprises build AI agents and data copilots for both technical and business users across streaming and batch data. Steffen is also an active early-stage investor focusing on data infrastructure, AI, and deep technology.

    Guest Highlight:
    “Real-time is getting more and more important. Once you understand all the fancy ways people in the market are trying to solve a real-time data problem, you come to the realization that [data streaming] is the best way of doing things. It became natural to adopt it early on in our journey.”

    Episode Timestamps:
    *(05:10) - Overview of Airy’s AI Solutions
    *(37:20) - The Runbook: Tools & Tactics
    *(44:00) - Data Streaming Street Cred: Improve Data Streaming Adoption
    *(47:50) - Quick Bytes
    *(50:25) - Joseph’s Top 3 Takeaways

    Dive Deeper into Data Streaming:

    • EP1—Stream On: Unleashing Innovation with Data Streaming | Life Is But A Stream
    • EP2—Processing Without Pause: Continuous Stream Processing and Apache Flink® | Life Is But A Stream
    • EP3—The Connective Tissue: Shift Left to turn Data Chaos to Clarity | Life Is But A Stream

    Links & Resources:

    • Connect with Joseph: @thedatagiant
    • Joseph’s LinkedIn: linkedin.com/in/thedatagiant
    • Steffen’s LinkedIn: linkedin.com/in/hoellinger
    • Learn more at Confluent.io

    Our Sponsor:

    Your data shouldn’t be a problem to manage. It should be your superpower. The Confluent data streaming platform transforms organizations with trustworthy, real-time data that seamlessly spans your entire environment and powers innovation across every use case. Create smarter, deploy faster, and maximize efficiency with a true data streaming platform from the pioneers in data streaming. Learn more at confluent.io.

    53 mins
  • Ep 7 - Streaming for Impact: APFA's Real-Time Journey
    Apr 29 2025

    What does real transformation look like when the stakes are high? Todd Smitala, Technology Solutions Architect at the Association of Professional Flight Attendants (APFA), shares how his journey of modernizing data systems supports 27,000 union members.

    With the legacy infrastructure crumbling under peak traffic during a high-volume union vote, Todd’s team turned to cloud-based real-time streaming with Apache Kafka® on Confluent Cloud. The result? A more reliable, scalable system that delivered immediate impact, ensuring every vote counted. In this episode, Todd walks through the technical decisions and leadership alignment that made it possible.

    You’ll learn:

    • How real-time infrastructure solved a high-stakes voting challenge
    • What it took to modernize under pressure and bring leadership on board
    • How to frame cost, reliability, and system ownership to drive successful change

    If you're leading a data streaming initiative and need a blueprint for impact, this episode is for you.

    About the Guest:
    As a Technology Solutions Architect at APFA, Todd Smitala brings over 10 years of experience as a flight attendant and 20 years in IT, allowing Todd to bridge the gap between technology and the needs of flight attendants. He’s passionate about leveraging technology to create meaningful change, support union members, and enforce contractual legalities. This combination of industry insight and technical expertise empowers Todd to deliver innovations that make a real difference. Outside of work, Todd enjoys scuba diving, adventure travel, hiking, playing music, and dancing at Carnaval in Brazil.

    Guest Highlight:
    “We all agreed that Confluent Cloud would be the way we want to go because of how it does host. It does really handle all of the background stuff. I'm not a Java developer… We didn't have the time to invest in all of that so Confluent Cloud is actually perfect for this type of thing.”

    Episode Timestamps:
    *(01:22) - How APFA Is Using Data Streaming to Empower Members
    *(20:39) - The Runbook: Tools & Tactics
    *(26:45) - Data Streaming Street Cred: Improve Data Streaming Adoption
    *(28:30) - Quick Bytes
    *(33:36) - Joseph’s Top 3 Takeaways

    Dive Deeper into Data Streaming:

    • EP1—Stream On: Unleashing Innovation with Data Streaming | Life Is But A Stream
    • EP2—Processing Without Pause: Continuous Stream Processing and Apache Flink® | Life Is But A Stream
    • EP3—The Connective Tissue: Shift Left to turn Data Chaos to Clarity | Life Is But A Stream

    Links & Resources:

    • Connect with Joseph: @thedatagiant
    • Joseph’s LinkedIn: linkedin.com/in/thedatagiant
    • Todd’s LinkedIn: linkedin.com/in/tsmitala
    • Learn more at Confluent.io

    Our Sponsor:

    Your data shouldn’t be a problem to manage. It should be your superpower. The Confluent data streaming platform transforms organizations with trustworthy, real-time data that seamlessly spans your entire environment and powers innovation across every use case. Create smarter, deploy faster, and maximize efficiency with a true data streaming platform from the pioneers in data streaming. Learn more at confluent.io.

    36 mins
  • Ep 6 - From Roadblocks to Results: How Shared Vision Drives Data Streaming Success
    Apr 15 2025

    Whether you're an executive setting the strategy or an architect building the backbone, alignment is the key to turning transformation from a buzzword into results. Rick Hernandez, Principal Technical Architect at EY, shares how to unlock shared vision and turn it into enterprise-wide data streaming success.

    In this episode, Rick joins Joseph to explore how organizations can connect leadership ambition with technical execution, and how top-down buy-in, clearly defined objectives, and strategic alignment with the right technology can give your organization “wings to a tiger.”

    You’ll learn:

    • How to directly map business pain points to streaming-first solutions
    • How real-time infrastructure enables faster decisions and a competitive edge
    • What happens when stakeholders discover their challenges are already solvable through streaming

    If you're leading modernization efforts or championing real-time data, this episode is your guide for building momentum across teams.

    About the Guest:
    Rick Hernandez is a Principal Technical Architect at EY, specializing in enterprise architecture and digital transformation. He helps organizations implement innovative solutions and optimize IT strategies. Rick’s expertise includes architecture and development in middleware technologies, cloud integration work, business process management, enterprise application integration, and more.

    Guest Highlight:
    “Technology transformation is not for one area of a company. The overall company needs to actually change to do something better, differently, and more efficiently.”

    Episode Timestamps:
    *(01:00) - EY’s Data Streaming Strategy
    *(05:30) - Data Streaming Goodness: Having a Shared Vision
    *(22:00) - The Runbook: Tools & Tactics
    *(26:45) - Data Streaming Street Cred: Improve Data Streaming Adoption
    *(31:10) - Quick Bytes
    *(33:45) - Joseph’s Top 3 Takeaways

    Dive Deeper into Data Streaming:

    • EP1—Stream On: Unleashing Innovation with Data Streaming | Life Is But A Stream
    • EP2—Processing Without Pause: Continuous Stream Processing and Apache Flink® | Life Is But A Stream
    • EP3—The Connective Tissue: Shift Left to turn Data Chaos to Clarity | Life Is But A Stream

    Links & Resources:

    • Connect with Joseph: @thedatagiant
    • Joseph’s LinkedIn: linkedin.com/in/thedatagiant
    • Rick’s LinkedIn: linkedin.com/in/rick-hernandez-3298574
    • Learn more at Confluent.io

    Our Sponsor:

    Your data shouldn’t be a problem to manage. It should be your superpower. The Confluent data streaming platform transforms organizations with trustworthy, real-time data that seamlessly spans your entire environment and powers innovation across every use case. Create smarter, deploy faster, and maximize efficiency with a true data streaming platform from the pioneers in data streaming. Learn more at confluent.io.

    36 mins
  • Ep 5 - The Secret to Data Streaming Success: Speaking the Same Language
    Apr 1 2025

    Want your real-time data streaming initiative to stick? Success hinges on more than pipelines—it’s about people, governance, and business impact. Jeffrey Jonathan Jennings (J3), managing principal at signalRoom, shares how to bring it all together.

    In this episode, J3 shares how he’s used impactful proofs of concept to demonstrate value early, then scaled effectively by shifting left with governance and building stronger cross-team collaboration.

    You’ll learn about:

    • Why proofs of concept are key to securing buy-in and demonstrating ROI early
    • How data governance as a shared language creates consistency across teams
    • Strategies for establishing a data streaming center of excellence
    • The role of business outcomes in guiding streaming data adoption strategies

    If you’re building or scaling a data streaming practice, this episode goes beyond the technology, showing you how to drive real impact.

    About the Guest:
    Jeffrey Jonathan Jennings, managing principal of signalRoom, is a dedicated father, avid traveler, and EDM enthusiast whose creativity and energy shape both his personal and professional life. As a cloud-native data streaming expert, he specializes in integrating ML/AI technologies to drive transformative change and improve business outcomes. With a focus on innovation, he designs scalable data architectures that enable real-time insights and smarter decision-making. Committed to continuous learning, Jeffrey stays ahead of technological advancements to help businesses navigate the evolving digital landscape and achieve lasting growth.

    Guest Highlight:
    “We need to speak the same language. The only way to speak the same language is to have a Schema Registry. I don't think there's an option. You just have to do this. We share a common language and therefore we build common libraries.”

    Episode Timestamps:
    *(01:13) - J3’s Data Streaming Journey
    *(07:07) - Data Streaming Goodness: Strategies to Demonstrate Value
    *(26:38) - The Runbook: Data Streaming Center of Excellence
    *(37:00) - Data Streaming Street Cred: Improve Data Streaming Adoption
    *(42:35) - Quick Bytes
    *(45:00) - Joseph’s Top 3 Takeaways

    Dive Deeper into Data Streaming:

    • EP1—Stream On: Unleashing Innovation with Data Streaming | Life Is But A Stream
    • EP2—Processing Without Pause: Continuous Stream Processing and Apache Flink® | Life Is But A Stream
    • EP3—The Connective Tissue: Shift Left to turn Data Chaos to Clarity | Life Is But A Stream

    Links & Resources:

    • Connect with Joseph: @thedatagiant
    • Joseph’s LinkedIn: linkedin.com/in/thedatagiant
    • J3’s LinkedIn: linkedin.com/in/jeffreyjonathanjennings/
    • J3’s GitHub: github.com/j3-signalroom
    • “Fall in Love with the Problem, Not the Solution” by Uri Levine
    • “Ask Your Developer” by Jeff Lawson
    • Learn more at Confluent.io

    Our Sponsor:
    Your data shouldn’t be a problem to manage. It should be your superpower. The Confluent data streaming platform transforms organizations with trustworthy, real-time data that seamlessly spans your entire environment and powers innovation across every use case. Create smarter, deploy faster, and maximize efficiency with a true data streaming platform from the pioneers in data streaming. Learn more at confluent.io.

    48 mins
  • Ep 4 - From Legacy to Cutting-Edge: Henry Schein One's Data Streaming Vision
    Mar 18 2025

    Despite its value, legacy data can feel like a roadblock in a fast-paced digital world—Henry Schein One is clearing the path forward with real-time data streaming.

    In this episode, Chris Kapp, Software Architect at Henry Schein One (HS1), shares how his team modernizes data management to stay competitive and unlock real-time insights.

    You’ll learn about:

    • How a tagging strategy, an immutable audit log, and governance keep data secure and reliable
    • The challenges (and wins) of getting leadership buy-in for data modernization
    • HS1’s approach to decentralized data ownership, domain-driven design, and the importance of stream processing for scaling
    • The role of GenAI in the future of real-time stream processing

    Get ready to future-proof your data strategy with this must-listen episode for technology leaders facing scalability, governance, or integration challenges.

    About the Guest:
    Chris Kapp is a Software Architect at Henry Schein One specializing in domain-driven design and event-driven patterns. He has 34 years of experience in the software industry, including roles at Target and Henry Schein One. He is passionate about teaching patterns for scalable data architectures. He’s currently focused on the One-Platform initiative to allow Henry Schein applications to work together as a single suite of products.

    Guest Highlight:
    “It's important to collect the data, try to eliminate our biases and go towards what is delivering quickly. The key is to start small, agile, iterative, and build something small with the people that are excited and willing to learn new things. If it doesn't work, then be agile, adjust, and find the thing that does work.”

    Episode Timestamps:
    *(01:18) - Chris’ Data Streaming Journey
    *(03:35) - Data Streaming Goodness: AI-Driven Reporting & Data Streaming
    *(21:08) - The Playbook: Data Revitalization & Event-Driven Architecture
    *(31:55) - Data Streaming Street Cred: Executive Alignment & Engineering Collaboration
    *(32:03) - Quick Bytes
    *(40:14) - Joseph’s Top 3 Takeaways

    Links & Resources:

    • Connect with Joseph: @thedatagiant
    • Joseph’s LinkedIn: linkedin.com/in/thedatagiant
    • Chris’ LinkedIn: linkedin.com/in/chris-kapp-87868a4
    • Designing Event-Driven Systems eBook
    • Designing Data-Intensive Applications eBook
    • Current 2025—The Data Streaming Event
    • Learn more at Confluent.io

    Our Sponsor:
    Your data shouldn’t be a problem to manage. It should be your superpower. The Confluent data streaming platform transforms organizations with trustworthy, real-time data that seamlessly spans your entire environment and powers innovation across every use case. Create smarter, deploy faster, and maximize efficiency with a true data streaming platform from the pioneers in data streaming. Learn more at confluent.io.

    44 mins
  • Ep 3 - The Connective Tissue: Shift Left to turn Data Chaos to Clarity
    Mar 4 2025

    In the final episode of our 3-part series on the basics of data streaming, we take a deep dive into data integration—covering everything from data governance to data quality.

    Our guests, Mike Agnich, General Manager of Confluent's Data Streaming Platform, and David Araujo, Director of Product Management at Confluent, explain why connectors are must-haves for integrating systems.

    You’ll learn:

    • Why real-time ETL outperforms the old-school approach
    • How shifting left with governance saves time and pain later
    • The overlooked role of schemas in data quality
    • And more…

    About the Guests:

    Mike Agnich is the General Manager and VP of Product for Confluent's Data Streaming Platform (DSP). Mike manages a product portfolio that includes stream processing, connectors and integrations, governance, partnerships, and developer tooling. Over the last six years at Confluent, Mike has held various product leadership roles spanning Apache Kafka®, Confluent Cloud, and Confluent Platform, working closely with customers, partners, and R&D to drive adoption and execution of Confluent products. Prior to his work at Confluent, Mike was the founder and CEO of Terrain Data (acquired by Confluent in 2018).

    David Araujo is a Director of Product Management at Confluent, focusing on data governance with products such as Schema Registry, Data Catalog, and Data Lineage. He previously held positions at companies including Amobee, Turn, WeDo Technologies Australia, and Saphety, where he worked on various aspects of data management, analytics, and infrastructure. With a background in Computer Science from the University of Évora, David combines deep technical expertise with leadership experience in the tech industry.

    Guest Highlights:

    "If a ton of raw data shows up on your doorstep, it's like shipping an unlabeled CSV into a finance organization and telling them to build their annual forecast. By shifting that cleaning and structure into streaming, we remove a massive amount of toil for our organizations… Instead of punting the problem down to our analytics friends, we can solve it because we're the ones that created the data." - Mike Agnich

    "We've had data contracts in Kafka long before it became a buzzword—we called them schemas… But more recently, we've evolved this concept beyond just schemas. In streaming, a data contract is an agreement between producers and consumers on both the structure (schema) and the semantics of data in motion. It serves as a governance artifact, ensuring consistency, reliability, and quality while providing a single source of truth for understanding streaming data." - David Araujo

    Links & Resources:

    • Connect with Joseph: @thedatagiant
    • Joseph’s LinkedIn: linkedin.com/in/thedatagiant
    • Mike’s LinkedIn: linkedin.com/in/magnich
    • David’s LinkedIn: linkedin.com/in/davidaraujo
    • What Is a Data Streaming Platform (DSP)
    • Learn more at Confluent.io

    Episode Timestamps:

    *(02:00) - Mike and David’s Journey in Data Streaming
    *(13:55) - Data Streaming 101: Data Integration
    *(40:06) - The Playbook: Tools & Tactics for Data Integration
    *(53:25) - Voices from the World of Data Streaming
    *(59:33) - Quick Bytes
    *(1:05:20) - Joseph’s Top 3 Takeaways

    Our Sponsor:

    Your data shouldn’t be a problem to manage. It should be your superpower. The Confluent Data Streaming Platform transforms organizations with trustworthy, real-time data that seamlessly spans your entire environment and powers innovation across every use case. Create smarter, deploy faster, and maximize efficiency with a true Data Streaming Platform from the pioneers in data streaming. Learn more at confluent.io

    1 hr and 8 mins
  • Ep 2 - Processing Without Pause: Continuous Stream Processing and Apache Flink®
    Feb 18 2025

    We’re diving even deeper into the fundamentals of data streaming to explore stream processing—what it is, the best tools and frameworks, and its real-world applications.

    Our guests, Anna McDonald, Distinguished Technical Voice of the Customer at Confluent, and Abhishek Walia, Staff Customer Success Technical Architect at Confluent, break down what stream processing is, how it differs from batch processing, and why tools like Flink are game changers.

    You’ll learn:

    • The key differences between stream and batch processing
    • How different frameworks like Flink, Kafka Streams, and ksqlDB approach stream processing
    • The role of POCs and observability in real-time data workflows
    • And more…

    About the Guests:

    Anna McDonald is the Distinguished Technical Voice of the Customer at Confluent. She loves designing creative solutions to challenging problems. Her focus is on event-driven architectures, reactive systems, and Apache Kafka®.

    Abhishek Walia is a Staff Customer Success Technical Architect at Confluent. He has years of experience implementing innovative, performance-driven, and highly scalable enterprise-level solutions for large organizations. Abhishek specializes in architecting, designing, developing, and delivering integration solutions across multiple platforms.

    Guest Highlights:

    “Flink is more approachable because it blends approaches together and says, ‘If you need this, you still can use this.’ It's the most powerful at this point.” - Abhishek Walia

    “If you're somebody who's ever gone from normal to eventing, at some point you probably would have gone, ‘When does [the data] stop?’ It doesn't stop.” - Anna McDonald

    “Start with a fully managed service. That's probably going to save a lot of cycles for you.” - Abhishek Walia

    Episode Timestamps:

    *(01:35) - Anna & Abhishek’s Journey in Data Streaming
    *(12:30) - Data Streaming 101: Stream Processing
    *(26:30) - The Playbook: Tools & Tactics for Stream Processing
    *(50:20) - Voices from the World of Data Streaming
    *(56:13) - Quick Bytes
    *(58:57) - Top 3 Takeaways

    Links & Resources:

    • Connect with Joseph: @thedatagiant
    • Joseph’s LinkedIn: linkedin.com/in/thedatagiant
    • Anna’s LinkedIn: linkedin.com/in/jbfletch
    • Abhishek’s LinkedIn: linkedin.com/in/abhishek-walia
    • Introducing Derivative Event Sourcing
    • Designing Event-Driven Systems
    • Learn more at Confluent.io

    Our Sponsor:

    Your data shouldn’t be a problem to manage. It should be your superpower. The Confluent Data Streaming Platform transforms organizations with trustworthy, real-time data that seamlessly spans your entire environment and powers innovation across every use case. Create smarter, deploy faster, and maximize efficiency with a true Data Streaming Platform from the pioneers in data streaming. Learn more at confluent.io.

    1 hr and 1 min
  • Ep 1 - Stream On: Unleashing Innovation with Data Streaming
    Feb 4 2025

    Real-time data streaming is shaking up everything we know about modern data systems. If you’re ready to dive in but unsure where to begin, no worries. That’s why we’re here.

    Our first episode breaks down the basics of data streaming—from what it is, to its pivotal role in processing and transferring data in a fast-paced digital environment. Your guide is Tim Berglund, VP of Developer Relations at Confluent, where he and his team work to make streaming data and its emerging toolset accessible to all developers.

    You’ll learn:

    • The fundamentals of data streaming
    • Data streaming advantages vs. other technologies
    • What an Event-Driven Architecture (EDA) is
    • And much more…

    About the Guest:

    Tim Berglund serves as the VP of Developer Relations at Confluent, where he and his team work to make streaming data and its emerging toolset accessible to all developers. He is a regular speaker at conferences and a presence on YouTube explaining complex technology topics in an accessible way.

    Guest Highlights:

    “The basic intellectual habit that you have in building a data streaming system isn't first, ‘What are the things?’ But it's, ‘What is happening?’”

    “With batch processing, I’ve got my data in a pile—I know where it starts, where it ends, and I can work through it. With streaming, it’s not a pile—it’s a pipe.”

    “The future of data streaming is real-time everything—flows, insights, and actions. There’s no more ‘take the data here and think about it later.’ The insight is now, ready to be consumed by anyone who needs it. Businesses built on this model can respond to the world as it changes, right away.”

    Episode Timestamps:

    *(01:44) - Tim’s Journey in Data Streaming
    *(14:35) - Data Streaming 101: Unlocking the Power of Data
    *(38:56) - The Playbook: Tools & Tactics for Data Streaming
    *(49:00) - Voices from the World of Data Streaming
    *(53:35) - Quick Bytes
    *(57:10) - Top 3 Takeaways

    Links & Resources:

    • Connect with Joseph: @thedatagiant
    • Joseph’s LinkedIn: linkedin.com/in/thedatagiant
    • Tim’s LinkedIn: linkedin.com/in/tlberglund
    • Explore the 2024 Data Streaming Report
    • Learn more at Confluent.io

    Our Sponsor:

    Your data shouldn’t be a problem to manage. It should be your superpower. The Confluent Data Streaming Platform transforms organizations with trustworthy, real-time data that seamlessly spans your entire environment and powers innovation across every use case. Create smarter, deploy faster, and maximize efficiency with a true Data Streaming Platform from the pioneers in data streaming. Learn more at confluent.io.

    1 hr and 1 min