BABL AI is pleased to share that CEO Dr. Shea Brown was featured on the latest episode of the AI with Sach podcast, titled “Building Trust Through Responsible AI.” Hosted by industry expert Sach, the episode explores the fast-changing world of AI regulation, audits, and responsible AI practices.
During the conversation, Dr. Brown discusses how organizations can center human flourishing through strong AI auditing and risk assessment programs. Drawing on his background as an astrophysicist turned AI auditing leader, he explains why trust and transparency must be built into AI systems from the start. He also addresses a real challenge many organizations face: balancing innovation with compliance.
Key Topics Covered:
- The role of AI audits in fostering transparency and accountability.
- Strategies for risk assessment and mitigation in AI systems.
- Insights into the regulatory landscape, including the importance of compliance and its impact on innovation.
- Practical advice for organizations to implement responsible AI development practices.
Shea Brown’s expertise and thought leadership have positioned BABL AI as a trusted voice in the field of responsible AI. His work underscores the importance of ethical practices and highlights the proactive measures companies can take to navigate the complexities of AI governance.
About BABL AI:
Since 2018, BABL AI has been auditing and certifying AI systems, consulting on responsible AI best practices, and offering online education on related topics. BABL AI’s mission is to ensure that all algorithms are developed, deployed, and governed in ways that prioritize human flourishing.
About AI with Sach:
AI with Sach is a podcast that dives into the transformative power of AI across industries. Through in-depth conversations with CEOs, directors, and visionary thinkers, the podcast offers exclusive insights into the strategic use of AI, its real-world impact, and the opportunities and challenges of this technological revolution.