What’s in Store for AI Auditing in 2023?

012. Lunchtime BABLing – What’s in store for AI Auditing in 2023?

Today on Lunchtime BABLing, Shea reflects on recent meetings, events, and announcements, including:

1. Public hearing for NYC Local Law No. 144

2. European Commission Workshop on auditing for the DSA

3. New AI laws and guidelines (e.g. NIST AI RMF, NJ, and NY laws)

Shea follows it up with his thoughts on the training needed for AI auditing, and why 2023 is when AI and Algorithm Auditing goes mainstream.

Available on YouTube, Simplecast, and all major Podcast streaming platforms. 

AI Audit & Assurance

011. Lunchtime BABLing – AI Audit & Assurance

On this episode of Lunchtime BABLing, Shea talks about AI Audit & Assurance, and where it fits into the emerging regulatory landscape.

1. What laws, regulations, and guidelines are driving the need for AI audit and assurance?

2. What the ecosystem looks like, and where Shea thinks it's going (he might mention your company here)

3. What is different about algorithm auditing as compared to other types of audit and assurance?

Available on YouTube, Simplecast, and all major Podcast streaming platforms.

Breaking into AI Ethics Consulting

010. Lunchtime BABLing – Breaking into AI Ethics Consulting

How can you apply the skills you already have to the emerging field of AI ethics, governance, and policy consulting? In this edition of Lunchtime BABLing, Shea Brown talks about his experience and thoughts on finding your unique niche in the industry.

Available on YouTube, Simplecast, and all major Podcast streaming platforms. 

The Future of AI Regulation in the US & Europe

009. Lunchtime BABLing – The Future of AI Regulation in the US and Europe

On this week’s Lunchtime BABLing, we’re talking with Merve Hickok, a leading voice in AI policy and regulation. We discuss the future of AI regulation, especially the EU AI Act. Topics include:

1. How can regulations best protect fundamental rights?

2. What will regulations require of companies and governments?

3. Why are responsible AI practices crucial for businesses?

4. What can companies do now to ensure they're on the right path?

Merve’s LinkedIn here.

Free resources for Responsible AI here.

Available on YouTube, Simplecast, and all major Podcast streaming platforms.

New Proposed Rules, Again! | NYC Algorithm Hiring Law

008. Lunchtime BABLing – New Proposed Rules, Again! | NYC Algorithm Hiring Law

New York City’s Local Law 144 requires independent bias audits for automated employment decision tools (AEDT) used to substantially assist or replace decisions in hiring or promotion. Enforcement has been pushed back to April 15, 2023.

In another episode of our weekly mini-webinar series, BABL AI’s CEO Shea Brown discusses the newly released proposed rules (again!) and how they may affect companies’ obligations, including:

1. What is considered an AEDT under the law

2. Clarified relationship between the vendors that create the tools and the employers that use them (and must comply with the law)

3. New language around what an “independent auditor” means

4. Open Q&A session

Available on YouTube, Simplecast, and all major Podcast streaming platforms.

AI and Research Ethics

007. Lunchtime BABLing – AI and Research Ethics

In today’s episode of Lunchtime BABLing, Shea Brown invites Borhane Blili-Hamelin, PhD to discuss some surprising parallels between the challenge of putting AI ethics into practice in industry versus research settings!

This conversation is inspired by ongoing joint work by Borhane Blili-Hamelin & Leif Hancox-Li, which was presented at a NeurIPS 2022 workshop on November 28 and December 5. Read our work-in-progress paper here.

Available on YouTube, Simplecast, and all major Podcast streaming platforms.

Process Audit for Disparate Impact Testing

006. Lunchtime BABLing – Process Audit for Disparate Impact Testing

In this week’s episode, BABL AI’s CEO Shea Brown discusses what a process audit is, and how it can be used to verify disparate impact testing conducted by employers and vendors.

New York City’s Local Law 144 requires independent bias audits for automated employment decision tools (AEDT) used to substantially assist or replace decisions in hiring or promotion. The law comes into effect on Jan 1, 2023.
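For context, the core quantity behind this kind of disparate impact testing is the impact ratio: each category’s selection rate divided by the selection rate of the most-selected category. The sketch below is a minimal, hypothetical illustration in Python (the group names and data are made up, and this is not BABL AI’s audit methodology); the categories and reporting requirements that actually apply are defined in the law and its rules.

```python
# Minimal sketch of an impact-ratio calculation for a selection-style AEDT,
# using hypothetical applicant data (not BABL AI's methodology).
from collections import Counter

# Hypothetical records: (demographic category, was the candidate selected?)
records = [
    ("group_a", True), ("group_a", False), ("group_a", True),
    ("group_b", True), ("group_b", False), ("group_b", False),
]

totals = Counter(cat for cat, _ in records)          # applicants per category
selected = Counter(cat for cat, sel in records if sel)  # selections per category

# Selection rate per category, then each rate relative to the highest rate.
rates = {cat: selected[cat] / totals[cat] for cat in totals}
top_rate = max(rates.values())
impact_ratios = {cat: rate / top_rate for cat, rate in rates.items()}

print(rates)          # e.g. {'group_a': 0.67, 'group_b': 0.33}
print(impact_ratios)  # ratios relative to the most-selected category
```

In practice, an auditor would also verify how the underlying data were collected and which categories were included, which is where the process audit discussed in this episode comes in.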

Listen on YouTube, Simplecast, and all major Podcast streaming platforms.

Algorithmic Auditing International Conference | Recap

005. Lunchtime BABLing – Algorithmic Auditing International Conference | Recap

In the fourth installment of our weekly mini-webinar series, BABL AI’s CEO Shea Brown discusses insights from the “Algorithmic Auditing International Conference”, held 8–10 November 2022 in Barcelona and Brussels and hosted by the firm Eticas.

Listen on YouTube, Simplecast, and all major Podcast streaming platforms. 

Ethical Risk & Impact Assessments for AI Systems

004. Lunchtime BABLing – Ethical Risk & Impact Assessments for AI Systems

A number of forthcoming laws and regulations governing the use and development of AI will require mandatory risk or impact assessments.

This week BABL AI’s CEO Shea Brown discusses what an ethical risk assessment is, and how your organization can implement one today.

Listen on YouTube, Simplecast, and all major Podcast streaming platforms.

What is an Algorithmic Bias Audit?

003. Lunchtime BABLing – What is an Algorithmic Bias Audit?

New York City’s Local Law 144 requires independent bias audits for automated employment decision tools (AEDT) used to substantially assist or replace decisions in hiring or promotion. Although the law comes into effect on Jan 1, 2023, several aspects of the bias audit have yet to be clarified.

In the second installment of our weekly mini-webinar series, BABL AI’s CEO Shea Brown discusses what an algorithmic bias audit is, including:

1. Bias audit basics

2. Differences between employer and vendor audits

3. Open Q&A session

Listen on YouTube, Simplecast, and all major Podcast streaming platforms.