New Guide Prepares Internal Auditors for EU AI Act Compliance

Written by Jeremy Werner

Jeremy is an experienced journalist, skilled communicator, and constant learner with a passion for storytelling and a track record of crafting compelling narratives. He has a diverse background in broadcast journalism, AI, public relations, data science, and social media management.
Posted on 03/26/2025
In News

A new guide published by the European Confederation of Institutes of Internal Auditing (ECIIA) provides a comprehensive roadmap for internal auditors tasked with preparing organizations for compliance with the EU AI Act. Titled “The AI Act: Road to Compliance,” the report is tailored to help auditors navigate complex regulatory obligations, assess AI-related risks, and evaluate internal control frameworks.


The EU AI Act, which entered into force in August 2024, introduces sweeping regulations for AI systems based on a risk-based classification. Prohibited AI systems that violate fundamental rights are banned outright, while high-risk systems—such as those used in biometric surveillance, employment, credit scoring, or law enforcement—must meet stringent requirements. Limited and minimal-risk systems face lighter obligations, primarily related to transparency.


The guide outlines how internal auditors can support organizations by building AI registries, classifying systems based on risk, and ensuring compliance deadlines are met. It also emphasizes the importance of understanding an organization’s role in the AI value chain, as providers and deployers face different responsibilities. For example, a deployer could become a provider if it significantly modifies or rebrands an AI system, triggering a higher compliance burden.


The AI Act’s phased implementation includes key deadlines through 2027, starting with the ban on prohibited AI practices in February 2025 and extending to requirements for most other systems by August 2026, with certain high-risk obligations phasing in through 2027. General Purpose AI (GPAI) models with systemic risk face additional scrutiny, including mandatory notification to the European Commission and rigorous documentation and testing protocols.


A supporting ECIIA survey of more than 40 companies found that while 57% are already deploying or implementing AI systems, only 28% have defined a technological architecture, and fewer than half have internal regulations for AI or generative AI. Moreover, most internal audit departments lack dedicated IT auditors and have limited familiarity with the AI Act, underscoring the need for urgent training and capacity building.


The ECIIA calls on internal auditors to take a proactive role in leading their organizations toward responsible AI use. The report stresses that auditing AI is not merely a future concern—it’s an immediate governance priority that requires internal audit departments to upskill, adopt new frameworks, and ensure AI is deployed ethically and lawfully.


Need Help?


If you’re concerned or have questions about how to navigate the EU or global AI regulatory landscape, don’t hesitate to reach out to BABL AI. Their Audit Experts can offer valuable insight and ensure you’re informed and compliant.


Subscribe to our Newsletter

Keep up with the latest on BABL AI, AI auditing, and AI governance news by subscribing to our newsletter.