Key Takeaways from the EU AI Act Before It Is Finalized

Written by Jeremy Werner

Jeremy is an experienced journalist, skilled communicator, and constant learner with a passion for storytelling and a track record of crafting compelling narratives. He has a diverse background in broadcast journalism, AI, public relations, data science, and social media management.
Posted on 01/18/2024
In Blog

UPDATE – JULY 2025: The EU AI Act officially entered into force on August 1, 2024, with compliance obligations rolling out in phases through 2027. Prohibited AI practices became enforceable on February 2, 2025, and requirements for General-Purpose AI (GPAI) and high-risk systems are scheduled for August 2025 and 2026, respectively. Businesses operating in or interacting with the EU should begin compliance efforts immediately, particularly for high-risk systems requiring conformity assessments, risk documentation, and oversight frameworks.

 

OLD BLOG POST:

 

Why the EU AI Act Is Reshaping Compliance in the Global AI Landscape

 

2024 may well be dominated by the EU AI Act, a landmark piece of legislation that will shape how AI is regulated globally. It is poised to impact AI much as the GDPR reshaped privacy regulation.

 

What Is the EU AI Act?

The EU AI Act is the world’s first comprehensive legal framework for artificial intelligence. It introduces new rules designed to ensure safety, transparency, and accountability in the development and use of AI. Critically, its scope is extraterritorial: if your company interacts with users in the EU—even from outside Europe—you’ll need to comply.

Who Is Affected?

The Act impacts a wide range of stakeholders:

  • Developers must ensure their systems meet EU requirements if they want access to the EU market.

  • Deployers and users of AI tools are also subject to compliance—even if they didn’t create the systems.

  • Professionals working with AI need to understand how risk-based obligations affect their workflows.

  • Consumers will experience greater transparency, including new disclosures and consent mechanisms.

AI systems are categorized into four risk tiers: prohibited (unacceptable risk), high-risk, limited-risk, and minimal-risk. Compliance obligations increase alongside the risk level.

What Is a Conformity Assessment?

A conformity assessment certifies that an organization’s AI systems comply with the Act. There are two types:

  • Internal conformity assessment: Companies conduct a self-audit and attest to compliance.

  • Independent third-party assessment: Required for certain high-risk systems, this involves an accredited external body verifying compliance.

Exact thresholds and procedures continue to evolve, but for some categories of high-risk AI, external certification will be mandatory.

What Are the Penalties?

Noncompliance can result in steep fines:

  • Up to €35 million or 7% of global annual turnover for the most severe violations, such as use of prohibited AI practices.

  • Up to €15 million or 3% of turnover for noncompliance with most other obligations.

  • Up to €7.5 million or 1% of turnover for supplying incorrect or misleading information to authorities.
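As a rough illustration (not legal guidance), the top-tier penalty is structured as a fixed amount or a percentage of worldwide annual turnover, whichever is higher for large enterprises. A minimal Python sketch of that calculation, with illustrative figures:

```python
def max_fine_eur(annual_turnover_eur: float, fixed_cap_eur: float, turnover_pct: float) -> float:
    """Maximum possible fine: the higher of the fixed amount or the
    stated percentage of worldwide annual turnover."""
    return max(fixed_cap_eur, annual_turnover_eur * turnover_pct)

# Top tier (e.g. prohibited practices): up to EUR 35M or 7% of turnover.
# For a company with EUR 1B turnover, 7% (EUR 70M) exceeds the fixed cap.
print(max_fine_eur(1_000_000_000, 35_000_000, 0.07))
```

For smaller companies the fixed amount dominates, which is why the percentage-based cap matters most to large multinationals.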

How Should Organizations Prepare?

To prepare for compliance, organizations should:

  1. Assign a compliance lead or team.

  2. Inventory all AI systems (developed, procured, embedded).

  3. Categorize by risk level (prohibited, high-risk, limited, minimal).

  4. Conduct risk assessments for high-risk systems.

  5. Establish internal governance, including quality and risk management systems.

  6. Prepare documentation for technical details, usage logs, and training data.

  7. Engage external experts for conformity assessments and high-impact systems.

  8. Implement monitoring and update mechanisms.
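Steps 2 and 3 of this checklist (inventorying and risk-categorizing AI systems) can be sketched as a simple data structure. The record fields, tier names, and example systems below are illustrative assumptions, not anything prescribed by the Act:

```python
from dataclasses import dataclass
from enum import Enum

class RiskTier(Enum):
    PROHIBITED = "prohibited"
    HIGH = "high-risk"
    LIMITED = "limited"
    MINIMAL = "minimal"

@dataclass
class AISystemRecord:
    name: str
    origin: str             # "developed", "procured", or "embedded"
    risk_tier: RiskTier
    assessed: bool = False  # has a risk assessment been completed?

# Hypothetical inventory of AI systems in use at an organization
inventory = [
    AISystemRecord("resume-screener", "procured", RiskTier.HIGH),
    AISystemRecord("support-chatbot", "developed", RiskTier.LIMITED),
]

# High-risk systems still awaiting a risk assessment (checklist step 4)
needs_assessment = [s for s in inventory if s.risk_tier is RiskTier.HIGH and not s.assessed]
print([s.name for s in needs_assessment])  # ['resume-screener']
```

Even a lightweight register like this makes it easier to show regulators which systems fall into which tier and what assessments remain outstanding.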

When Does It Take Effect?

The Act is being rolled out in stages:

  • February 2, 2025: Prohibited AI practices became illegal.

  • August 2, 2025: Obligations for GPAI models take effect, along with governance rules and oversight by the EU AI Office and national authorities.

  • August 2, 2026: High-risk AI system compliance becomes mandatory.

  • August 2, 2027: Final deadlines for some high-risk systems and implementation standards.
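A small sketch of how an organization might track which of these milestones already apply on a given date. The dates come from the schedule above; the helper function and descriptions are our own:

```python
from datetime import date

# Rollout milestones from the phased schedule above
MILESTONES = {
    date(2025, 2, 2): "Prohibited AI practices enforceable",
    date(2025, 8, 2): "GPAI model obligations begin",
    date(2026, 8, 2): "High-risk AI compliance mandatory",
    date(2027, 8, 2): "Final deadlines for remaining high-risk systems",
}

def obligations_in_force(today: date) -> list[str]:
    """List every milestone whose deadline has already passed."""
    return [desc for day, desc in sorted(MILESTONES.items()) if day <= today]

print(obligations_in_force(date(2025, 9, 1)))
```

Checking a date between the second and third milestones, for example, would return only the first two obligations.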

Given the scale and complexity of the requirements, organizations should act now to avoid missing key milestones.

 

The details of the EU AI Act are onerous and can be overwhelming, so don’t hesitate to reach out to BABL AI. Their team of Audit Experts can provide valuable insights on the EU AI Act and other global regulations.

Subscribe to our Newsletter

Keep up with the latest on BABL AI, AI Auditing and
AI Governance news by subscribing to our newsletter.