Key Takeaways from the EU AI Act Before It Is Finalized

Written by Jeremy Werner

Jeremy is an experienced journalist, skilled communicator, and constant learner with a passion for storytelling and a track record of crafting compelling narratives. He has a diverse background in broadcast journalism, AI, public relations, data science, and social media management.
Posted on 01/18/2024

The EU AI Act may dominate 2024. It is emerging as a major piece of legislation that will regulate the use of AI globally, and it is poised to reshape the AI landscape much as GDPR reshaped privacy regulation.

 

What is the EU AI Act and why is it important?

 

The Act matters because it will require any company that wants to operate in Europe, or whose AI systems reach people in the EU from anywhere in the world, to meet its requirements when developing or using AI systems. It will increase transparency around the use of AI through new disclosure requirements and strengthen protections for consumers interacting with AI systems. Overall, the EU AI Act will bring major changes to the AI landscape.

 

Who is affected by the EU AI Act?

 

  1. Companies that develop AI systems will need to ensure their systems comply with the act if they want to sell them in the EU market.
  2. Companies that use AI tools and systems will also need to comply, even if they didn’t create the system. The law is extraterritorial, meaning it applies to the use of AI on people in the EU even by companies outside of Europe.
  3. Professionals working with AI systems will need to understand the new requirements and processes for compliance.
  4. General consumers will notice changes in transparency around AI, with new disclosures and consent requirements when interacting with AI systems.
  5. The obligations vary based on the level of risk an AI system poses, with prohibited AI, high-risk AI, and low-risk AI treated differently. But overall the impact is extensive.

 

What is a conformity assessment and how does an organization get one?

 

A conformity assessment is essentially a certification that an organization’s AI systems and internal processes comply with the requirements of the EU AI Act. There are two types:

 

  1. Internal conformity assessment: The organization performs its own audit and self-attests that its AI practices meet the act’s obligations.
  2. Independent third-party assessment: An external independent body audits the organization’s AI systems and processes and certifies whether they are compliant. This is more thorough.

 

To obtain a third-party conformity assessment, organizations will need to work with external bodies that are approved to conduct these audits and issue certifications. The specific requirements around when third-party versus self-assessments will be permitted are still being finalized.

 

What are the penalties for non-compliance?

 

A crucial element of the EU AI Act will be the imposition of fines for AI systems found in violation. Fines will depend on the risk level, the severity of the non-compliance, and the size of the company. They will range from 35 million Euros or 7% of global annual turnover for the most serious violations down to 7.5 million Euros or 1.5% for lesser infringements.
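As a rough sketch of how these caps combine, each penalty tier pairs a fixed amount with a percentage of global annual turnover, and for larger companies the higher of the two applies. The function name and the "whichever is higher" combination rule below are illustrative assumptions about the final text, not a legal calculation:

```python
def fine_cap_eur(turnover_eur: float, fixed_cap_eur: float, pct_cap: float) -> float:
    """Illustrative only: combine a fixed cap with a percentage-of-turnover
    cap, taking whichever is higher (assumed rule for larger companies)."""
    return max(fixed_cap_eur, pct_cap * turnover_eur)

# Most serious violations: up to EUR 35M or 7% of turnover.
# For a company with EUR 1B turnover, the 7% cap dominates.
print(fine_cap_eur(1_000_000_000, 35_000_000, 0.07))  # 70000000.0
```

For a smaller company, the fixed amount can exceed the percentage cap, which is why both figures appear in each tier.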

 

What steps should an organization take to become compliant?

 

The key steps an organization should take are:

  1. Designate an internal role or team responsible for EU AI Act compliance.
  2. Conduct an inventory of all AI systems used, including internally developed, procured from vendors, and AI embedded in software tools.
  3. Categorize each system based on risk level (prohibited, high-risk, low-risk).
  4. Conduct in-depth risk assessments for high-risk systems.
  5. Document risks, mitigations, governance processes.
  6. Implement controls like quality management systems and risk management systems.
  7. Seek help from external experts for complex steps like risk analysis and conformity assessments.
  8. Continuously monitor systems and update documentation.
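The inventory and categorization steps above (steps 2–4) can be sketched as a simple data model. All of the names, fields, and risk labels here are illustrative assumptions for bookkeeping purposes, not structures prescribed by the Act:

```python
from dataclasses import dataclass
from enum import Enum

class RiskLevel(Enum):
    PROHIBITED = "prohibited"
    HIGH = "high-risk"
    LOW = "low-risk"

@dataclass
class AISystem:
    name: str
    source: str        # e.g. "internal", "vendor", or "embedded in a tool"
    risk: RiskLevel
    assessed: bool = False  # has an in-depth risk assessment been completed?

def needs_assessment(inventory: list[AISystem]) -> list[AISystem]:
    """Return high-risk systems still awaiting an in-depth risk assessment."""
    return [s for s in inventory if s.risk is RiskLevel.HIGH and not s.assessed]

inventory = [
    AISystem("resume-screener", "vendor", RiskLevel.HIGH),
    AISystem("spam-filter", "internal", RiskLevel.LOW),
]
print([s.name for s in needs_assessment(inventory)])  # ['resume-screener']
```

Keeping the inventory in a structured form like this makes the later steps (documentation, monitoring, updating) easier to track as systems are added or re-categorized.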

 

What is the timeline for compliance?

 

The EU AI Act will come into force in stages, with prohibited AI system compliance required by 2025 and high-risk AI systems compliance required by 2026. Given the extensive obligations for high-risk systems, organizations should start their compliance efforts immediately. They will need time to inventory systems, conduct assessments, implement controls and get certified. Waiting too long could put them at risk of missing deadlines and facing penalties. Experts recommend beginning the work now to understand gaps and become compliant.

 

The EU AI Act's obligations are onerous and can be overwhelming, so don’t hesitate to reach out to BABL AI. Their team of audit experts can provide valuable insights on the EU AI Act and other global regulations.
