How our Audit works
Our EU AI Act Conformity Assessment Preparation Audit proceeds in three stages:
- Scoping
- Fieldwork
- Final Report
FAQs
How can an organization get a Conformity Assessment for the EU AI Act?
The rules for the EU AI Act have not yet been finalized, so no organization can currently obtain a Conformity Assessment. However, many of the required elements of these Conformity Assessments are already publicly available, and BABL AI can help organizations prepare by performing an EU AI Act Conformity Assessment Preparation Audit.
What is the EU AI Act?
The EU Artificial Intelligence Act proposes a comprehensive framework for categorizing and overseeing artificial intelligence applications according to their potential for harm. The framework distinguishes three main categories: prohibited practices, high-risk AI systems, and other AI applications. High-risk AI systems, as defined by the Act, are those presenting substantial risks to public health, safety, or fundamental human rights. Before they can be placed on the market, they must undergo a mandatory conformity assessment, which the provider performs through self-assessment.
What is a Conformity Assessment?
A conformity assessment is the process that demonstrates to the public that an AI system meets the relevant requirements of the EU AI Act.
Is this audit a Conformity Assessment?
Not yet. The rules for a Conformity Assessment have not been finalized. This audit process is designed to help your organization be fully prepared for a Conformity Assessment once the rules are finalized.
What is a “high-risk” AI system?
As defined in the EU AI Act, high-risk systems are those that present substantial risks to public health, safety, or fundamental human rights. These systems include, but are not limited to, those used for:
- Biometric identification and categorization of natural persons
- Management and operation of critical infrastructure
- Education and vocational training
- Employment, workers management, and access to self-employment
- Access to and enjoyment of essential private services, and public services and benefits
- Law enforcement
- Migration, asylum, and border control management
- Administration of justice and democratic processes
Who does the EU AI Act apply to?
The regulation applies to AI system stakeholders whose products are available on the EU market or impact individuals within the EU, whether the organization is public or private, and whether it operates inside or outside the EU. The regulation outlines specific obligations for these stakeholders:
- Providers: those who develop an AI system, or have one developed for them, and place it on the market under their own name or brand.
- Importers: those who bring AI systems from outside the EU and place them on the EU market under the original name or brand.
- Distributors: those who make AI systems available on the EU market without modifying them; they are neither the providers nor the importers.
- Users: people or organizations that use AI systems, except for personal, non-professional use.
Each role has its own responsibilities under the regulation.
Note: A provider, such as the developer of a candidate-screening tool, bears different obligations than a user, such as a bank procuring that tool.
Which obligations do AI system providers have under the EU AI Act?
Legal requirements pertaining to providers include areas such as:
- Quality management system
- Risk management system
- Data governance
- Technical documentation and transparency
- Human oversight
- Post-market monitoring system, among others
Reach out to us today for a free consultation!