Brussels Privacy Hub and Over 100 Academics Sign Appeal for the EU AI Act

Written by Jeremy Werner

Jeremy is an experienced journalist, skilled communicator, and constant learner with a passion for storytelling and a track record of crafting compelling narratives. He has a diverse background in broadcast journalism, AI, public relations, data science, and social media management.
Posted on 09/15/2023
In News

Before the European Parliament gathers for several planned meetings next week, more than 130 academics and the Brussels Privacy Hub are hoping to get its attention. They have signed an appeal concerning the Harmonised Rules on Artificial Intelligence, known as the EU AI Act, calling for the legislation to require a fundamental rights impact assessment (FRIA). A FRIA is a process for systematically assessing the potential impacts that a policy, AI system, or other technology or initiative may have on human rights. It typically evaluates impacts on rights such as privacy, non-discrimination, and freedom of expression; considers the effects on potentially affected groups; and analyzes whether the policy or technology aligns with human rights laws. A FRIA would also identify and mitigate risks early in the design and deployment of an AI system. The overall goal is to embed respect for rights and laws into governance and systems.

While protections for fundamental rights are already in the EU AI Act, the press release accompanying the appeal warns that those rights could be weakened when the legislation is negotiated. The appeal also asks that a FRIA cover both private and public sector AI, with independent oversight and transparency. The signers believe the FRIA should evaluate the impacts that high-risk AI systems may have on fundamental rights, and that it should include clear parameters, public summaries, the involvement of independent public authorities in assessments, and the participation of affected users. The appeal adds that the FRIA would complement existing impact assessments already in place, such as those under the General Data Protection Regulation (GDPR).

The appeal, signed by academics and experts in technology, law, and policy at dozens of institutions, concludes that the signers believe a FRIA is pivotal to the EU AI Act, and that including one would uphold the European Union's commitment to human rights and its values. It ends by noting that they will circulate a more detailed report in the coming days explaining their view of best practices for regulating FRIAs.

If you have questions on how this could affect your company, reach out to BABL AI. They can answer all your questions and more.

Subscribe to our Newsletter

Keep up with the latest on BABL AI, AI Auditing and
AI Governance News by subscribing to our newsletter