U.S. Commission on Civil Rights Releases Report on the Civil Rights Implications of Federal Use of Facial Recognition Technology

Written by Jeremy Werner

Jeremy is an experienced journalist, skilled communicator, and constant learner with a passion for storytelling and a track record of crafting compelling narratives. He has a diverse background in broadcast journalism, AI, public relations, data science, and social media management.
Posted on 09/25/2024
In News

The U.S. Commission on Civil Rights released its comprehensive report, “The Civil Rights Implications of the Federal Use of Facial Recognition Technology (FRT).” The report addresses growing concerns about the federal government’s use of FRT and makes several recommendations for regulatory oversight and best practices to safeguard civil rights.


Facial Recognition Technology, a tool that uses biometric data to identify individuals, has gained widespread use across both private and public sectors. However, the Commission found that federal oversight and guidelines for responsible use have not kept pace with the technology’s real-world deployment. The report focuses on how three federal departments—the Department of Justice (DOJ), Department of Homeland Security (DHS), and the Department of Housing and Urban Development (HUD)—are using FRT and how this aligns with civil rights protections. These agencies utilize FRT in ways that directly impact U.S. citizens, including in law enforcement, border security, and public housing. Yet, the Commission noted the absence of clear legal frameworks or regulations to govern this use, raising concerns about potential civil rights violations.


During the Commission’s investigation, it found that FRT is most frequently used by the DOJ within the FBI and U.S. Marshals Service for generating leads in criminal investigations. DHS, in contrast, utilizes FRT for broader purposes, including supporting national security and public safety by integrating biometric systems into airports, seaports, and land border checkpoints. Meanwhile, HUD integrates FRT into public housing surveillance systems, which raises particular concerns about privacy and discrimination given the demographic makeup of public housing tenants, who are disproportionately people of color and women.


One of the report’s most significant findings is the absence of federal laws expressly regulating the use of FRT by government agencies. As a result, there is no legal oversight of its implementation, despite concerns about accuracy and bias. In particular, the Commission found that certain demographic groups, especially Black women, people of East Asian descent, and older adults, are more likely to be misidentified by FRT. Misidentifications can have grave consequences, including wrongful arrests, unwarranted surveillance, and discrimination, especially in law enforcement contexts.


Rochelle Garza, Chair of the U.S. Commission on Civil Rights, emphasized the urgency of addressing these risks: “Unregulated use of facial recognition technology poses significant risks to civil rights, especially for marginalized groups who have historically borne the brunt of discriminatory practices. As we work to develop AI policies, we must ensure that facial recognition technology is rigorously tested for fairness.”


The report also highlighted a concerning lack of transparency in how FRT is used by federal agencies. Although agencies like the DOJ have introduced interim policies, such as not using FRT as the sole basis for arrest or prosecution, the Commission found limited public access to data about how often FRT is employed, which crimes it is used to investigate, or how accurate these systems are in practice.


As part of its recommendations, the Commission called on Congress to direct the National Institute of Standards and Technology (NIST) to develop operational testing protocols that assess the effectiveness and fairness of FRT systems in real-world conditions. This would ensure that the technology’s performance across different demographic groups is accurately measured and that disparities can be addressed. The Commission also recommended that FRT vendors provide ongoing training and support to ensure these systems remain accurate and fair in deployment.
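
To make the idea of measuring disparities concrete, here is a minimal illustrative sketch, not taken from the Commission’s report or from any NIST protocol, of how per-group error rates might be compared in an operational test. The group labels, data, and function name are hypothetical and exist only for demonstration.

```python
# Illustrative sketch only: comparing face-matching error rates across
# demographic groups. All data below is invented for demonstration.
from collections import defaultdict

# Hypothetical per-comparison results: (group, is_same_person, system_said_match)
results = [
    ("group_a", True, True), ("group_a", True, False), ("group_a", False, False),
    ("group_a", False, True), ("group_b", True, True), ("group_b", True, True),
    ("group_b", False, False), ("group_b", False, False),
]

def error_rates_by_group(rows):
    """Compute false match rate (FMR) and false non-match rate (FNMR) per group."""
    tallies = defaultdict(lambda: {"fm": 0, "imp": 0, "fnm": 0, "gen": 0})
    for group, same_person, predicted_match in rows:
        t = tallies[group]
        if same_person:
            t["gen"] += 1                     # genuine comparison
            t["fnm"] += not predicted_match   # missed a true match
        else:
            t["imp"] += 1                     # impostor comparison
            t["fm"] += predicted_match        # accepted a false match
    return {
        g: {"FMR": t["fm"] / t["imp"], "FNMR": t["fnm"] / t["gen"]}
        for g, t in tallies.items()
    }

if __name__ == "__main__":
    for group, r in sorted(error_rates_by_group(results).items()):
        print(f"{group}: FMR={r['FMR']:.2f}, FNMR={r['FNMR']:.2f}")
    # A large gap between groups on either metric is the kind of disparity
    # that operational testing would be designed to surface.
```

In practice, such testing would use far larger datasets and the deployed system’s own match thresholds; the point of the sketch is simply that “fairness in real-world conditions” means comparing concrete error rates across groups rather than reporting a single overall accuracy figure.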


Mondaire Jones, a Commissioner with the U.S. Commission on Civil Rights, described the report as a crucial step forward: “This report addresses concerns about accuracy, oversight, transparency, discrimination, and access to justice. We are excited to provide guidance for the U.S. Government to meet this moment of immense technological potential with due consideration for civil rights.”


In addition to accuracy concerns, the Commission raised alarms about the broader civil rights implications of FRT, particularly the potential for discrimination in federally funded public housing. If HUD grantees use FRT systems that are prone to misidentification, it could lead to discriminatory eviction practices or limit access to housing, in violation of Title VI of the Civil Rights Act of 1964.


The Commission’s report underscores the need for greater regulatory oversight, including the establishment of Chief AI Officers across federal agencies to ensure that FRT systems are tested in real-world scenarios and subjected to ongoing evaluation. Additionally, the Commission called for statutory mechanisms to allow individuals to seek redress if they are harmed by the misuse of FRT.


Need Help?


If you have questions or concerns about global AI reports, guidelines, regulations, and laws, don’t hesitate to reach out to BABL AI. Their Audit Experts can offer valuable insight and ensure you stay informed and compliant.

