EU AI Act and Biometrics - New Challenges for Visual Data Anonymization

Mateusz Zimoch
8/5/2025

Visual data anonymization is the process of permanent and irreversible removal of information enabling identification of individuals from graphic and video materials. In the context of the recently adopted EU Artificial Intelligence Act (EU AI Act), this topic gains particular significance, especially regarding the processing of biometric data.

As an expert in personal data protection, I observe that the new regulations introduce numerous restrictions concerning biometric systems, directly affecting how organizations must approach the processing of individuals’ images. Correct application of anonymization techniques is not only best practice but also a legal requirement with financial and reputational consequences.


What Is the EU AI Act and How Does It Affect Biometric Data Processing?

The EU AI Act is a comprehensive European Union regulation governing the use of artificial intelligence systems. It introduces a four-tier risk classification framework, where biometric systems are often classified as high-risk or unacceptable risk.

Particularly important are provisions related to biometric identification, including facial recognition, voice recognition, gait analysis, and other physical traits enabling person identification. According to the new rules, many applications of such systems in public spaces are prohibited except for strictly defined security-related cases.

Organizations processing visual materials must therefore implement effective anonymization methods to avoid unlawful biometric data processing.


Which Biometric Data Are Subject to Special Protection under the EU AI Act?

The EU AI Act builds on the GDPR's existing rules for biometric data. Visual data requiring special treatment include:

  • Facial images enabling person identification
  • Vehicle license plates
  • Distinctive clothing elements or tattoos
  • Retina patterns and fingerprints

Strictly speaking, license plates and clothing are identifying personal data rather than biometric traits, but the new regulations address not only direct identification but also indirect identification through the combination of datasets. Effective anonymization must therefore consider the broader processing context.

How Does the EU AI Act Change the Approach to Anonymization of Surveillance Footage?

Video surveillance is an area where the new regulations introduce significant changes. Surveillance systems equipped with biometric analytics (e.g., automatic facial recognition) are classified as high-risk systems.

This means operators of such systems must:

  • Conduct data protection impact assessments (DPIA)
  • Implement appropriate face and license plate anonymization mechanisms
  • Use on-premise solutions for highly sensitive data processing
  • Regularly test the effectiveness of employed anonymization methods

Organizations using surveillance must revise their procedures to ensure compliance with these new requirements, especially regarding footage storage and sharing.

Which Visual Data Anonymization Methods Comply with the EU AI Act?

The EU AI Act does not specify precise anonymization methods but focuses on outcomes. Compliant methods are those that effectively and permanently prevent person identification. In practice, these include:

  • Face blurring - using face detection algorithms combined with pixelation or Gaussian blur filters. Simple blurring alone may be insufficient against advanced de-anonymization algorithms.
  • License plate masking - similar to face blurring but targeting vehicle identifiers. Effective solutions automatically detect and anonymize plates even on moving footage.
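As a minimal sketch of the pixelation technique described above: the region inside a bounding box is divided into tiles, and each tile is replaced by its mean colour, so the fine detail needed for identification is destroyed. In a real pipeline a face or plate detector would supply the boxes; the coordinates and the `pixelate_region` helper below are illustrative, not a specific product's API.

```python
import numpy as np

def pixelate_region(image: np.ndarray, box: tuple, block: int = 16) -> np.ndarray:
    """Pixelate a rectangular region by averaging block-sized tiles.

    image: H x W x C uint8 array; box: (x, y, w, h) in pixels.
    Returns a new array; the input is left unchanged.
    """
    x, y, w, h = box
    out = image.copy()
    region = out[y:y + h, x:x + w].astype(float)
    for by in range(0, h, block):
        for bx in range(0, w, block):
            tile = region[by:by + block, bx:bx + block]
            tile[...] = tile.mean(axis=(0, 1))  # replace each tile with its mean colour
    out[y:y + h, x:x + w] = region.astype(np.uint8)
    return out

# Example: anonymize a detected face region in a synthetic frame.
frame = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)
anon = pixelate_region(frame, box=(100, 120, 160, 160), block=16)
```

Note that the larger the block size relative to the region, the less residual structure survives; very small blocks keep enough detail to be vulnerable to the de-anonymization attacks mentioned above.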


Why Are On-Premise Solutions Preferred for Biometric Data Anonymization?

Like GDPR, the EU AI Act emphasizes data processing security. In biometric data contexts, on-premise (locally installed) solutions offer several benefits:

  • Sensitive data never leave the organization’s infrastructure, minimizing leak risks. This is crucial given the EU AI Act’s severe sanctions, which may reach €35 million or 7% of global turnover.
  • Organizations retain full control over the anonymization process and can tailor it to specific industry or legal requirements - key for regulated sectors.

An example is Gallio Pro, a visual data anonymization tool that operates fully within the client's infrastructure. Check out Gallio Pro to see how you can protect your data securely under the new requirements.

Can AI Help Achieve EU AI Act-Compliant Anonymization?

Paradoxically, while the EU AI Act regulates AI use, advanced AI algorithms provide the most effective anonymization methods. Machine learning algorithms can:

  • Detect faces and license plates with greater accuracy than traditional methods, even in difficult lighting or partial occlusion - critical for comprehensive anonymization.
  • Automatically track anonymization-required objects throughout video sequences, speeding up the process and reducing human error risk. For large visual datasets, automation becomes essential.
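The tracking idea above can be sketched very simply: when a detector briefly misses an object, the last known box is carried forward for a few frames so the face never flashes unblurred. This carry-forward logic and the `propagate_boxes` name are an illustrative assumption, not a description of any particular product; production trackers use motion models and re-identification on top of this.

```python
def propagate_boxes(frame_detections, max_gap=5):
    """Carry each detection forward for up to max_gap frames after the
    detector last saw it, so brief misses don't leave a region unmasked.

    frame_detections: per-frame list of (x, y, w, h) boxes from a detector.
    Returns a per-frame list of boxes to anonymize.
    """
    to_mask = []
    carried = []  # pairs of [box, frames_since_last_seen]
    for boxes in frame_detections:
        if boxes:
            # Fresh detections replace the carried state.
            carried = [[b, 0] for b in boxes]
        else:
            # Detector missed this frame: age carried boxes, drop stale ones.
            carried = [[b, age + 1] for b, age in carried if age + 1 <= max_gap]
        to_mask.append([b for b, _ in carried])
    return to_mask

# A face detected on frames 0-1 and missed on frames 2-3 stays masked:
dets = [[(10, 10, 40, 40)], [(12, 10, 40, 40)], [], []]
masks = propagate_boxes(dets, max_gap=5)
```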

Implementing such solutions enables organizations to meet EU AI Act requirements while maintaining operational efficiency.


How to Share Visual Materials with Third Parties under the EU AI Act?

Sharing visual data with media, law enforcement, or posting on social media requires special caution under new regulations. Key principles include:

  • Anonymize materials before sharing, unless there is a clear legal basis for disclosing identifiable data
  • Document anonymization and consent processes
  • Use secure data transmission channels
  • Conclude proper data processing agreements when materials contain personal data

Entities publishing visual data, e.g., on YouTube, must remember that responsibility for effective anonymization lies with the data discloser, even if physical processing happens on external platforms.


What Penalties Exist for Non-Compliance with EU AI Act Biometric Provisions?

The EU AI Act imposes a strict sanctions regime, especially on biometric systems. Financial penalties may reach:

  • Up to €35 million or 7% of annual global turnover (whichever is higher) for using prohibited AI systems, including certain biometric identification applications in public spaces
  • Up to €15 million or 3% of annual global turnover for significant breaches of other provisions, including high-risk system requirements - a tier that will often apply to biometric data processors

Besides fines, non-compliant organizations may be required to withdraw systems from the market, causing serious operational disruptions.

How to Prepare for EU AI Act Compliance in Anonymization?

Organizational preparation should include:

  • Auditing existing visual data processing for biometric information usage
  • Deploying appropriate anonymization tools, preferably local (on-premise)
  • Training staff on new requirements and procedures
  • Updating data processing documentation including privacy policies
  • Regularly testing anonymization effectiveness

Consider professional solutions like Gallio Pro for effective anonymization compliant with the new regulations. Download the demo to see how our tool can assist your organization in adapting to the EU AI Act.


FAQ - Frequently Asked Questions About the EU AI Act and Anonymization

When does the EU AI Act come into force and what is the implementation timeline?

The EU AI Act was formally adopted in 2024 and entered into force on 1 August 2024, with a staged implementation: the prohibitions on unacceptable-risk practices (including certain biometric identification uses) apply from February 2025, most other provisions from August 2026, and some high-risk system requirements are phased in through 2027.

Does the EU AI Act only apply to companies headquartered in the EU?

No, like GDPR, the EU AI Act has extraterritorial scope, covering all entities offering products or services in the EU market regardless of location.

Are there exceptions to the ban on biometric identification in public spaces?

Yes, narrow exceptions exist mainly for serious crime investigation, missing persons searches, or terrorism prevention, always requiring appropriate legal basis and oversight.

Is face blurring always sufficient for anonymization?

Not always. Simple blurring might not withstand advanced de-anonymization attacks. The EU AI Act requires effective anonymization, often entailing more advanced techniques.
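As a minimal illustration of a technique that does withstand such attacks: solid-box redaction overwrites every pixel in the region, so no residual signal remains for a de-anonymization model to invert. The helper name and coordinates below are illustrative.

```python
import numpy as np

def redact_region(image: np.ndarray, box: tuple) -> np.ndarray:
    """Overwrite a rectangular region with solid black: no pixel
    information survives, so the result cannot be reconstructed."""
    x, y, w, h = box
    out = image.copy()
    out[y:y + h, x:x + w] = 0  # solid fill instead of reversible smoothing
    return out

frame = np.random.randint(0, 256, (240, 320, 3), dtype=np.uint8)
safe = redact_region(frame, box=(50, 60, 80, 80))
```

The trade-off is aesthetic: solid boxes are more intrusive than blur, which is why many workflows reserve them for the highest-risk footage.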

How does the EU AI Act treat monitoring systems using behavioral analytics (e.g., abnormal behavior detection)?

Such systems are generally classified as high-risk and subject to stringent requirements, including conformity assessments, human oversight, and transparency.

Do historical (archival) materials also require anonymization?

Yes, when actively processed or shared. The EU AI Act does not provide a blanket exception for archival materials, though exemptions may apply for public interest or research purposes.

Where can I find more information about EU AI Act compliance?

We encourage you to contact us - we offer consultations on adapting anonymization processes to EU AI Act and GDPR requirements.


References list

  1. Regulation (EU) 2024/1689 laying down harmonised rules on artificial intelligence (Artificial Intelligence Act)
  2. Regulation (EU) 2016/679 on the protection of natural persons with regard to the processing of personal data (GDPR)
  3. European Data Protection Board, "Guidelines 3/2019 on processing of personal data through video devices", 2020
  4. European Union Agency for Fundamental Rights, "Facial recognition technology: fundamental rights considerations in law enforcement", 2020
  5. Information Commissioner's Office (ICO), "Guide to the General Data Protection Regulation - Biometric data", 2021