What Needs to Be Anonymized Besides License Plates? A Visual Data Protection Checklist

Łukasz Bonczol
Published: 1/13/2026
Updated: 3/10/2026

Visual data anonymization means transforming images or videos so that individuals cannot be identified by any means reasonably likely to be used. Under GDPR and UK GDPR, data is no longer personal when identification is not possible, taking into account costs, time, and available technology [1]. In publishing scenarios, anonymization typically relies on face blurring and license plate blurring, but many other visual cues can still identify a person. This article focuses on what else commonly requires attention before publishing photos or videos.

A black-and-white photo showing a dozen or so American license plates probably attached to a wall

Why license plate blurring alone is not enough

Plates are only one of several common identifiers. In practice, people are often recognised by facial features, unique clothing, tattoos, workplace badges, or the combination of location, time, and context. Regulators regularly note that identifiability is not limited to names or numbers. It includes any information relating to an identified or identifiable natural person, including by reference to physical, physiological, genetic, mental, economic, cultural, or social identity [1]. A robust approach looks beyond vehicles to the broader visual scene.

In a snowy setting, a Moskvitch car stands with suitcases on the roof, a black-and-white photo with the license plates anonymized

A visual data protection checklist for photos and videos

The checklist below covers typical visual elements that can reveal identity, ordered from most to least likely to enable identification in common publishing use cases. Organisations often work through these items as part of a standard compliance review when preparing media for websites, reports, social channels, or public information portals.

  1. Faces and heads. Apply face blurring to frontal and profile views. Consider partial faces in mirrors or glass reflections.
  2. Tattoos, scars, and birthmarks. These can be strong identifiers. Targeted masking beyond faces is often required.
  3. Work badges and ID cards worn visibly. Blur names, photos, and barcodes/QR codes on lanyards and badges captured in frame.
  4. Clothing with names or numbers. Team jerseys, personalised jackets, school uniforms with names, or high-visibility vests with printed identifiers should be obfuscated.
  5. Body shape and gait. Distinctive physique or gait patterns can be identifying (especially in small communities or niche contexts). Cropping or full-body blurring may be needed where recognition risk is high.
  6. House numbers and doorbells when linked to people. A clear house number with a resident present or obviously associated can contribute to identification.
  7. Vehicle features beyond plates. Company vehicle markings, call signs, unit numbers, or unique stickers can identify a driver when combined with time and location.
  8. Screens or wearables that show names or messages. Smartwatch notifications, phone screens, or in-car displays can reveal identity or private information. Cropping often works best.
  9. Children’s features and school insignia. For minors, more conservative masking of faces, name tags, and recognisable school crests is common practice.
  10. Sensitive context. Images of individuals entering a clinic, religious venue, or union office can reveal or strongly suggest special category data by context. In such cases, masking the person (and sometimes other contextual cues) is typically considered.
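As a rough illustration (not legal advice), the checklist above can be encoded as a prioritised lookup so a review tool can sort detected elements by identification risk before masking. All element labels and function names here are hypothetical, chosen only to mirror the list above.

```python
# Hypothetical sketch: map the checklist items to masking priorities so a
# review tool can sort detected elements by identification risk.
# Priorities follow the ordering in the checklist above (1 = highest risk).

CHECKLIST_PRIORITY = {
    "face": 1,
    "tattoo_or_scar": 2,
    "badge_or_id": 3,
    "named_clothing": 4,
    "body_or_gait": 5,
    "house_number": 6,
    "vehicle_marking": 7,
    "screen_or_wearable": 8,
    "child_feature": 9,
    "sensitive_context": 10,
}

def masking_plan(detected_elements):
    """Return detected elements sorted from most to least identifying.

    Unknown element types sort last so they surface for manual review
    rather than being silently dropped.
    """
    return sorted(
        detected_elements,
        key=lambda e: CHECKLIST_PRIORITY.get(e, len(CHECKLIST_PRIORITY) + 1),
    )

# Example: a frame where a plate marking, a face, and a work badge were detected.
plan = masking_plan(["vehicle_marking", "face", "badge_or_id"])
```

In a real tool the priorities would be tuned to the publishing context (for example, raising "body_or_gait" in small-community footage), but the principle of triaging beyond faces and plates stays the same.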

Public places do not create a free-to-publish zone. If a person is identifiable, GDPR or UK GDPR applies. A contextual risk assessment is still needed before posting.


Three limited exceptions where blurring may be unnecessary

There are narrow cases where anonymization may not be required. These do not remove the need for assessment and safeguards:

  1. Household exemption. Processing by a natural person in the course of a purely personal or household activity is outside GDPR scope [1, Art. 2(2)(c)].
  2. Journalistic, academic, artistic, or literary purposes. Member States provide exemptions to reconcile data protection with freedom of expression and information [1, Art. 85]. In the UK this is implemented through the Data Protection Act 2018 special purposes provisions [3].
  3. No identification is reasonably likely. If individuals are not identifiable by any means reasonably likely to be used, GDPR does not apply [1, Recital 26]. This requires a careful, context-dependent assessment.

Law enforcement processing by competent authorities is governed by a separate regime (in the EU: Law Enforcement Directive as implemented in national law; in the UK: DPA 2018 Part 3) and is not covered here. For business and public-sector publishing, the above three are the most relevant boundary cases.

In the black-and-white photo, you can see a shiny McLaren 720S car against the backdrop of Norwegian houses.

Applying technology: automation, review, and on-premise software

Automation helps scale visual data anonymization. Tools can detect faces and license plates, then apply masking or blurring. Accuracy and cost depend on scene complexity, camera angles, lighting, and model quality. A human review step is still common practice for high-risk publications or where sensitive contexts appear.
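To make the masking step concrete, here is a minimal, library-free sketch: once a detector has produced bounding boxes (detection itself is out of scope here), each region is pixelated by replacing it with block averages. Real pipelines operate on image arrays via a vision library; this toy version works on a grayscale grid of ints and is only meant to show the idea.

```python
def pixelate_region(image, box, block=2):
    """Pixelate a rectangular region of a grayscale image (list of lists).

    box is (top, left, bottom, right), half-open. Each block-by-block tile
    inside the box is replaced by its average value, destroying detail
    while keeping rough brightness, similar to mosaic blurring.
    """
    top, left, bottom, right = box
    for y0 in range(top, bottom, block):
        for x0 in range(left, right, block):
            ys = range(y0, min(y0 + block, bottom))
            xs = range(x0, min(x0 + block, right))
            vals = [image[y][x] for y in ys for x in xs]
            avg = sum(vals) // len(vals)
            for y in ys:
                for x in xs:
                    image[y][x] = avg
    return image

# Toy 4x4 "image"; pixelate the top-left 2x2 region as one block.
img = [[10, 20, 3, 4],
       [30, 40, 3, 4],
       [5, 5, 5, 5],
       [5, 5, 5, 5]]
pixelate_region(img, (0, 0, 2, 2), block=2)
# The masked tile now holds the average (10 + 20 + 30 + 40) // 4 = 25.
```

Note that light pixelation or blurring can sometimes be reversed or matched against reference images, which is why production tools tend to use aggressive block sizes or solid fills for high-risk identifiers.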

On-premise software is often preferred where footage is sensitive or where transfer to third-party servers is restricted by policy. Keeping processing on controlled infrastructure can reduce data transfer and vendor exposure risk. Where cloud is used, publishers typically consider data location, retention, access controls, and logging.

Check out Gallio PRO for automated face blurring and license plate blurring with deployment options aligned to enterprise governance.

A black-and-white wide-angle photo, with a white car in the center, and a rugged landscape in the background: meadows and a mountain range

Operational checklist before publishing

  1. Define purpose and legal basis. Common bases include legitimate interests; consent may be appropriate in some contexts depending on the relationship, setting, and expectations.
  2. Identify all visual identifiers. Use the checklist above to scope beyond license plates.
  3. Choose techniques. Combine face blurring, license plate blurring, and targeted masking for tattoos, badges, or house numbers.
  4. Decide processing location. Prefer on-premise software where transfer risks are unacceptable.
  5. Add quality control. Use a second review for minors or sensitive contexts.
  6. Record decisions. Keep an audit trail of what was masked and why, including any residual risk justification.

Teams seeking a hands-on evaluation can Download a demo to test workloads on representative footage.

black and white photo of the back of a Dodge pickup truck, the license plates have been anonymized

GDPR and UK GDPR elements relevant to publishing images

| Topic | EU GDPR | UK GDPR + DPA 2018 |
| --- | --- | --- |
| Scope for identifiable images | Personal data if a person can be identified directly or indirectly [1, Art. 4(1)] | Equivalent definition retained in UK law |
| Household exemption | Outside scope for purely personal activities [1, Art. 2(2)(c)] | Equivalent exemption |
| Freedom of expression exemptions | Member States must reconcile via Article 85 [1] | Implemented through DPA 2018 special purposes provisions [3] |
| Video surveillance guidance | EDPB Guidelines 3/2019 on processing of personal data through video devices [2] | ICO guidance on video surveillance/CCTV and information rights [4] |
| Children's images | Children merit specific protection; apply a risk-based approach | ICO guidance highlights additional protections for children [5] |

For implementation questions or tailored deployment options, Contact us.

In the black-and-white photo, a Porsche car can be seen driving, and the license plate has been anonymized and blurred.

Common pitfalls to avoid

A common misconception is that capturing footage in a public place removes obligations; it does not, so long as people are identifiable. Another frequent issue is masking only faces while leaving tattoos or name-printed clothing visible. Finally, do not rely solely on manual review at scale: combining automated detection with a documented review step produces more consistent outcomes.

question mark, like graffiti on a wall, black-and-white photo

FAQ - What Needs to Be Anonymized Besides License Plates?

Is face blurring always required?

Not always. If individuals are not identifiable due to distance or image quality, GDPR may not apply. This is context dependent and should be documented against Recital 26 criteria.

What about company logos in photos?

Logos are not personal data by themselves. If a logo is strongly linked to a person in a specific frame (for example, a named staff member in a small local business), consider whether other identifiers make the person identifiable. Otherwise, masking logos is usually a brand, confidentiality, or contractual issue rather than a data protection requirement.

Do uniforms need masking?

Uniforms with names, numbers, or unique local markings can identify individuals. Masking those parts is a common compliance approach, while generic uniforms without identifiers often do not require masking on that basis alone.

How to handle minors in event photos?

Use conservative masking of faces and name tags unless an appropriate legal basis and safeguards are in place. Review images for school insignia and predictable re-identification through context.

Is consent better than anonymization?

Consent can be valid in some settings, but it must be freely given, specific, informed, and unambiguous, and it must be as easy to withdraw as to give. For broad public dissemination, consent can be hard to manage. Anonymization reduces identifiability and can lower compliance risk, but both approaches should align with purpose and context.

Can automated tools miss a face?

Yes. Detection accuracy is context dependent. Low light, angles, reflections, and occlusions can cause misses. A documented human review step is recommended for high-risk publications.
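One common mitigation for missed detections is to route low-confidence detections, and frames with no detections at all, to human review rather than publishing automatically. A hypothetical sketch of that triage rule (the threshold and the empty-frame policy are design choices, not fixed rules):

```python
def triage_frame(detections, auto_threshold=0.90):
    """Decide whether a frame can be auto-masked or needs human review.

    detections: list of (label, confidence) pairs from a detector.
    Frames with no detections, or with any detection below the threshold,
    go to review, since both misses and weak hits are plausible there.
    """
    if not detections:
        return "review"  # an empty frame may hide a missed face
    if any(conf < auto_threshold for _, conf in detections):
        return "review"
    return "auto"

decision = triage_frame([("face", 0.97), ("license_plate", 0.95)])
```

In practice the empty-frame rule is often relaxed by sampling (reviewing a fraction of empty frames) once the detector's miss rate on representative footage is known.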

Should audio be altered too?

This article covers visual data anonymization. Where videos include audio, a separate assessment is needed for voices or spoken information that can identify someone.

References

  [1] Regulation (EU) 2016/679 (General Data Protection Regulation), including Articles 2, 4 and 85 and Recital 26.
  [2] European Data Protection Board, Guidelines 3/2019 on processing of personal data through video devices.
  [3] UK Data Protection Act 2018, including special purposes provisions related to journalism, academia, art and literature.
  [4] UK Information Commissioner’s Office, guidance on video surveillance/CCTV and information rights.
  [5] UK Information Commissioner’s Office, Guide to the UK GDPR - Children and the UK GDPR.