GDPR Risk Matrix for Video Recordings: How to Assess and Reduce Your Organization’s Legal Exposure

Łukasz Bonczol
Published: 4/13/2026

Visual data anonymization is a set of measures applied before photos and video recordings are published or reused, designed to reduce the likelihood that the people visible in them can be identified. In practice, this most often means face blurring and license plate blurring. For marketing, PR, public-sector, and compliance teams, this is not just a technical issue; it is a practical tool for reducing legal, organizational, and reputational risk.

When it comes to visual content, simply asking “should we publish this?” is too broad. A more useful question in an internal audit is: what is the level of legal exposure for this specific material, for this specific processing purpose, and for this specific category of recorded individuals? That is exactly what the GDPR risk matrix below is designed to answer.

The GDPR treats an image as personal data when a person can be identified directly or indirectly [1]. In video content, the risk increases because identification often results not only from the face itself, but also from context: the location, time, event, clothing, license plates, or the description accompanying the publication. That is why organizations often separate the assessment of whether publication is lawful from the assessment of the level of risk exposure.

With regard to faces, the need for an appropriate legal basis for processing and, where applicable, consent to publish a person’s likeness stems in practice not only from the GDPR, but also from civil law and copyright provisions [6][7]. There are, however, three exceptions to the consent requirement for publishing a likeness [7]: the person is widely known and the image was captured in connection with the performance of their public role; the person’s likeness is merely a detail of a larger whole, such as a gathering, landscape, or public event; or the person received the agreed payment for posing and did not expressly stipulate otherwise. Even where one of these exceptions applies, organizations typically still assess reputational risk and the risk of excessive publication.

For license plates, the situation is more complex. There is no general EU-wide rule requiring them to be blurred in every case, and in Poland the issue also depends on context. On the one hand, the practice of the Polish supervisory authority and the broader European approach to identifiability favor caution; on the other hand, administrative court rulings have at times found that a license plate alone does not always constitute personal data. For an auditor, this leads to one practical conclusion: for open publication, especially online, license plate blurring is usually a worthwhile risk-reduction measure even where the legal classification remains context-dependent.

Audit Framework: Two Dimensions of Risk Assessment

The proposed GDPR risk matrix is based on two dimensions. The first is the purpose of processing, and the second is the category of individuals being recorded. This structure is practical because it allows a DPO or internal auditor to assess material before publication without creating an overly complex compliance worksheet.

The first dimension, the purpose of processing, can be divided into four levels:

  1. low-exposure internal purpose - internal training, quality review of footage, technical documentation with restricted access;
  2. informational purpose with moderate exposure - event coverage, institutional communication, press material;
  3. promotional purpose with elevated exposure - marketing, employer branding, social media, advertising;
  4. monitoring or evidentiary purpose with high exposure in case of secondary publication - CCTV, incident recordings, materials later used outside their original purpose.

The second dimension, the category of recorded individuals, can also be grouped into four levels:

  1. public figures or participants in a large event shown as part of a broader scene;
  2. employees, contractors, speakers, and individuals knowingly participating in the recording;
  3. customers, citizens dealing with public authorities, passers-by, and bystanders who may be identifiable;
  4. individuals requiring heightened caution - children, patients, people in sensitive situations, and participants in crisis events.
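
Both dimensions translate naturally into a small data structure if a DPO wants to embed the classification in an internal checklist or script. The following is a minimal, purely illustrative sketch in Python; the names `ProcessingPurpose` and `SubjectCategory` are assumptions, not taken from any specific tool.

```python
from enum import Enum


class ProcessingPurpose(Enum):
    """The four purpose-of-processing levels used in the matrix."""
    INTERNAL_LOW_EXPOSURE = 1   # internal training, quality review, restricted documentation
    INFORMATIONAL = 2           # event coverage, institutional communication, press material
    PROMOTIONAL = 3             # marketing, employer branding, social media, advertising
    MONITORING_EVIDENTIARY = 4  # CCTV, incident recordings, risk of secondary publication


class SubjectCategory(Enum):
    """The four categories of recorded individuals, ordered by increasing caution."""
    PUBLIC_FIGURE_OR_BROAD_SCENE = 1  # public figures, people shown as part of a larger scene
    AWARE_PARTICIPANT = 2             # employees, contractors, speakers, knowing participants
    CUSTOMER_CITIZEN_BYSTANDER = 3    # customers, citizens, passers-by, bystanders
    HEIGHTENED_CAUTION = 4            # children, patients, sensitive or crisis situations
```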

GDPR Risk Matrix for Publishing Video and Photos

| Processing purpose × category of individuals | Risk level | Typical audit assessment | Recommended action |
| --- | --- | --- | --- |
| Low-exposure internal purpose × public figures or broader scene | Low | Limited audience and low likelihood of harm | Review retention, restrict access, external publication usually not recommended without an additional legal basis |
| Low-exposure internal purpose × employees or individuals aware of the recording | Low to moderate | Risk depends on whether participation was truly voluntary and information was transparent | Document the purpose, minimize scope, apply face blurring if reused outside the original purpose |
| Low-exposure internal purpose × customers, citizens, or bystanders | Moderate | Identification is realistic and expectations of privacy are higher | Face blurring as a standard, license plate blurring where vehicles are visible, access control |
| Low-exposure internal purpose × individuals requiring heightened caution | High | Even without external publication, the risk of infringement remains | Strong data minimization, restricted use, and often no need for an identifiable image at all |
| Informational purpose × public figures or broader scene | Low to moderate | Publication is more often permissible, but the assessment depends on framing and captions | Select shots carefully, avoid close-ups, assess whether the person remains part of the whole scene |
| Informational purpose × employees or individuals aware of the recording | Moderate | Risk increases on open platforms | Verify the legal basis for using the person’s likeness, apply face blurring where identification is unnecessary |
| Informational purpose × customers, citizens, or bystanders | High | This is a frequent point of dispute in event coverage and public-sector materials | Default to face blurring, blur license plates, and crop footage to reduce identifiability |
| Informational purpose × individuals requiring heightened caution | Very high | Publication requires special care and may be disproportionate | As a rule, no identifiable image should be published |
| Promotional purpose × public figures or broader scene | Moderate | Marketing increases the intensity of interference | Careful shot selection, avoid default identification of specific individuals |
| Promotional purpose × employees or individuals aware of the recording | High | In relationships of dependence, consent may be challenged as not freely given [3] | Assess proportionality, use alternative shots, apply face blurring where recognition is not necessary |
| Promotional purpose × customers, citizens, or bystanders | Very high | The most common scenario requiring anonymization before publication | Default to face blurring and license plate blurring, plus manual review of other identifying elements |
| Promotional purpose × individuals requiring heightened caution | Critical | Publishing identifiable material will usually create unacceptable legal exposure | As a rule, refrain from publication or apply full visual anonymization |
| Monitoring or evidentiary purpose with secondary publication × all categories except a broad crowd scene | High to critical | Secondary use outside the original purpose significantly increases risk [1] | Assess purpose compatibility, determine the need for anonymization, and often carry out a fuller DPIA review |
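
Teams that keep the matrix in an internal script or worksheet sometimes encode it as a simple lookup so that the same classification always yields the same recommendation. The sketch below is purely illustrative: it encodes only a handful of the rows above, and the names `RISK_MATRIX`, `MatrixEntry`, and `lookup_risk` are assumptions rather than part of any published tool.

```python
from typing import NamedTuple


class MatrixEntry(NamedTuple):
    risk_level: str
    recommended_action: str


# Partial encoding of the matrix above, keyed by (processing purpose, subject category).
# Only a few combinations are shown; the remaining rows follow the same pattern.
RISK_MATRIX: dict[tuple[str, str], MatrixEntry] = {
    ("internal", "public_or_broad_scene"): MatrixEntry(
        "low", "review retention, restrict access, avoid external publication"),
    ("informational", "customers_citizens_bystanders"): MatrixEntry(
        "high", "default to face blurring, blur license plates, crop to reduce identifiability"),
    ("promotional", "customers_citizens_bystanders"): MatrixEntry(
        "very high", "face and license plate blurring plus manual review of other identifiers"),
    ("promotional", "heightened_caution"): MatrixEntry(
        "critical", "refrain from publication or apply full visual anonymization"),
}


def lookup_risk(purpose: str, category: str) -> MatrixEntry:
    """Return the risk level and recommended action for one purpose/category pair."""
    return RISK_MATRIX[(purpose, category)]
```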

How to Use the Risk Matrix in an Internal Audit

The most practical approach consists of four steps. First, assign the material its dominant processing purpose. Second, identify the highest-risk category of individuals visible in the material. Third, read the risk level from the matrix. Fourth, assign the mandatory risk-mitigation measures. This ensures the assessment does not end with the abstract statement that “the material contains personal data,” but leads to a specific publication decision.
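
The same four steps can be expressed as a short helper function. The sketch below assumes a matrix structured like the lookup shown earlier and is only an illustration of the decision flow, not a reference implementation.

```python
def assess_material(
    purpose: str,
    visible_categories: list[str],
    matrix: dict[tuple[str, str], tuple[str, str]],
) -> tuple[str, str]:
    """Return (risk level, recommended action) for a single piece of material.

    Step 1: the reviewer passes in the dominant processing purpose.
    Step 2: the most sensitive category visible in the material drives the assessment.
    Step 3: the risk level is read from the matrix.
    Step 4: the recommended action becomes the mandatory mitigation measure.
    """
    # Caution order of the second dimension, from lowest to highest.
    caution_order = [
        "public_or_broad_scene",
        "aware_participants",
        "customers_citizens_bystanders",
        "heightened_caution",
    ]
    highest_risk_category = max(visible_categories, key=caution_order.index)
    return matrix[(purpose, highest_risk_category)]
```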

In practice, risk mitigation measures operate on three levels. The first is shot selection. The second is visual data anonymization. The third is control of the publication process, including source file retention, file access, and documentation of the rationale. If an organization uses a solution such as Gallio PRO, it can implement a standardized workflow for photo and video materials without moving the analysis into a streaming system. This matters because the solution does not perform real-time anonymization or video stream anonymization; instead, it supports preparing content before publication.
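
Because the third level is essentially documentation, some teams record the outcome of each assessment as a small structured entry kept alongside the published file. A minimal sketch of such a record follows; all field names are hypothetical and simply mirror the three levels described above.

```python
from dataclasses import dataclass, field
from datetime import date


@dataclass
class PublicationDecision:
    """One documented publication decision for a photo or video asset."""
    asset_id: str
    processing_purpose: str           # why the material exists and how it will be used
    subject_categories: list[str]     # categories of identifiable individuals in the frame
    risk_level: str                   # value read from the risk matrix
    anonymization_applied: list[str]  # level 2: e.g. "face blurring", "license plate blurring"
    source_file_location: str         # level 3: where the unblurred original is retained
    access_restricted_to: list[str]   # level 3: who may open the source file
    rationale: str                    # level 3: why publication was approved
    decided_on: date = field(default_factory=date.today)
```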

For material intended for external publication, face blurring is the most commonly used safeguard. If vehicles appear in the frame, organizations often add license plate blurring, especially for public and cross-border publication. This approach is consistent with a cautious compliance practice, even if the legal treatment of license plates in Poland remains context-dependent.

However, the tool’s capabilities should be described precisely. Gallio PRO automatically blurs only faces and license plates. It does not automatically detect company logos, tattoos, name badges, documents, or images shown on monitor screens. These elements can be blurred manually in the editor built into the software. For an auditor, this has evidential significance. If the risk of identification stems from something other than a face or a license plate, the automated process alone is not enough.
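
One way to make that limitation operational is a short manual-review checklist that the reviewer confirms after automatic blurring has run. This is a hypothetical sketch of such a checklist, not a feature of the software.

```python
# Elements that automatic face and license plate detection does not cover
# and that therefore need a manual pass in the editor before publication.
MANUAL_REVIEW_CHECKLIST = [
    "company logos or branded uniforms",
    "tattoos or other distinctive markings",
    "name badges and ID cards",
    "documents visible in the frame",
    "content shown on monitor screens",
]


def manual_review_complete(confirmed_items: set[str]) -> bool:
    """Publication should wait until every checklist item has been confirmed."""
    return all(item in confirmed_items for item in MANUAL_REVIEW_CHECKLIST)
```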

It is also worth noting that the software does not blur entire bodies and does not store logs containing detection data, personal data, or special category data. From a risk-reduction perspective, this is beneficial because it limits the creation of additional processing artifacts. If you want to validate the workflow using your own materials, you can try the demo.

When Should the Matrix Lead to a DPIA or an Implementation Consultation?

Not every item of content requires an in-depth analysis. However, if the matrix indicates a very high or critical risk level, organizations often move to a broader data protection impact assessment, especially where publication is large-scale, involves bystanders, or concerns material originally collected for another purpose, such as CCTV footage. This is the natural point at which to refer to internal compliance documentation or a DPIA worksheet.
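
The escalation rule can be stated as a small predicate. The thresholds below simply mirror the matrix; the function itself is only an illustration of how a team might wire the matrix output into its DPIA worksheet, and the extra flags reflect the aggravating factors named above.

```python
# Risk levels from the matrix that usually lead to a fuller DPIA review.
ESCALATION_LEVELS = {"very high", "critical", "high to critical"}


def dpia_recommended(risk_level: str,
                     large_scale: bool = False,
                     involves_bystanders: bool = False,
                     repurposed_footage: bool = False) -> bool:
    """Return True when the matrix outcome should be escalated to a DPIA.

    The three flags capture the aggravating factors named in the text
    (large-scale publication, bystanders, footage collected for another
    purpose); treating any of them as an independent trigger on top of a
    "high" rating is a deliberately conservative assumption, not a legal rule.
    """
    if risk_level.strip().lower() in ESCALATION_LEVELS:
        return True
    return risk_level.strip().lower() == "high" and (
        large_scale or involves_bystanders or repurposed_footage
    )
```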

This also applies to technical scenarios such as deploying on-premise software, processing large volumes of content, or combining anonymization procedures with a publication approval process. In such cases, it is worth reaching out to the team and clarifying the implementation model, the scope of permissions, and the way organizational measures should be documented.

Most Common Mistakes in Risk Assessment for Photos and Recordings

The first mistake is assuming that consent solves the entire problem. In employer-employee relationships or in interactions between public authorities and citizens, voluntariness may be assessed strictly [3]. The second mistake is equating a broad crowd scene with unlimited freedom to publish. A facial close-up or a caption attached to the material can change the assessment. The third mistake is overlooking license plates when publication is public and easily indexed by search engines. The fourth mistake is assuming that an automated tool will detect every identifying element. In practice, automatic detection covers only faces and license plates.

FAQ - GDPR Risk Matrix for Video Recordings

Does every recording with a visible face require anonymization before publication?

Not always. The assessment depends on the purpose of processing, the context of the shot, and the legal basis for using the person’s likeness. The three exceptions mentioned above still apply: a widely known person, a detail of a larger whole, and agreed payment for posing [7]. Even so, organizations often continue to assess whether publication is proportionate.

Are license plates always personal data?

That depends on the jurisdiction and the context. In Poland, the issue is not entirely settled, because the supervisory authority’s practice supports caution, while court decisions have also presented a different view. That is why license plate blurring is a common risk-reduction measure for open publication.

How does the matrix help a DPO during an audit?

It allows the material to be assigned to a specific combination of processing purpose and category of individuals, and then links that classification directly to appropriate risk-mitigation measures. This speeds up the decision on whether visual anonymization is sufficient or whether a broader analysis is needed.

Will an automated tool also detect tattoos, ID badges, and monitor screens?

No. Automatic detection covers only faces and license plates. Tattoos, logos, nameplates, documents, and images displayed on monitors require separate assessment and, where necessary, manual blurring.

Can video anonymization be done in real time?

The workflow described here concerns preparing material before publication. Gallio PRO does not perform real-time anonymization or live video stream anonymization.

Does the absence of detection logs matter for compliance?

Yes. From a data minimization perspective, this has organizational relevance. Gallio PRO does not collect logs containing face or license plate detection data, nor logs containing personal data or special category data.

References

  1. Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 - General Data Protection Regulation.
  2. European Data Protection Board, Guidelines 3/2019 on processing of personal data through video devices.
  3. European Data Protection Board, Guidelines 05/2020 on consent under Regulation 2016/679.
  4. Polish Personal Data Protection Office (UODO), materials and guidance on image rights and video surveillance.
  5. Information Commissioner’s Office, guidance on video surveillance and personal data, and guidance on lawful basis and consent.
  6. Act of 23 April 1964 - Civil Code.
  7. Act of 4 February 1994 on Copyright and Related Rights.