Definition
Masking and blurring are obfuscation techniques used for image and video anonymization by intentionally distorting selected areas, such as faces and license plates. In a normative context, they are forms of visual data obfuscation described in ISO/IEC 20889:2018 as methods that hinder the identification of individuals or objects by degrading visual information. The most commonly used techniques include Gaussian blur, pixelation, mosaic effects, and solid covering masks.
In the context of the GDPR, masking and blurring serve as data minimization and privacy protection measures when publishing or sharing video and photographic materials. For obfuscation to be effective, it must cover all frames in which the object appears and apply a level of degradation sufficient to reduce the risk of re-identification to an acceptable level, in line with the privacy by design principle. Automation requires object detection, which in practice relies on deep learning models trained on labeled datasets and then applied to locate regions that need to be masked.
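The obfuscation techniques listed above can be illustrated with a minimal sketch. This is not the implementation used by any particular product: it shows pixelation (mosaic) and a solid covering mask in plain NumPy; in practice a Gaussian blur would typically come from a library such as OpenCV (`cv2.GaussianBlur`).

```python
import numpy as np

def pixelate(region: np.ndarray, block: int = 16) -> np.ndarray:
    """Mosaic effect: replace each block x block tile with its mean color."""
    h, w = region.shape[:2]
    out = region.copy()
    for y in range(0, h, block):
        for x in range(0, w, block):
            tile = out[y:y + block, x:x + block]
            # Broadcasting the (1, 1, 3) mean fills the whole tile.
            out[y:y + block, x:x + block] = tile.mean(axis=(0, 1), keepdims=True)
    return out

def solid_mask(region: np.ndarray, color=(0, 0, 0)) -> np.ndarray:
    """Solid covering mask: overwrite the region with a flat color."""
    out = np.empty_like(region)
    out[:] = color
    return out

# Example: anonymize a 64x64 patch of a synthetic frame.
frame = np.random.randint(0, 256, (128, 128, 3), dtype=np.uint8)
frame[10:74, 10:74] = pixelate(frame[10:74, 10:74], block=16)
```

The mosaic block size directly controls how much visual information survives: larger blocks discard more detail but also reduce the usability of the material.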
The Role of Masking and Blurring in Image and Video Anonymization
Masking and blurring are essential when publishing or sharing CCTV footage, journalistic content, training materials, or research recordings. Faces and license plates generally constitute personal data under the GDPR if they enable the identification of a person, especially when combined with other information.
- Faces: The obligation to anonymize arises from the GDPR and from Article 81 of the Polish Copyright and Related Rights Act. Exceptions under Article 81 include public figures, images captured as part of a broader scene, and situations where the individual has received agreed compensation for posing.
- License plates: Data protection authorities in many EU countries indicate that a license plate identifies a vehicle and may indirectly identify a person, which justifies anonymization when materials are disseminated. EDPB Guidelines 3/2019 on video devices emphasize the need to limit identifiability when disclosing recordings. In Poland, this issue is sometimes disputed, and administrative court rulings are not uniform as to whether license plates qualify as personal data. In practice, data controllers often blur license plates to comply with the data minimization principle.
In Gallio PRO, automation applies exclusively to faces and license plates. The software does not blur human silhouettes and does not perform real-time anonymization. Elements such as logos, tattoos, or documents can be masked manually in the editor. The solution operates on-premise and does not collect logs containing personal data or information that would allow the identification of detected objects.
Masking and Blurring Technologies
Effective obfuscation requires reliable detection, tracking, and the application of an appropriate filter to the region of interest. Below is an outline of a standard processing pipeline.
- Object detection: Deep learning-based algorithms (e.g., convolutional neural networks for faces and license plates). Models are trained on labeled datasets and run offline in inference mode on CPU or GPU.
- Temporal tracking: Maintaining object identity across frames, which reduces mask flickering and gaps caused by partial occlusions.
- ROI expansion: Enlarging the bounding box with a margin to compensate for positioning differences and detector error.
- Obfuscation filters: Gaussian blur, pixelation, mosaic effects, solid fills, and elliptical or rectangular masks with soft edges.
- Export and audit: Saving the obfuscated material along with a technical process log, without storing personal data.
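The ROI-expansion and filtering steps of the pipeline above can be sketched as follows. The detector itself is omitted (in practice it would be a trained CNN as described); the function names and the mosaic filter choice are illustrative assumptions.

```python
import numpy as np

def expand_roi(box, margin: float, frame_shape):
    """Enlarge a (x, y, w, h) box by a percentage margin, clipped to the frame."""
    x, y, w, h = box
    dx, dy = int(w * margin), int(h * margin)
    fh, fw = frame_shape[:2]
    x0, y0 = max(0, x - dx), max(0, y - dy)
    x1, y1 = min(fw, x + w + dx), min(fh, y + h + dy)
    return x0, y0, x1 - x0, y1 - y0

def obfuscate_frame(frame, detections, margin=0.15, block=16):
    """Apply a mosaic filter to every detected region, expanded by `margin`."""
    out = frame.copy()
    for box in detections:
        x, y, w, h = expand_roi(box, margin, frame.shape)
        roi = out[y:y + h, x:x + w]
        for yy in range(0, h, block):
            for xx in range(0, w, block):
                tile = roi[yy:yy + block, xx:xx + block]
                roi[yy:yy + block, xx:xx + block] = tile.mean(axis=(0, 1), keepdims=True)
    return out
```

The margin compensates for detector jitter between frames; combined with temporal tracking, it reduces the risk that a face briefly escapes the mask at the edge of a bounding box.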
Key Parameters and Metrics
Parameter selection affects both the level of privacy protection and the usability of the material. It is recommended to measure both detection quality and the effectiveness of preventing identification. Below are selected parameters and metrics commonly used in practice.
| Parameter / Metric | Description | Unit / Notes |
|---|---|---|
| Detection threshold | Minimum model confidence required to accept an object | [0-1] |
| IoU for matching | Intersection over Union threshold for evaluation and tracking | typically 0.5 or 0.75 |
| Detection recall | Fraction of actual objects successfully detected | referenced to COCO/PASCAL metrics |
| Detection precision | Fraction of detections that are true objects | unitless |
| Blur kernel size | Size of the Gaussian filter | px, e.g. 21×21 |
| Gaussian sigma | Blur intensity | px |
| Mosaic pixel size | Side length of a pixelation block | px |
| ROI margin | Percentage expansion of the mask relative to detection | % |
| Re-identification risk R | Fraction of successful identifications after obfuscation by a strong recognition algorithm | R = successes / attempts |
| Anonymization strength S | Measure defined as 1 − R | S = 1 − R |
| Latency | Processing delay per frame | ms/frame |
| Throughput | Processing throughput | fps |
| QA defect rate | Percentage of frames requiring manual correction | % |
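Several of these metrics are straightforward to compute. The sketch below (illustrative, using a simple greedy matcher rather than the full COCO evaluation protocol) implements IoU, precision/recall at a given IoU threshold, and the anonymization strength S = 1 − R.

```python
def iou(a, b):
    """Intersection over Union of two (x, y, w, h) boxes."""
    iw = max(0, min(a[0] + a[2], b[0] + b[2]) - max(a[0], b[0]))
    ih = max(0, min(a[1] + a[3], b[1] + b[3]) - max(a[1], b[1]))
    inter = iw * ih
    union = a[2] * a[3] + b[2] * b[3] - inter
    return inter / union if union else 0.0

def detection_metrics(gt, pred, iou_thr=0.5):
    """Greedy one-to-one matching at the IoU threshold; returns (precision, recall)."""
    matched, tp = set(), 0
    for p in pred:
        best, best_i = None, 0.0
        for i, g in enumerate(gt):
            if i in matched:
                continue
            v = iou(p, g)
            if v >= iou_thr and v > best_i:
                best, best_i = i, v
        if best is not None:
            matched.add(best)
            tp += 1
    precision = tp / len(pred) if pred else 1.0
    recall = tp / len(gt) if gt else 1.0
    return precision, recall

def anonymization_strength(successes: int, attempts: int) -> float:
    """S = 1 - R, where R = successful re-identifications / attempts."""
    return 1.0 - successes / attempts
```

In an anonymization context, recall is the safety-critical metric: every missed detection is an unmasked face or plate, whereas a false positive merely blurs an extra region.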
Challenges and Limitations
Obfuscation involves a trade-off between privacy protection and the analytical value of the material; technical and legal risks must also be taken into account.
- Missed detections: Extreme angles, low contrast, occlusions, or small object scale.
- Tracking issues: Loss of an object between frames can create gaps in masks.
- Reversibility: Weak blurring or insufficient pixelation may fail to protect against recognition by modern face recognition algorithms.
- Legal uncertainty: Divergent interpretations regarding the status of license plates as personal data in Poland versus the practices of authorities in other EU countries.
- Usability: Excessive obfuscation reduces the evidential or training value of the material.
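Because weak settings can be reversed by modern recognizers, one common mitigation is to scale the filter strength with the size of the detected region instead of using a fixed value. A minimal sketch of this idea; the 1/8 ratio and the 8 px floor are illustrative assumptions, not normative thresholds, and should be validated by measuring R on the target material:

```python
def min_mosaic_block(roi_width: int, roi_height: int, ratio: float = 1 / 8) -> int:
    """Heuristic (assumption, not a standard): scale the pixelation block with
    the region size, so that roughly 1/ratio blocks span the shorter side of
    the face or plate regardless of its resolution. A floor of 8 px guards
    against near-ineffective settings on small regions."""
    return max(8, int(min(roi_width, roi_height) * ratio))
```

With a fixed block size, a high-resolution face would retain far more detail after pixelation than a small one; tying the block size to the region size keeps the residual information roughly constant.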
Use Cases
Masking and blurring techniques are applied across a range of offline (non-real-time) image and video processing workflows.
- Publishing CCTV or bodycam footage after incidents, with bystanders and license plates blurred.
- Sharing recordings with media outlets or local communities while anonymizing people and vehicles.
- Training materials and e-learning content with participant image protection.
- Research and sharing of public-space video datasets with face anonymization.
Normative References and Sources
This list includes legal acts and standards that define concepts, requirements, and best practices related to image processing and anonymization.
- GDPR - Regulation (EU) 2016/679 of 27 April 2016, Articles 5 and 25, and recitals on identifiability.
- EDPB, Guidelines 3/2019 on processing personal data through video devices, version 2.0 of 29 January 2020.
- ISO/IEC 20889:2018 - Privacy enhancing data de-identification terminology and classification.
- ISO/IEC 29100:2011 - Privacy framework.
- Polish Copyright and Related Rights Act, Article 81 - consent for image dissemination and exceptions.
- CNIL, Practical guidance on publishing images and identifying data in visual content - classification of license plates as personal data.
- AEPD, Guide on the use of video cameras for security and other purposes, updated versions - classification of license plates as personal data.
- NIST, Face Recognition Vendor Test (FRVT) - materials on evaluating face recognition algorithms; for re-identification testing, NIST FRVT publications and selected NISTIR reports may be referenced (numbering depends on the specific edition).