Biometric Obfuscation Attacks

A biometric obfuscation attack conceals a person’s true identity with an object, facial hair, makeup, or even plastic surgery

Problem Overview and Definition

Biometric obfuscation is a malicious technique that allows a culprit to conceal their true identity from a biometric recognition system, whether it relies on facial recognition, voice recognition, or other modalities. For this reason, surveys of antispoofing types, countermeasures, and challenges classify it as a potential attack tool.

Obfuscation attacks are close to Presentation Attacks (PAs), as they can employ similar tricks and tools: masks, digital alterations, makeup, and so on. However, PAs aim to impersonate a specific target and achieve a false acceptance, whereas obfuscation attacks are carried out to conceal the attacker’s own identity.

J. Dillinger is reported to be the first culprit who successfully used surgical obfuscation to conceal his fingerprints and face

In facial antispoofing, obfuscation attacks involve falsifying, disguising, or even removing original biometric traits to bypass an antispoofing system. The attack tools vary, from simple facial occlusions like sunglasses to full facial plastic surgery. Face morphing can also technically be classified as an obfuscation attack, as it enables a culprit to hide behind someone else’s likeness and be successfully authenticated at Automatic Border Control (ABC) gates, among other scenarios.

Difference between Presentation and Obfuscation attack methods

Obfuscation attacks have been attempted for almost a century, if not longer. In 1933, Theodore Klutas, the head of the College Kidnappers, unsuccessfully tried to file down his fingerprints. Later, the practice was adopted by other Depression-era criminals: John Dillinger, Alvin Karpis, Fred Barker, and others. Recent notable cases include Orland Park’s bank robbery, a series of betting shop robberies in London, the "Immigrant Fingerprints" incident, and more.

Alvin Karpis demonstrating his fingers with surgically removed fingerprints

Datasets

Two facial datasets have been developed to help counter the obfuscation threat.

SiW-M

The SiW-M, or Spoof in the Wild database with Multiple Attack Types, is the most extensive dataset focused on obfuscation and impersonation. It contains a rich collection of genuine and spoofed videos featuring 165 volunteers, selected with proportionate gender and ethnic diversity in mind. For each participant, 8 genuine and up to 20 spoof videos were recorded. The obfuscation part of the dataset focuses on makeup attacks.

Samples from the SiW-M dataset displaying various poses, facial expressions and capturing methods

HQ-WMCA

The HQ-WMCA, or High-Quality Wide Multi-Channel Attack database, is largely based on the WMCA dataset, with the intent of further boosting its sample quality. It contains 2,904 videos featuring actual attack data at higher resolution and frame rate. Its array of capturing methods includes shortwave infrared (SWIR) spectrum sensors.

A WMCA sample demonstrating an obfuscation attack with face-aging makeup


As for other biometric traits, obfuscation is reported to be extensively used in fingerprint attacks. However, there are no known fingerprint datasets that focus solely on this problem.

Examples of fingerprint alteration: a) Transplanted from feet b) Bitten c) Acid-burned d) Stitched

Types of Obfuscation Attacks

Next, let’s go over some examples of obfuscation attacks and examine how they work.

Extreme Makeup

The impact of makeup on facial recognition has been explored for over a decade. Experts point to Makeup Presentation Attacks, or M-PAs, as a potential threat — with the help of cosmetics, it is possible to change facial shape, conceal unique skin markings, remove wrinkles, and so on.

Example of facial shape alteration through makeup application


Makeup is divided into Light and Heavy types. The former is rarely ill-intended, as it serves as a beautification tool that highlights natural complexion, lip color, and so on. The latter produces a deliberately unnatural look: for example, artistic makeup that drastically changes a person’s appearance.

Example of impersonation through extreme artistic-level makeup

A separate type of makeup can be classified as Camouflaging, with CV Dazzle being the primary example. It was developed specifically to help social protesters evade facial recognition. It has even proven effective against simpler police algorithms that compare a person’s image to a mugshot database, though more robust anti-spoofing solutions, especially those based on passive liveness detection, can be immune to the tactic.

Example of CV Dazzle makeup

Partial Occlusion

Partial occlusions, which refer to visual obstructions of parts of the face, are a common issue in biometrics. They can disrupt recognition of faces, voices, veins, fingerprints, and other traits. The most common occlusive facial items are masks, glasses, phones, hematomas, drinking bottles, and so on. They can either conceal a person's identity or hinder liveness detection.

Example of an 'old man mask' obfuscation attack during the Orland bank robbery

Occlusions are divided into Systematic, Temporary, and Synthetic groups. The first implies that an occlusion is either constant or intensively recurring, such as a beard or pigmentation. The second includes pose variations, a person covering their face with a hand, and so on. The last refers to digital hindrances, such as digital stickers or augmented reality (AR) filters. Extreme illumination levels are also considered a special type of occlusion.

Overly bright photo flash is considered a facial occlusion
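
As a rough illustration of the illumination case, the sketch below flags a face crop as occluded when too many of its pixels are saturated, as would happen with a bright flash. The function name, thresholds, and file path are illustrative assumptions, not part of any published pipeline.

```python
# A minimal sketch: treat extreme illumination as an occlusion signal by
# measuring how much of the face crop is saturated before running
# liveness or identity matching.  Thresholds are illustrative assumptions.
import cv2
import numpy as np

def is_overexposed(face_bgr: np.ndarray, saturation_level: int = 250,
                   max_ratio: float = 0.25) -> bool:
    gray = cv2.cvtColor(face_bgr, cv2.COLOR_BGR2GRAY)
    saturated_ratio = np.mean(gray >= saturation_level)  # fraction of blown-out pixels
    return saturated_ratio > max_ratio

frame = cv2.imread("face_crop.jpg")  # hypothetical pre-cropped face image
if frame is not None and is_overexposed(frame):
    print("Frame rejected: illumination occlusion suspected")
```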

Plastic Surgery

Facial surgical procedures have been in use for malicious purposes since the 1930s — a report titled Plastic Surgery and Crime was issued in 1935. Plastic surgery is divided into two types:

  1. Local surgery. Mostly a benign procedure that aims to remove unaesthetic blemishes — birthmarks, wrinkles, or scars.
  2. Global surgery. This type is resorted to whenever a patient’s appearance needs to be reconstructed, usually after extreme damage to the facial structure. However, the same procedure can be used to completely change a subject’s look.

Developing a surgery-detecting solution is problematic because training material is hard to collect: before-and-after footage of patients is often protected by medical confidentiality.

Narcotrafficker Luiz Carlos da Rocha managed to avoid arrest for 30 years thanks to plastic surgery

Experiments

Obfuscation attacks are harder to detect than regular PAs, as they are subtler in nature: while impersonation PAs tend to change the culprit’s entire face, obfuscation mostly focuses on altering a specific facial region. Pixel-Wise Binary Supervision (PixBiS) has proven to be the best-performing model, as it analyzes color information and is capable of detecting such occlusions as glasses, wigs, and makeup. Its enhanced version, MC-PixBiS + ΔSWIRopt, which incorporates shortwave infrared information, could also successfully detect tattoos.
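
To make the idea of pixel-wise binary supervision concrete, here is a minimal PyTorch sketch in the spirit of PixBiS: every cell of a low-resolution feature map is supervised with the same bona fide/attack label, so localized artifacts such as glasses, wigs, or makeup can surface in the map. The backbone, map size, and loss weighting are simplified assumptions, not the published configuration.

```python
# Minimal pixel-wise binary supervision sketch (PixBiS-style), not the
# exact published architecture.
import torch
import torch.nn as nn

class PixelWiseBinaryNet(nn.Module):
    def __init__(self):
        super().__init__()
        # Small convolutional backbone; the published model uses DenseNet blocks.
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(),
        )
        # 1x1 convolution collapses features into a per-pixel "live vs. spoof" map.
        self.pixel_map = nn.Conv2d(128, 1, kernel_size=1)
        # A single global score derived from the flattened map.
        self.score = nn.Linear(28 * 28, 1)

    def forward(self, x):                                  # x: (B, 3, 224, 224)
        f = self.features(x)                               # (B, 128, 28, 28)
        pmap = torch.sigmoid(self.pixel_map(f))            # (B, 1, 28, 28)
        score = torch.sigmoid(self.score(pmap.flatten(1))) # (B, 1)
        return pmap, score

def pixbis_loss(pmap, score, label):
    """label: float tensor of shape (B,), 1 for bona fide, 0 for attack."""
    bce = nn.BCELoss()
    # Every pixel of the map is supervised with the same binary label, which is
    # what makes local artifacts (glasses, wigs, makeup) visible in the map.
    map_target = label.view(-1, 1, 1, 1).expand_as(pmap)
    return bce(pmap, map_target) + bce(score, label.view(-1, 1))
```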

Performance of various models in obfuscation detection

Evasion and Obfuscation in Speaker Recognition

Automatic Speaker Verification (ASV) is susceptible to obfuscation attacks as well. Two typical method families are prosodic and synthetic manipulations: pitch alteration, use of falsetto, bite blocking, voice muffling, simulated speech impediments, voice conversion, added white noise, and so on. The problem is still poorly explored, and some ASV systems demonstrate Equal Error Rates (EER) of 20-48% against such attacks.
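
The Equal Error Rate quoted above is the operating point at which the false acceptance and false rejection rates coincide. The sketch below computes it from score distributions; the score values are hypothetical and only illustrate how voice disguise, which pulls the target speaker's scores toward the impostor distribution, drives the EER up.

```python
# Minimal EER computation from hypothetical verification scores.
import numpy as np

def equal_error_rate(target_scores, impostor_scores):
    """EER: the threshold where false-accept and false-reject rates meet."""
    thresholds = np.sort(np.concatenate([target_scores, impostor_scores]))
    best_gap, eer = np.inf, 1.0
    for t in thresholds:
        frr = np.mean(target_scores < t)     # target trials rejected
        far = np.mean(impostor_scores >= t)  # impostor trials accepted
        if abs(far - frr) < best_gap:
            best_gap, eer = abs(far - frr), (far + frr) / 2
    return eer

# Hypothetical scores: disguised target-speaker utterances score much lower
# than usual, overlapping the impostor distribution and inflating the EER.
rng = np.random.default_rng(0)
disguised_target = rng.normal(0.55, 0.15, 1000)
impostors = rng.normal(0.40, 0.10, 1000)
print(f"EER under disguise: {equal_error_rate(disguised_target, impostors):.1%}")
```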

Face Morphing and Gender Obfuscation

Face morphing can be used for gender obfuscation, as this study shows. In this case, a male or female face is used as an obfuscator, blending the required gender information into the target face:

  • Male + Female obfuscator = Average female face.
  • Female + Male obfuscator = Average male face.

To achieve better results, the images being morphed should show neutral expressions and be taken against a homogeneous background. Delaunay triangulation is applied to partition the faces into corresponding triangles, which yields higher-quality morphs.
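
A minimal sketch of this morphing scheme is shown below, assuming corresponding facial landmarks for both same-sized, pre-aligned images have already been extracted elsewhere (for instance with dlib). The function and parameter names are illustrative, not taken from the cited study.

```python
# Minimal Delaunay-based face morphing sketch: triangulate the averaged
# landmark grid, warp each triangle from both images onto it, and blend.
import cv2
import numpy as np
from scipy.spatial import Delaunay

def morph_faces(img_a, img_b, pts_a, pts_b, alpha=0.5):
    """Blend two aligned face images via per-triangle affine warps.

    alpha controls how much the obfuscator (img_b) contributes to the morph.
    """
    pts_a, pts_b = np.float32(pts_a), np.float32(pts_b)
    pts_m = (1 - alpha) * pts_a + alpha * pts_b           # morphed landmark grid
    out = np.zeros_like(img_a, dtype=np.float32)
    for tri in Delaunay(pts_m).simplices:                 # triangulate the morph once
        t_a, t_b, t_m = pts_a[tri], pts_b[tri], pts_m[tri]
        # Warp each source triangle onto the morphed geometry.
        warp_a = cv2.warpAffine(img_a, cv2.getAffineTransform(t_a, t_m),
                                (img_a.shape[1], img_a.shape[0]))
        warp_b = cv2.warpAffine(img_b, cv2.getAffineTransform(t_b, t_m),
                                (img_b.shape[1], img_b.shape[0]))
        mask = np.zeros(img_a.shape[:2], dtype=np.uint8)
        cv2.fillConvexPoly(mask, np.int32(t_m), 1)
        blend = (1 - alpha) * warp_a + alpha * warp_b
        out[mask == 1] = blend[mask == 1]
    return np.uint8(out)
```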

Delaunay triangles provide smoother facial morphing

Obfuscation Attacks in Authorship Recognition

Authorship is defined by an author’s unique style: word choice, chord progressions, sentence length, color palette, favored plots, and so on. A few methods exist to overcome this type of obfuscation, so far focusing on linguistics only. Among them are stylometric analysis, synonym-based classifiers, and Artificial Neural Network (ANN) approaches that rely on lexical properties, alternative readability measures, character counts, and other features.
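
As a simple illustration of the lexical features such methods rely on, the sketch below extracts a handful of stylometric statistics from a text. The feature set and the sample sentence are assumptions for demonstration, not the exact features used in the cited approaches.

```python
# Minimal stylometric feature extraction sketch.
import re
from collections import Counter

FUNCTION_WORDS = {"the", "of", "and", "to", "in", "a", "that", "it", "is", "was"}

def stylometric_features(text: str) -> dict:
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[a-zA-Z']+", text.lower())
    counts = Counter(words)
    return {
        # Average sentence length is hard to disguise consistently.
        "avg_sentence_len": len(words) / max(len(sentences), 1),
        "avg_word_len": sum(map(len, words)) / max(len(words), 1),
        # Vocabulary richness (type-token ratio).
        "type_token_ratio": len(counts) / max(len(words), 1),
        # Relative frequency of common function words, a classic stylometric cue.
        "function_word_rate": sum(counts[w] for w in FUNCTION_WORDS) / max(len(words), 1),
        "char_count": len(text),
    }

# Feature vectors like this can be fed to a classical classifier or an ANN
# to compare a disputed text against candidate authors' known writings.
print(stylometric_features("No man is an island, entire of itself."))
```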

The Federalist Papers are a notable example of authorship obfuscation

References

  1. J. Dillinger is reported to be the first culprit who successfully used surgical obfuscation to conceal his fingerprints and face
  2. A Survey on Anti-Spoofing Methods for Facial Recognition with RGB Cameras of Generic Consumer Devices
  3. John Dillinger- Fingerprint Obliteration
  4. Bandit Who Wore Old Man Mask Charged in Orland Bank Heist: FBI
  5. The man in the latex mask: BLACK serial armed robber disguised himself as a WHITE man to rob betting shops
  6. Doctor convicted of surgery to alter immigrant fingerprints
  7. Alvin Karpis demonstrating his fingers with surgically removed fingerprints
  8. SiW: Spoofing in the Wild Database
  9. High-Quality Wide Multi-Channel Attack (HQ-WMCA)
  10. Wide Multi Channel Presentation Attack (WMCA)
  11. Examples of fingerprint alteration
  12. Examples of web-collected images before and after the use of makeup to change facial shape
  13. Example of impersonation through extreme artistic-level makeup
  14. 'Dazzle' makeup won't trick facial recognition. Here’s what experts say will
  15. Can this makeup fool facial recognition?
  16. Overly bright photo flash is considered a facial occlusion
  17. Plastic Surgeon and Crime
  18. Drug cartel boss used facial plastic surgery to avoid police for 30 years before being arrested in Brazil
  19. Deep Models and Shortwave Infrared Information to Detect Face Presentation Attacks
  20. Deep Pixel-wise Binary Supervision for Face Presentation Attack Detection
  21. Evasion And Obfuscation In Speaker Recognition Surveillance And Forensics
  22. Delaunay Triangulation
  23. Gender Obfuscation through Face Morphing
  24. The Federalist Papers by Wikipedia