Biometric Obfuscation Attacks
Problem Overview & Definition
Obfuscation is a malicious technique that conceals a culprit’s true identity from a biometric recognition system. In essence, obfuscation is close to Presentation Attacks (PAs), as it can employ similar tricks and tools: masks, digital alterations, makeup, and so on. The key difference is that PAs aim to impersonate a specific target and achieve a false acceptance, whereas obfuscation aims to avoid recognition altogether.
Obfuscation attacks involve falsifying, disguising, or even removing original biometric traits to bypass an anti-spoofing system. The attack tools vary from simple facial occlusions, such as sunglasses, to plastic surgery. Face morphing can partly be classified as an obfuscation attack as well, since it lets a culprit hide behind someone else’s likeness and be successfully authenticated, for example at Automatic Border Control (ABC) gates.
Obfuscation attacks have been known since at least 1933, when Theodore Klutas, head of the College Kidnappers, unsuccessfully tried to file down his fingerprints. The practice was later adopted by other Depression-era criminals: John Dillinger, Alvin Karpis, Fred Barker, and others. Notable modern cases include a bank robbery in Orland Park, a series of betting-shop robberies in London, the "Immigrant Fingerprints" incident, and so forth.
Datasets
Two facial datasets were developed to help counter the obfuscation threat.
SiW-M
SiW-M, or Spoof in the Wild database with Multiple Attack Types, is the most extensive dataset covering both obfuscation and impersonation. It contains a rich collection of genuine and spoof videos featuring 165 volunteers. For each participant, 8 genuine and up to 20 spoof videos were recorded, with proportionate gender and ethnic diversity in mind. The obfuscation part of the dataset focuses on makeup attacks.
HQ-WMCA
HQ-WMCA, or High-Quality Wide Multi-Channel Attack database, is largely based on the WMCA dataset and further improves its sample quality. It contains 2,904 bona fide and attack videos with higher resolution and frame rate. The capture setup includes shortwave infrared (SWIR) sensors, among other channels.
As for other biometric traits, obfuscation is reported to be widely used in fingerprint attacks. However, no known fingerprint dataset focuses solely on the problem.
Types of Obfuscation Attacks
The following obfuscation attack types are commonly distinguished.
Extreme Makeup
The effect of makeup on facial recognition has been studied for over a decade. Experts point to Makeup Presentation Attacks (M-PAs) as a potential threat: with the help of cosmetics, it is possible to change facial shape, conceal signature skin blemishes, remove wrinkles, and so on.
Makeup is divided into Light and Heavy types. The former is rarely ill-intended, as it serves as a beautification tool that highlights natural complexion, lip color, and the like. The latter looks aggressive or unnatural. Artistic makeup can also be classed as heavy, since it drastically changes a person’s appearance.
A separate type of makeup can be classified as Camouflaging, with CV Dazzle being the primary example. It was purposely designed to help protesters evade facial recognition. However, it is reported to be effective only against simpler police algorithms that compare a person’s image to a mugshot database; more robust anti-spoofing solutions can be immune to the tactic.
Partial Occlusion
Partial occlusions, i.e. visual obstructions, are a common issue in biometrics. They can disrupt recognition of faces, voices, veins, fingerprints, and other traits. Common occlusive facial items include masks, glasses, phones, hematomas, drinking bottles, and so on.
They are divided into Systematic, Temporary, and Synthetic groups. The first implies an occlusion that is either constant or intensively recurring, such as a beard or pigmentation. The second includes pose variations, a person covering their face with a hand, and similar cases. The last refers to digital hindrances, such as digital stickers or augmented reality (AR) filters. Illumination levels are also considered a special type of occlusion.
Plastic Surgery
Facial surgical procedures have been in use for malicious purposes since the 1930s — a report titled Plastic Surgery and Crime was issued in 1935. Plastic surgery is divided into two types:
- Local surgery. A mostly benign procedure that aims to remove unaesthetic blemishes: birthmarks, wrinkles, or scars.
- Global surgery. This type is resorted to whenever a patient’s appearance must be reconstructed after severe functional damage. At the same time, the same procedures can be used to completely change a subject’s look.
Developing a surgery-detecting solution is hindered by the difficulty of collecting training material: before-and-after footage of patients is protected by medical confidentiality.
Experiments
Experiments confirm that obfuscation attacks are harder to detect than regular PAs, as they are more insidious in nature. While impersonation PAs tend to change the entire face of a culprit, obfuscation mostly targets a specific facial region. Pixel-Wise Binary Supervision (PixBiS) proved to be the best model, as it analyzes color information and can detect occlusions such as glasses, wigs, and makeup. Its enhanced version, MC-PixBiS + ΔSWIR_opt, could also detect tattoos.
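The core idea behind pixel-wise binary supervision is that every cell of the network’s output score map is trained against the same binary label, so a locally altered region (glasses, a wig, makeup) drags its patch scores down even when the rest of the face looks genuine. A minimal NumPy sketch of that loss, not the actual PixBiS implementation (the map size and function names here are illustrative assumptions):

```python
import numpy as np

def pixwise_bce_loss(score_map, is_bonafide, eps=1e-7):
    """PixBiS-style pixel-wise binary supervision loss (illustrative sketch).

    score_map   : HxW array of per-patch liveness scores in (0, 1),
                  e.g. a 14x14 map from the network's last layer.
    is_bonafide : True for a genuine face, False for an attack.

    Every pixel is supervised with the same binary label, so occluded
    or altered regions produce locally low scores.
    """
    target = np.ones_like(score_map) if is_bonafide else np.zeros_like(score_map)
    p = np.clip(score_map, eps, 1 - eps)  # avoid log(0)
    pixel_loss = -np.mean(target * np.log(p) + (1 - target) * np.log(1 - p))
    # The frame-level decision score is simply the mean of the map.
    frame_score = float(score_map.mean())
    return pixel_loss, frame_score
```

A genuine face with a uniformly high score map yields a low loss, while the same label with a low map (e.g. a face partly covered by an occluder) is penalized heavily.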
Evasion and Obfuscation in Speaker Recognition
Automatic Speaker Verification (ASV) is susceptible to obfuscation attacks as well. Two typical approaches are prosodic and synthetic manipulations: pitch alteration, use of falsetto, bite blocking, voice muffling, simulated speech impediments, voice conversion, added white noise, and so on. The problem remains poorly explored, and some ASV systems demonstrate a 20-48% Equal Error Rate (EER) under such attacks.
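The EER quoted above is the operating point at which the False Rejection Rate equals the False Acceptance Rate. A small sketch of how it can be estimated from verification scores (a simple threshold sweep; real evaluations typically interpolate the ROC curve):

```python
import numpy as np

def equal_error_rate(genuine_scores, impostor_scores):
    """Estimate the Equal Error Rate (EER) of a verification system.

    Sweeps a decision threshold over all observed scores and returns
    the error rate where the False Rejection Rate (FRR) is closest to
    the False Acceptance Rate (FAR). Higher scores mean "same speaker".
    """
    thresholds = np.sort(np.concatenate([genuine_scores, impostor_scores]))
    best_gap, eer = np.inf, 1.0
    for t in thresholds:
        frr = np.mean(genuine_scores < t)    # genuine trials rejected
        far = np.mean(impostor_scores >= t)  # impostor trials accepted
        if abs(frr - far) < best_gap:
            best_gap, eer = abs(frr - far), (frr + far) / 2
    return eer
```

A perfectly separable system scores an EER of 0; an obfuscation-vulnerable ASV system in the 20-48% range is close to guessing.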
Face Morphing and Gender Obfuscation
Face morphing can be used for gender obfuscation, as one study shows. In this case, a male or female face is used as an obfuscator, which injects the required gender information into the target face:
- Male + Female obfuscator = Average female face.
- Female + Male obfuscator = Average male face.
To achieve better results, the images being morphed should show neutral expressions and be taken against a homogeneous background. Delaunay triangulation is applied to partition the faces into corresponding triangles, which yields higher-quality morphs.
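The first step of such a morph can be sketched as follows: blend the two landmark sets with a morphing weight, then Delaunay-triangulate the blended points. This is only the geometric half of the pipeline, with assumed function names; a full morph would additionally warp each triangle of both source images onto the blended triangle and cross-dissolve the textures.

```python
import numpy as np
from scipy.spatial import Delaunay

def morph_landmarks(src_pts, obf_pts, alpha=0.5):
    """Blend two corresponding facial landmark sets and triangulate.

    src_pts, obf_pts : (N, 2) arrays of corresponding landmarks for
                       the target face and the obfuscator face.
    alpha            : morphing weight; 0.5 gives the "average" face.

    Returns the blended landmarks and the Delaunay triangle indices
    that partition the face into warpable triangles.
    """
    blended = (1 - alpha) * src_pts + alpha * obf_pts
    tri = Delaunay(blended)           # partitions the face into triangles
    return blended, tri.simplices     # (N, 2) points, (M, 3) index triples
```

With alpha = 0.5 and a female obfuscator applied to a male target, the blended landmarks approximate the "average female face" case listed above.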
Obfuscation Attacks in Authorship Recognition
Authorship is defined by unique style: word choice, chord progressions, sentence length, color palette, favored plots, and so on. A few methods exist to counter this type of obfuscation, so far focusing on linguistics only. Among them are stylometric analysis, synonym-based classifiers, and Artificial Neural Network (ANN) approaches that rely on lexical properties, readability measures, character counts, and other features.
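A toy illustration of the kind of lexical features such systems extract, under the assumption of simple regex tokenization (real stylometric pipelines feed hundreds of features into a classifier or an ANN):

```python
import re

def stylometric_features(text):
    """Extract a few toy stylometric features for authorship analysis:
    character count, average sentence length (in words), average word
    length, and type-token ratio (vocabulary richness)."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text.lower())
    return {
        "char_count": len(text),
        "avg_sentence_len": len(words) / max(len(sentences), 1),
        "avg_word_len": sum(map(len, words)) / max(len(words), 1),
        "type_token_ratio": len(set(words)) / max(len(words), 1),
    }
```

An obfuscating author tries to shift exactly these statistics away from their natural profile, which is why robust systems combine many feature families rather than rely on any single one.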
References
- J. Dillinger is reported to be the first culprit who successfully used surgical obfuscation to conceal his fingerprints and face
- A Survey on Anti-Spoofing Methods for Facial Recognition with RGB Cameras of Generic Consumer Devices
- John Dillinger - Fingerprint Obliteration
- Bandit Who Wore Old Man Mask Charged in Orland Bank Heist: FBI
- The man in the latex mask: BLACK serial armed robber disguised himself as a WHITE man to rob betting shops
- Doctor convicted of surgery to alter immigrant fingerprints
- Alvin Karpis demonstrating his fingers with surgically removed fingerprints
- SiW: Spoofing in the Wild Database
- High-Quality Wide Multi-Channel Attack (HQ-WMCA)
- Wide Multi Channel Presentation Attack (WMCA)
- Examples of fingerprint alteration
- Examples of web-collected images before and after the use of makeup to change facial shape
- Example of impersonation through extreme artistic-level makeup
- 'Dazzle' makeup won't trick facial recognition. Here’s what experts say will
- Can this makeup fool facial recognition?
- Overly bright photo flash is considered a facial occlusion
- Plastic Surgery and Crime
- Drug cartel boss used facial plastic surgery to avoid police for 30 years before being arrested in Brazil
- Deep Models and Shortwave Infrared Information to Detect Face Presentation Attacks
- Deep Pixel-wise Binary Supervision for Face Presentation Attack Detection
- Evasion And Obfuscation In Speaker Recognition Surveillance And Forensics
- Delaunay Triangulation
- Gender Obfuscation through Face Morphing
- Practical Attacks Against Authorship Recognition Techniques
- The Federalist Papers by Wikipedia