Researchers from New York University have used a neural network to generate artificial fingerprints that work as a “master key” for biometric identification systems, demonstrating that convincing fake fingerprints can be created.
According to a paper presented at a security conference in Los Angeles, the artificially generated fingerprints, dubbed “DeepMasterPrints”, were able to imitate more than one in five fingerprints in a biometric system that should have an error rate of only one in a thousand.
The researchers took advantage of two properties of fingerprint-based authentication systems:
The first is that most fingerprint readers do not image the entire finger at once; instead, they capture whichever part of the finger touches the scanner. Such systems do not stitch the partial images together to compare a full finger against a full record; they simply compare each partial scan against the stored partial records.
The second is that some features of fingerprints are more common than others. That means a fake fingerprint containing many very common features is more likely to match other people’s fingerprints by chance.
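These two properties compound. The short simulation below is a sketch of the reasoning, not a reproduction of the paper’s results: the number of partial templates per finger and the per-template match probabilities are illustrative assumptions. A system that stores several partial templates per enrolled finger gives an attacker one chance per template, and a spoof built from very common ridge features is assumed here to match each template ten times more often than an ordinary spoof.

```python
import random

random.seed(0)

# Illustrative assumptions (not from the paper): each enrolled finger is
# stored as 8 partial templates, and the system accepts a probe if it
# matches ANY of them.  An ordinary spoof matches a random partial template
# with probability 0.001 (the one-in-a-thousand error rate); a spoof built
# from very common features is assumed to match ten times more often.
TEMPLATES_PER_FINGER = 8
TRIALS = 20000

def success_rate(per_template_match_prob):
    hits = 0
    for _ in range(TRIALS):
        # One chance per stored partial template of the target finger.
        if any(random.random() < per_template_match_prob
               for _ in range(TEMPLATES_PER_FINGER)):
            hits += 1
    return hits / TRIALS

ordinary = success_rate(0.001)   # analytically 1 - 0.999**8, about 0.8%
common   = success_rate(0.01)    # analytically 1 - 0.99**8, about 7.7%
print(f"ordinary spoof:       {ordinary:.3%} of fingers matched")
print(f"common-feature spoof: {common:.3%} of fingers matched")
```

Even with these toy numbers, the per-finger success rate ends up far above the nominal one-in-a-thousand figure, which is the gap the researchers exploited.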
Based on this, the researchers used a common machine learning technique, called a generative adversarial network (GAN), to artificially create new fingerprints that matched as many partial fingerprints as possible.
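To sketch the adversarial idea behind a GAN: a generator learns to produce samples, a discriminator learns to tell them apart from real data, and the two are trained against each other until the fakes become hard to reject. The toy example below is not the researchers’ model; the “real data” are just numbers drawn around 4 (standing in for fingerprint images), and both networks are deliberately reduced to single linear maps with hand-derived gradients.

```python
import numpy as np

rng = np.random.default_rng(0)

# Real data: samples from N(4, 0.5), a stand-in for real fingerprints.
def real_samples(n):
    return rng.normal(4.0, 0.5, n)

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

# Generator: x = a*z + b maps noise z ~ N(0, 1) to a sample.
# Discriminator: D(x) = sigmoid(w*x + c) scores how "real" x looks.
a, b = 0.1, 0.0
w, c = 0.1, 0.0
lr, batch = 0.05, 64

for step in range(3000):
    z = rng.normal(0, 1, batch)
    x_fake = a * z + b
    x_real = real_samples(batch)

    # Discriminator step: ascend log D(real) + log(1 - D(fake)).
    d_real = sigmoid(w * x_real + c)
    d_fake = sigmoid(w * x_fake + c)
    grad_w = np.mean(-(1 - d_real) * x_real + d_fake * x_fake)
    grad_c = np.mean(-(1 - d_real) + d_fake)
    w -= lr * grad_w
    c -= lr * grad_c

    # Generator step: ascend log D(fake) (non-saturating generator loss).
    d_fake = sigmoid(w * x_fake + c)
    grad_a = np.mean(-(1 - d_fake) * w * z)
    grad_b = np.mean(-(1 - d_fake) * w)
    a -= lr * grad_a
    b -= lr * grad_b

samples = a * rng.normal(0, 1, 10000) + b
print(f"generated mean {samples.mean():.2f}")
```

After training, the generator’s output drifts toward the real distribution’s mean; the actual DeepMasterPrints work applies the same adversarial setup to images and then searches the generator’s outputs for prints that match many partial records.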