Daniel Oberhaus writing for Motherboard:
AI can generate fake fingerprints that work as master keys for smartphones that use biometric sensors. According to the researchers that developed the technique, the attack can be launched against individuals with “some probability of success.”
In most cases, spoofing biometric IDs requires making a fake face or finger vein pattern that matches an existing individual. In a paper posted to arXiv earlier this month, however, researchers from New York University and the University of Michigan detailed how they trained a machine learning algorithm to generate fake fingerprints that can serve as a match for a “large number” of real fingerprints stored in databases.
Known as DeepMasterPrints, these artificially generated fingerprints are similar to the master key for a building. To create a master fingerprint the researchers fed an artificial neural network—a type of computing architecture loosely modeled on the human brain that “learns” based on input data—the real fingerprints from over 6,000 individuals. Although the researchers were not the first to consider creating master fingerprints, they were the first to use a machine learning algorithm to create working master prints.
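The paper's actual pipeline (a GAN whose latent space is searched with an evolutionary method) is more involved, but the core idea, searching for one template that partially matches many enrolled prints at once, can be sketched with a toy model. Everything below is hypothetical: fingerprints are stand-in binary feature vectors, the match threshold is invented, and random mutation stands in for the GAN-plus-evolution step.

```python
import random

random.seed(0)
N_FEATURES = 64          # toy "fingerprint" = 64 binary features (hypothetical)
MATCH_THRESHOLD = 0.78   # fraction of features that must agree to count as a match

def similarity(a, b):
    return sum(x == y for x, y in zip(a, b)) / len(a)

def random_print():
    return [random.randint(0, 1) for _ in range(N_FEATURES)]

def mutate(print_, rate):
    return [bit if random.random() > rate else 1 - bit for bit in print_]

# Enrolled database: real prints cluster loosely around a few common
# patterns -- the partial-overlap structure that makes master prints possible.
archetypes = [random_print() for _ in range(5)]
database = [mutate(random.choice(archetypes), 0.08) for _ in range(500)]

def match_count(candidate):
    return sum(similarity(candidate, p) >= MATCH_THRESHOLD for p in database)

# Random mutation of enrolled prints stands in for the paper's latent search:
# generate candidates, keep whichever matches the most stored prints.
best = max((mutate(random.choice(database), 0.05) for _ in range(300)),
           key=match_count)
print(f"best candidate matches {match_count(best)} of {len(database)} prints")
```

The point of the toy is only that a single candidate can land near a common pattern and thereby match a large slice of the database, which is what makes the "master key" framing apt.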
It’s a scary technology but…
At the highest security setting, the researchers note that the master print is “not very good” at spoofing the sensor—it fooled the sensor less than 1.2 percent of the time.
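One way to put that per-attempt figure in context: phones typically allow several tries before locking out, so the chance of at least one success compounds. A quick back-of-the-envelope, assuming (hypothetically) that attempts are independent at the article's 1.2 percent rate:

```python
# If one master-print attempt fools the sensor with probability p,
# the chance that at least one of n attempts succeeds is 1 - (1 - p)^n.
p = 0.012  # "less than 1.2 percent" per attempt, per the article
for n in (1, 5, 10):
    print(f"{n} attempts: {1 - (1 - p) ** n:.1%}")
```

Even with five tries, the compounded odds stay in the single digits at this security level, which supports the researchers' "not very good" assessment.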
So there’s nothing to be scared of yet, but it’s interesting nonetheless.