Call coming in:
"That fingerprint sensor on your smartphone is not as safe as you think" by Vindu Goel, The New York Times, April 12, 2017
SAN FRANCISCO — Fingerprint sensors have turned modern smartphones into miracles of convenience. A touch of a finger unlocks the phone — no password required. With services like Apple Pay or Android Pay, a fingerprint can buy a bag of groceries, a new laptop, or even a $1 million vintage Aston Martin. And pressing a finger inside a banking app allows the user to pay bills or transfer thousands of dollars.
While such wizardry is convenient, it has also left a gaping security hole.
New findings published Monday by researchers at New York University and Michigan State University suggest that smartphones can easily be fooled by fake fingerprints digitally composed of many common features found in human prints. In computer simulations, the researchers developed a set of artificial “master prints” that could match real prints similar to those used by phones as much as 65 percent of the time.
The researchers did not test their approach with real phones, and other security experts said the match rate would be significantly lower in real-life conditions. Still, the findings raise troubling questions about the effectiveness of fingerprint security on smartphones.
“It’s almost certainly not as worrisome as presented, but it’s almost certainly pretty darn bad,” said Andy Adler, a professor of systems and computer engineering at Carleton University in Canada, who studies biometric security systems.
Full human fingerprints are difficult to falsify, but the finger scanners on phones are so small that they read only partial fingerprints. When a user sets up fingerprint security on an Apple iPhone or a phone that runs Google’s Android software, the phone typically takes eight to 10 images of a finger to make matching easier. And many users enroll more than one finger — say, the thumb and forefinger of each hand.
Since a finger swipe has to match only one stored image to unlock the phone, the system is vulnerable to false matches.
“It’s as if you have 30 passwords and the attacker only has to match one,” said Nasir Memon, a professor of computer science and engineering at NYU’s Tandon School of Engineering, who is one of three authors of the study, which was published in IEEE Transactions on Information Forensics and Security. The other authors are Aditi Roy, a postdoctoral fellow at NYU’s Tandon School, and Arun Ross, a professor of computer science and engineering at Michigan State.
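Memon's "30 passwords" analogy can be made concrete with a little probability. This is an illustrative sketch only, not Apple's or Google's actual matching algorithm: the per-template false-match rate `p` and the template count are assumed numbers, and real matches against one person's templates are not truly independent.

```python
def effective_false_accept(p, k):
    """Probability that a random finger falsely matches at least one of
    k stored partial-print templates, assuming each comparison is an
    independent event with false-match probability p (a simplification)."""
    return 1 - (1 - p) ** k

# Hypothetical numbers: ~10 stored images per finger x 3 enrolled fingers
# = 30 templates, each with an assumed per-template false-match rate.
p_single = 1e-4
print(effective_false_accept(p_single, 30))  # roughly 30x the single-template rate
```

Because a swipe needs to match only one stored template, the effective false-accept rate scales almost linearly with the number of templates when `p` is small — which is exactly why storing many partial prints weakens the system.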
Memon said their findings indicated that if you could somehow create a magic glove with a master print on each finger, you could get into 40 to 50 percent of iPhones within the five tries allowed before the phone demands the numeric password, known as a personal identification number.
That's what they forgot to wear!
If it doesn't fit, you must.... never mind.
Apple said the chance of a false match in the iPhone’s fingerprint system was 1 in 50,000 with one fingerprint enrolled.
That seems pretty high.
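Apple's quoted figure is per enrolled finger, per attempt. A back-of-the-envelope calculation — assuming each try is an independent event, which real sensors are not — shows how enrolling several fingers and the five allowed attempts compound that 1-in-50,000 rate:

```python
def chance_of_false_unlock(p, fingers, attempts):
    """P(at least one false match) across `attempts` tries against
    `fingers` enrolled fingerprints, treating every comparison as
    independent with per-finger false-match probability p."""
    trials = fingers * attempts
    return 1 - (1 - p) ** trials

# Apple's stated rate with one finger enrolled, compounded over
# two enrolled fingers and the five tries allowed before a PIN is required.
p_single = 1 / 50_000
print(chance_of_false_unlock(p_single, fingers=2, attempts=5))
```

With ten independent chances the odds rise to roughly 1 in 5,000 — still small, but an order of magnitude worse than the headline number, and this simple model ignores the master-print effect the researchers describe.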
Ryan James, a company spokesman, said Apple had tested various attacks when developing its Touch ID system, and also incorporated other security features to prevent false matches.
Google declined to comment.
The actual risk is difficult to quantify. Apple and Google keep many details of their fingerprint technology secret.
Stephanie Schuckers, a professor at Clarkson University and director of the Center for Identification Technology Research, was cautious about the findings. She said the researchers used a commercially available software program designed to match full fingerprints, limiting the broader applicability of their findings.
“To really know what the impact would be on a cellphone, you’d have to try it on the cellphone,” she said....
Gee, with the CIA hacking tools out there and available to anybody....