Now that the iPhone X is shipping, owners of Apple’s especially expensive flagship phone can finally impress their friends by unlocking the device just by looking at it. Billed as having a one-in-a-million chance of making a mistake, facial recognition was the future of digital security, at least according to the marketing folks at Apple. It didn’t take long for someone in the security industry to turn that into a joke. Wired reported yesterday that a Vietnamese security company tricked the new unlocking mechanism with a cheap 3D-printed mask.
Over the past two decades, biometric recognition has transitioned from spy-movie futuristic tech to an everyday utility. Biometric passports allow us to jump queues at airports. Personal devices, such as mobile phones, now come with state-of-the-art fingerprint scanners for unlocking. Law enforcement agencies all over the world use facial recognition to look for suspects and identify threats. The tech is so easily available today that some drive-in restaurants, such as Wow Bao in the US, have started using face recognition to match payments and authorise take-out delivery. Biometric signatures are billed as the solution to people forgetting usernames and passwords. But the tech is not fool-proof, especially the consumer-grade kits that are emerging everywhere, and the dangerous trend of overusing flawed tech makes me fear potential disasters.
For example, HSBC launched Voice ID this year, using biometric features to authorise access to people’s accounts. The system builds a unique voice print from “100 behavioural and physical vocal traits”, and the bank claims it can recognise you even when you have a cold. Apart from the fact that my own family can rarely recognise my voice when I have a cold, as a professional software developer I am very sceptical of any bold claim such as this. It might be fine for a party trick, but I want my money protected by something stronger, thank you very much.
There are several key problems with this. The first is that biometric data is not unique; it just identifies people with a very low probability of mix-ups. People sound alike, and look alike. And of course, it did not take long for someone to discover how to cheat the system. Dan Simmons, a journalist working for BBC Click, managed to fool Voice ID in May 2017 with his twin brother Joe. All hundred voice characteristics were no match for a simple edge case: a twin mimicking his brother’s voice.
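A one-in-a-million false-accept rate sounds reassuring, but it compounds quickly at scale. Here is a back-of-the-envelope sketch in Python; the false-accept rate comes from Apple’s marketing claim, while the ten-million-person impostor pool is my own illustrative assumption:

```python
# Back-of-the-envelope: what "one in a million" means at scale.

far = 1e-6               # claimed false-accept rate per comparison
population = 10_000_000  # assumed pool of potential impostors

# Expected number of strangers whose face would unlock your phone
expected_impostors = far * population

# Probability that at least one such stranger exists:
# 1 - (1 - far) ** population
p_at_least_one = 1 - (1 - far) ** population

print(f"Expected accidental matches: {expected_impostors:.0f}")
print(f"Chance of at least one match: {p_at_least_one:.4f}")
```

With ten expected look-alikes in a city-sized population, “one in a million” is a statement about a single comparison, not about your safety among millions of other faces.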
The second big problem with such magic is that biometric data is an identifier, not an authorisation mechanism. Biometric information is a username, not a password. Even without finding a way to copy or clone biometry, it is much easier to force someone to give up their biometric data than a piece of secret information. Yes, you may be able to unlock the phone just by looking at it, but so can someone else who simply holds the device up to your face. Privacy advocates have already started raising concerns about border agents or police taking people’s phones and unlocking them without the owner’s permission, but my worries go much further.
As biometric readers become more ubiquitous in personal devices to prevent theft, thieves do not really need to go hi-tech to defeat them. They can just go medieval. For example, Mr K Kumaran in Malaysia protected his US$75,000 Mercedes with a fingerprint scanner. In 2005, four attackers chopped off his finger with a machete and happily drove away.
The third big problem is that it’s pretty difficult to prevent unwanted collection of face or voice data. Minority Report-style shopping systems are popping up that can identify and track people as they enter. The tech is not there yet, but in a few years it’s safe to expect that your facial biometric data will be captured as frequently as you now appear on CCTV. And then the key question becomes how much you trust every shop owner you have ever visited to actually keep that data safe, given that a cheap 3D-printed mask can unlock your phone.
Take, for example, Wow Bao, the innovative restaurant chain using facial prints to match customers to payments and orders. Their tech provider filed for bankruptcy, raising worries that whoever purchases the failing company’s assets will also pick up sensitive biometric data about millions of customers, and prompting a class-action lawsuit by consumers in the US.
The fourth big problem with biometric signatures is that they are very difficult to replace when something goes wrong. If someone clones your key, you can still change the lock on the door. But when someone gets access to your biometric data, how exactly do you go about changing it? And the more valuable the things under biometric lock, the bigger a target they present for hackers. Bank accounts surely fit the bill. The recent Equifax scandal, in which information on 150 million people was stolen, should give pause to anyone who assumes that a big collection of biometric data is completely safe. I’m not looking forward to having to poke one of my eyes out because of negligent fools at a credit agency.
Even when biometrics are used as identifiers, not passwords, the technology is not perfect. John Gass from Natick, Massachusetts, USA, got a curious letter from the Massachusetts Registry of Motor Vehicles in April 2011. His driver’s licence had been revoked, effective five days earlier, without any explanation. He immediately phoned the registry, but staff refused to explain why the licence had been revoked, suggesting only that Gass could reinstate it if he could prove his identity. After ten days of phone calls and a legal hearing at the registry, the mystery was finally solved: an automated photo-recognition system had mistakenly matched him against a record in an antiterrorism database.
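Gass was the victim of one-to-many search arithmetic: his photo was compared against every record in a database, so even a very accurate matcher produces false hits routinely. A quick sketch, where both the per-comparison error rate and the database size are my own illustrative assumptions, not figures from the Massachusetts system:

```python
# One-to-many search: a single probe photo is compared against
# every record, so mistakes scale with the database size.

false_match_rate = 1e-7  # assumed per-comparison false-match rate
records = 5_000_000      # assumed number of records searched

# Expected false hits for one search across the whole database
expected_false_hits = false_match_rate * records

# Probability that a search comes back with no false hit at all
p_clean = (1 - false_match_rate) ** records

print(f"Expected false hits per search: {expected_false_hits}")
print(f"Chance a search is completely clean: {p_clean:.2f}")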
Finally, biometric matching mistakes can lead to some quite curious problems. Alicia and Alicen Kennedy from Evans, Georgia, USA, who were repeatedly denied drivers’ licenses by their local Department of Motor Vehicles in 2015. DMV clerk wouldn’t let them take the test after handing in the paperwork. The computer refused Alicia’s photo. She took another picture, then another one, and the computer just kept refusing them. The pair had to sign the application forms over and over. The clerk finally gave up and called the headquarters, and it turned out that a computer was flagging the applications as fraud. In fact, the Kennedy sisters’ only crime was that they were twins, and the DMV’s facial recognition system could not tell the pair apart. The fraud detection systems complained that a single person is trying to apply under two different names.
The sad truth of any AI system is that it is only as good as its training data, and it seems as if nobody is actively training their robots to recognise twins. For those of you with an identical sibling, unfortunately this means that you won’t be able to drive cars or cross borders in the future. Though, on the other hand, you will easily be able to spend your sister’s cash.
For more stories on how tech fails because of bad assumptions, check out my latest book Humans vs Computers.