As I detailed in this space, face recognition is potentially dangerous because people can be recognized from a distance and also online through posted photographs. That’s a potentially privacy-violating combination: Take a picture of someone in public from 50 yards away, then run that photo through online face-recognition services to find out who they are and get their home address, phone number and a list of their relatives. It takes a couple of minutes, and anybody can do it for free. This already exists.
Major Silicon Valley companies such as Facebook and Google routinely scan the faces in hundreds of billions of photos and allow any user to identify or “tag” family and friends without permission of the person tagged.
In general, people should be far more concerned about face-recognition technologies than any other kind.
It’s important to understand that other technologies, processes or applications are almost always used in tandem with face recognition. That is also true of Apple’s iPhone X.
For example, Face ID won’t unlock an iPhone unless the user’s eyes are open. That’s not because the system can’t recognize a person whose eyes are closed. It can. The reason is that A.I. capable of figuring out whether eyes are open or closed is separate from the system that matches the face of the authorized user with the face of the current user. Apple deliberately chose to disable Face ID unlocking when the eyes are closed to prevent unauthorized phone unlocking by somebody holding the phone in front of a sleeping or unconscious authorized user.
Apple also uses this eye detector to prevent sleep mode on the phone during active use, and that feature has nothing to do with recognizing the user (it will work for anyone using the phone).
In other words, the ability to authorize a user and the ability to know whether a person’s eyes are open are completely separate and unrelated abilities that use the same hardware.
Which brings us back to the point of controversy: Is Apple allowing app developers to violate user privacy by sharing face data?
Critics lament Apple’s policy of enabling third-party developers to receive face data harvested by the TrueDepth image sensors. Developers gain that access in their apps through Apple’s ARKit framework and the specific new face-related tools it provides.
Those tools let developers build apps that can track the position of the face, the direction of the lighting falling on it and the user’s facial expression.
The purpose of this policy is to let developers create apps that can place goofy glasses on a user’s face (or fashionable glasses to try on at an online eyewear store’s website), along with any number of other apps that react to head motion and facial expression. Characters in multiplayer games will appear to frown, smile and talk in an instant reflection of the players’ actual facial activity. Smiling while texting may result in the option to post a smiley face emoji.
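To make the controversy concrete, here is a minimal Swift sketch of what that developer access looks like. It assumes ARKit’s face-tracking API (ARFaceAnchor and its blendShapes dictionary, as documented by Apple); the smile threshold and the delegate class name are illustrative, not from the article.

```swift
import ARKit
import SceneKit

// Illustrative sketch: an ARKit delegate that receives the face data
// Apple exposes to third-party apps on TrueDepth-equipped iPhones.
class FaceTrackingDelegate: NSObject, ARSCNViewDelegate {

    func renderer(_ renderer: SCNSceneRenderer,
                  didUpdate node: SCNNode,
                  for anchor: ARAnchor) {
        // Only face anchors carry face-tracking data.
        guard let faceAnchor = anchor as? ARFaceAnchor else { return }

        // Position and orientation of the face in world coordinates --
        // this is what lets an app pin virtual glasses to the head.
        let facePosition = faceAnchor.transform

        // Blend shapes describe the expression as dozens of coefficients
        // between 0.0 and 1.0 (jaw open, brow raised, smile, and so on).
        if let smile = faceAnchor.blendShapes[.mouthSmileLeft]?.floatValue,
           smile > 0.5 {
            // React to a smile, e.g. offer a smiley face emoji.
        }
        _ = facePosition
    }
}
```

Note what the API does and does not expose: expression coefficients and face geometry, not the mathematical identity signature Face ID uses for authentication, which stays in the Secure Enclave.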