Companies like Apple and Samsung are replacing fingerprint scanners on smartphones and tablets with facial recognition systems. While that makes design sense, does it also make security sense?

What's driving this change is the desire to make premium phones with what is known as an 'edge-to-edge' display. This means the front of the phone is all screen, free of the frame (known as the bezel) around it. However, without bezels there's no place for the fingerprint sensor on the front of the phone. Samsung and others have tried moving it to the back of the phone. On the Galaxy S8 it's right next to the camera lens, which frequently gets smudged as a result. It's also just not as convenient as a sensor on the front of the phone, and with consumer tech security, convenience is everything.

The other possible solution is integrating the sensor into the screen itself. That has turned out to be no simple thing: sensing the fingerprint beneath the glass of the display makes it significantly harder to capture an image of the required quality. Until that issue is solved, companies are turning to facial recognition to get the job done. Does it?

Unfortunately, several problems inherent in both the technology and faces themselves suggest the answer is no.

The first is that, unlike fingerprints, faces change. Whether the cause is age, facial hair, illness, or weight gain doesn't matter – they all make it more difficult for facial recognition to work well.

There's also the issue of how the face is seen. While your facial features are intrinsic properties, the appearance of your face is subject to several factors, including pose (or camera viewpoint), illumination, facial expression, and occlusions (sunglasses or other coverings).
In unconstrained scenarios where face image acquisition is not well controlled, or where subjects may be uncooperative, the factors affecting appearance will confound the performance of face recognition. Moreover, there may be similarities between the face images of different people, especially if they are genetically related. Such similarities further compound the difficulty of recognizing people by their faces.

And this is before you get into the well-documented problems facial recognition has with race and gender. As Joy Buolamwini of MIT and Timnit Gebru of Microsoft found in their research on three commercial software systems: "Darker-skinned females are the most misclassified group (with error rates of up to 34.7%). The maximum error rate for lighter-skinned males is 0.8%." The problem is so persistent that Microsoft is calling for government regulation to deal with it – and when was the last time you heard of a tech company doing that?

Then there's the issue of lighting and smartphone facial recognition. Cameras on the screen side of phones aren't as powerful as those on the back, which makes them more reliant on good lighting to produce a quality image. Backlighting in particular poses a big problem. Apple's iPhone X used special illuminators in its FaceID system to counter this, with varying degrees of success. Some reviewers reported problems using it in direct sunlight but noted that overall it performed better than expected.

Samsung is hoping to improve facial recognition by including a type of iris scanner with its latest devices. The system, named "Intelligent Scan," includes what the company calls Eyeprint Verification. It works by first scanning your face and then moving on to the iris if authentication initially fails. If conditions aren't great for either of those, it combines them to unlock your device.
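The fallback order described above can be sketched in a few lines. This is only an illustrative model of that face-first, iris-second, combined-last flow; the function names and return values are assumptions, not Samsung's actual API.

```python
# Illustrative sketch of an Intelligent Scan-style fallback chain.
# The matcher callables are hypothetical stand-ins for real biometric checks.

def intelligent_scan_unlock(match_face, match_iris, match_combined):
    """Try face first, fall back to iris, then to a combined check."""
    if match_face():
        return "unlocked:face"
    if match_iris():
        return "unlocked:iris"
    if match_combined():
        return "unlocked:combined"
    return "locked"

# Stub matchers simulating poor lighting: the face check fails,
# but the iris check still succeeds.
result = intelligent_scan_unlock(
    match_face=lambda: False,
    match_iris=lambda: True,
    match_combined=lambda: False,
)
print(result)  # → unlocked:iris
```

The point of the chain is that each stage only runs when the stronger, more convenient stage before it has failed, so the user pays the cost of extra checks only in bad conditions.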
It isn't clear from the company's literature whether this system uses true iris scanning, which is very secure. It is telling, however, that the company is choosing to include a second biometric element rather than relying on facial recognition alone.

Facial recognition is likely the easiest biometric to spoof. Early versions on phones were fooled by a photograph. Apple's FaceID now uses 3D depth maps to register and verify the physical features of the device holder, which makes it considerably harder to fool, as it requires hackers to reproduce a physical representation of a target's face. FaceID also uses machine learning to analyze your expression whenever it sees your face, allowing it to determine whether an unlock attempt is authentic. Further, it doesn't work if you're not awake. Even with all that, Apple still provides another security check, requiring a good old-fashioned PIN code to prevent someone from siphoning data from a phone unlocked with FaceID.

The ubiquity of photographs means that, like as not, there's a photo of you on the internet, accessible to anyone who cares to look for it. Because phone cameras keep improving, it's even likely that these photos are high-resolution. That makes it much easier for a stranger to develop a spoof that can fool a facial recognition system. By contrast, few people have fingerprint images available online, and far, far fewer (possibly none) have iris or retinal scans online.

All of this is why people should definitely hesitate before moving to any system that relies solely on facial recognition. Facial recognition works best as part of a multi-factor authentication approach – and even then, it is a far weaker factor than either fingerprints or iris and retinal scanning.
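The layered checks described for FaceID – a 3D depth match, an attention check, a wakefulness check, and a PIN as the second factor – amount to a simple conjunction with a fallback. The sketch below is a minimal illustration of that structure under stated assumptions; the parameter names, check order, and PIN handling are hypothetical, not Apple's implementation.

```python
# Minimal sketch of layered biometric checks with a PIN fallback.
# All inputs are hypothetical booleans standing in for real sensor results.
from typing import Optional


def layered_unlock(depth_map_matches: bool,
                   attention_detected: bool,
                   user_awake: bool,
                   pin_entry: Optional[str] = None,
                   correct_pin: str = "000000") -> bool:
    """Unlock only if every biometric check passes; otherwise require the PIN."""
    if depth_map_matches and attention_detected and user_awake:
        return True
    # Biometric path failed or was incomplete: fall back to the second factor.
    return pin_entry is not None and pin_entry == correct_pin


# A matching face alone is not enough if the user is asleep...
print(layered_unlock(True, True, user_awake=False))                      # False
# ...but the PIN still works as the fallback factor.
print(layered_unlock(True, True, user_awake=False, pin_entry="000000"))  # True
```

Requiring every check to pass means a single spoofed signal (say, a depth-mapped mask shown to a sleeping owner) is not sufficient on its own, which is the design rationale the article attributes to FaceID.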