Privacy, data protection regulations clamp down on biometrics use

The highly sensitive nature of biometric data and new regulations aimed at protecting it are reason to rethink how it's used for authentication.

Biometrics are among the most personal forms of information requiring the strictest protections. Ironically, biometrics are increasingly being used as a primary or secondary authentication mechanism to protect access to other sensitive information. Regulators are taking notice.

“The liabilities associated with biometric information are extremely high because you can’t call God up and say, ‘Hey, I need a new fingerprint because mine was stolen,’” says Judy Selby, a partner at Hinshaw & Culbertson LLP who specializes in privacy and cyber insurance.

In the US, regulations in California, Illinois, and New York are considered the gold standard for protecting the collection, use, storage, and reuse of biometric data. In addition, the National Biometric Information Privacy Act was introduced in Congress in August 2020, with the potential to become federal law. While other acts and laws are in the works, these contain the most comprehensive privacy restrictions in the US, comparable to the biometric privacy rules that govern European nations under the General Data Protection Regulation (GDPR).

Biometrics use cases 

The global biometrics market is expected to reach nearly $20 billion this year, while the US market for biometric authentication and identification is predicted to reach nearly $6 billion.

“Biometrics are a great, easy way to replace passwords, but they are best used as a secondary factor in multi-factor authentication. For example, combine facial recognition on my phone with how I use my phone to validate I am who I say I am. This makes it harder for a criminal to use stolen biometrics data,” says Steve Martino, former SVP, CISO at Cisco.
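To make Martino's suggestion concrete, here is a minimal, hypothetical sketch of that kind of layered check in Python: the biometric match is treated as one signal and combined with device and usage signals before access is granted. All names, scores, and thresholds are invented for illustration and do not come from any vendor's API.

```python
# Hypothetical sketch: a biometric match is one factor among several,
# not the sole gate. Names and thresholds are invented for illustration.
from dataclasses import dataclass

@dataclass
class AuthSignals:
    face_match_score: float  # 0.0-1.0 from the device's face matcher
    device_recognized: bool  # is this a previously enrolled device?
    behavior_score: float    # 0.0-1.0 similarity to usual usage patterns

def authenticate(signals: AuthSignals,
                 face_threshold: float = 0.95,
                 behavior_threshold: float = 0.7) -> bool:
    """Grant access only when the biometric AND a second factor agree.

    A stolen face template alone should not be enough: the request must
    also come from a known device behaving like its usual owner.
    """
    biometric_ok = signals.face_match_score >= face_threshold
    second_factor_ok = (signals.device_recognized
                        and signals.behavior_score >= behavior_threshold)
    return biometric_ok and second_factor_ok

# A perfect face match presented from an unknown device is rejected.
print(authenticate(AuthSignals(0.99, device_recognized=False,
                               behavior_score=0.9)))  # False
```

The key design choice is the final AND: a perfect face match from an unrecognized device, which is exactly what a criminal with stolen biometric data would present, is still rejected.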

Outside the US, biometrics are more commonly used for business-to-consumer applications, such as remote customer onboarding, priority in security lines, or automobile access. Whatever the use case, experts say that a company using biometrics for employees or consumers needs to protect that data under the strictest regulatory requirements it is subject to.

“Let’s say the head of a company’s factory needs to replace old time clocks with new fingerprint readers to improve timekeeping efficiency for its employees,” Selby says. “The use of that biometric technology needs to be escalated within that company to the privacy, compliance, legal, and IT security teams to ensure compliance with the regulations and laws that the factory is subject to.”

Biometric data faces the same risks as any other

Biometric data collection, transmission, processing, and storage are subject to the same risks as any other sensitive data that attackers can exploit, says Jay Bavisi, CEO and founder of global security certification firm EC-Council (which includes biometrics hacking in its ethical hacking certification curriculum). “Criminals can capture a biometric during transmission through session hijacking or man-in-the-middle attacks. At the processing layer, biometrics data can be hijacked through keyloggers, denial of service, and exploits on the server,” he notes. “There are risks to biometrics at every layer.”

The problems worsen when other sensitive data is also stored about the individual, such as Social Security number, job title, home address, or banking information. “As more and bigger apps ask for more of our biometric and personal data, it may no longer serve the purpose of authentication and can potentially cause greater harm,” says Michelle Finneran Dennedy, cofounder of Privatus Consulting. “Don’t use biometrics as an authenticator for everything. Instead, use them specifically for the application you said they would be used with.”

Once captured, images or voice recordings can be replicated into a deepfake that criminals use for commercial or espionage purposes. The image itself, then, isn't the secret; the real test is whether a system can tell a genuine face from a deepfake, says Andrew Bud, CEO of iProov, a UK-based facial verification vendor. “What matters in face verification is not whether this secret matches accurately. What matters is whether this is a real face or a forgery. That means the very heart of biometric verification has little to do with face recognition.”

To detect deepfakes in, say, a remote banking application, iProov flashes a sequence of lights on the user's face through the user's device. The response to the sequence proves the skin is real and actively moving with live expressions. The light sequence is designed to be safe for photosensitive people and to make users feel they are being properly scanned for authentication, Bud adds.
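The details of iProov's system are proprietary, but the underlying challenge-response idea can be sketched as follows. Assume a server issues a one-time random color sequence that the device flashes on screen; a pre-recorded or replayed video cannot predict a fresh sequence. Everything here (function names, sequence length, freshness window) is an assumption for illustration, not iProov's actual algorithm.

```python
# Minimal sketch of a challenge-response liveness check, NOT iProov's
# actual (proprietary) algorithm. The server issues a one-time random
# color sequence; a replayed or pre-recorded video cannot predict it.
import secrets
import time

COLORS = ["red", "green", "blue", "white"]

def issue_challenge(length: int = 8) -> dict:
    """Generate an unpredictable, short-lived flash sequence."""
    return {
        "sequence": [secrets.choice(COLORS) for _ in range(length)],
        "issued_at": time.time(),
    }

def verify_response(challenge: dict, measured: list[str],
                    max_age_seconds: float = 30.0) -> bool:
    """Accept only a fresh response that matches the issued sequence."""
    fresh = time.time() - challenge["issued_at"] <= max_age_seconds
    return fresh and measured == challenge["sequence"]

challenge = issue_challenge()
# In a real system, `measured` would be recovered by analyzing the light
# reflected off the face in camera frames; here we simulate a live user.
print(verify_response(challenge, list(challenge["sequence"])))  # True
```

In practice, recovering the measured sequence from the reflections on the user's skin is also what proves the face is real and three-dimensional rather than a screen or a mask.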

Because of its large client base in Europe, iProov also automates two of the top requirements of leading privacy laws: permission and transparency. During registration, it obtains written permission to collect and use the biometric data and explains to users how the data will be used. This creates a record for regulators as well.
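As a rough illustration of what automating permission and transparency might produce, the sketch below builds a consent record capturing what the user was shown, what purpose they agreed to, and when. The fields are assumptions based on common GDPR- and BIPA-style requirements, not iProov's actual schema.

```python
# Hedged sketch of a consent record; fields are assumptions based on
# common GDPR/BIPA-style requirements, not any vendor's real schema.
import json
import uuid
from datetime import datetime, timezone

def record_consent(user_id: str, purpose: str, disclosure_text: str) -> dict:
    """Create an auditable record of what the user agreed to, and when."""
    record = {
        "consent_id": str(uuid.uuid4()),
        "user_id": user_id,
        "purpose": purpose,                   # the ONLY use the data may serve
        "disclosure_shown": disclosure_text,  # exactly what the user was told
        "granted_at": datetime.now(timezone.utc).isoformat(),
    }
    # Persisting the record gives regulators an audit trail; here we
    # just print it in place of a database write.
    print(json.dumps(record, indent=2))
    return record

record_consent(
    user_id="user-123",
    purpose="face verification for account login",
    disclosure_text="We will capture your face and use it only to log you in.",
)
```

Keeping the purpose field alongside the disclosure text is what lets an auditor later check that the data was never used beyond what the user agreed to.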

“Whether it happens to be a fingerprint, a facial image, or a voice-recognition pattern, a biometric, at the end of the day, is just data used to authenticate me a particular way,” Martino adds. “As such, biometrics are subject to privacy in the broader concept: Be transparent about what you’re going to do with the data, get sign-off from the owner of that data, and only use that data in the way you say you will.”

Knowledge, consent, personal benefits of using the application, and privacy protections should all be part of the security and privacy policies, Bud says. For those purchasing biometrics applications and hardware, he suggests talking to the vendor about their responsibilities and the buyer’s responsibilities to comply with regulatory requirements in their regions.

Verification is different from recognition

People are already giving away their faces and voices when they post their images to social media, webcasts, YouTube, and recorded video calls. How is this different from allowing the use of their biometrics to get into their workplace or authenticate to their banking application?

“There is a massive difference between facial or voice recognition and biometric verification. Biometric verification happens when someone knows about it and consents to it because they get a personal benefit,” says Bud. “For example, during sign-up for an online banking service, the bank says: we are going to capture your face now, we will use it for your account logins, are you OK with that? If so, click ‘Yes.’”

Face recognition, on the other hand, happens when a security monitoring system at a railway station identifies that you walked through the station at 7:47 am, Bud continues. “Do I know? No. Do I benefit from it? No. Do I know if my privacy is being protected? Absolutely not,” he says. “But these actions are not subject to the same laws as using these images for biometric verification.”

There are gray areas to this rule of thumb, however. In January 2020, Facebook agreed to a $550 million settlement in a class action suit filed under the Illinois Biometric Information Privacy Act (BIPA), one of the laws that allows citizens to file civil suits for financial settlements. In this case, Facebook wasn't using the images for biometric authentication, but it was capturing facial images for its photo-tagging feature without user knowledge or consent. It's the knowledge and consent requirement that Facebook violated, experts say.

“Privacy in general is largely being transparent about what you’re going to do with the data I shared with you. On the one hand, if I’m an employee and I shared certain data for access and authentication, that employer has obligations of what they will do with the data. Anything else is a violation of privacy laws,” Martino explains. “On the other hand, if I’m using Facebook for free, I should have expectations that they are going to do certain things with my data. But Facebook should not do things with that data outside of what they explicitly say they will use the data for. That’s the core premise of what privacy is about.”

Copyright © 2020 IDG Communications, Inc.
