
Spotlight: The Limitations of Biometric Verification

Though facial recognition and other biometric verification processes are becoming ever more popular in the fintech world, bias remains rife within the technology, causing gaps in service for some customers. 

Rick Song, CEO, Persona

Persona is an identity infrastructure platform that enables companies to verify that their customers are who they say they are, and to carry an individual’s identity across an entire business. The idea for Persona came to CEO Rick Song while he was working as an engineer at Square, where he saw how challenging it was for the payments leader to manage customer identities across its growing portfolio of products. That’s when the market opportunity became clear.

To learn more about identity verification, its application in the fintech industry and how challenging bias can be to overcome, we sat down with Song.

What problem is Persona trying to solve?

We create trust in the online world by enabling every business to verify that each user is who they claim to be throughout the customer lifecycle. Our customer Coursera must ensure that every recipient of a degree is actually the student who completed the course. In financial services, BlockFi uses Persona to protect customers from fraudulent withdrawals by bad actors.

But most companies today still treat identity as an isolated issue. They still use manual, time-consuming, and error-prone methods to confirm identities. They verify user identity for one use case, attach that person to a set of credentials for future interactions, and call it a day.

That’s why we built the industry’s first true identity infrastructure platform that can be customised for all use cases and types of businesses—from payment platforms to e-commerce companies to marketplaces.

What are the opportunities that biometric verification presents in fintech?

Biometric verification offers convenience: it’s often easier than passwords, and always ready to use because we take our faces, eyes, and fingerprints everywhere. And while biometrics were previously more expensive than other verification methods, the price differential is evaporating and there is now near cost parity with SMS. SMS, by contrast, isn’t foolproof: there are ploys and techniques for stealing SMS authorisation codes, enabling accounts to be stolen or hijacked.

Biometrics also make it much harder for unauthorised users to access your accounts and services. When your kids at college piggyback on your Netflix account, it’s probably no issue, but a stowaway in your stock brokerage account is a major problem, and an illegal one.

What are the limitations of biometric verification in fintech?

Accessibility is an issue for biometric verification because it requires equipment. If your laptop or phone camera doesn’t work, you might not be able to access your account if that’s the only medium a company allows you to use to log in. Also, the hardware must be of high enough quality. Not everyone will have the right equipment, so it’s important for financial institutions to allow for backup verification options—or they could risk losing legitimate customers.

Deepfakes are another problem for facial recognition, which is why financial institutions should not depend exclusively on biometric selfies to verify identities. Then there’s theft. Your fingerprints, once lost or stolen, are permanently compromised: you cannot change your fingerprint and shrug off the loss. Eventually, stolen biometric data may resurface in a targeted attack and compromise your accounts.

Every security technology has its limitations. The key with biometrics is to apply them as part of a holistic solution. For example, banks can use Persona for biometric selfies, but combine this verification method with other signals that can pick up on suspicious users. They can also step up scrutiny—and apply multi-factor authentication—on specific user activity.
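The layered approach Song describes, a baseline biometric check combined with other risk signals that trigger step-up authentication, can be sketched in code. The signal names, risk weights, and thresholds below are purely illustrative assumptions, not Persona’s actual logic:

```python
# Illustrative sketch of risk-based step-up verification.
# All signal names and thresholds are hypothetical.

def required_checks(signals: dict) -> list:
    """Return the verification steps to require for this session."""
    checks = ["biometric_selfie"]  # baseline check for everyone
    risk = 0
    if signals.get("new_device"):
        risk += 2  # unfamiliar hardware
    if signals.get("ip_country") != signals.get("account_country"):
        risk += 2  # geographic mismatch
    if signals.get("large_withdrawal"):
        risk += 3  # high-value activity
    # Step up scrutiny with extra factors as risk grows.
    if risk >= 2:
        checks.append("one_time_passcode")
    if risk >= 4:
        checks.append("government_id_check")
    return checks

session = {"new_device": True, "ip_country": "GB",
           "account_country": "GB", "large_withdrawal": True}
print(required_checks(session))
# → ['biometric_selfie', 'one_time_passcode', 'government_id_check']
```

The point is that a routine login from a known device gets only the baseline check, while riskier activity accumulates signals that trigger additional factors.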

What are the challenges for biometric verification?

Environmental factors can interfere. Good lighting is important for accuracy in facial matching. Background noise and masks can impede voice matching—so can the flu, actually. If your hands are wet, the fingerprint reader may not work properly. On the other hand, a password is good all the time—as long as you’re the only one who knows it.

Biometrics have also become more spoofable with time. Five years ago, deepfakes were unheard of. It takes ongoing investment to keep up with evolving fraud techniques.

Biometric data is also tougher to handle and store than passwords. It’s dense, and storing it securely and compliantly requires knowing what to delete, and when. Because a lot of biometric data is likely to be breached and stolen over time, multi-factor security should be the framework within which biometrics sit.

What happens when biometrics gets it wrong?

False positives occur from time to time—some people look alike, so an intruder may have a chance to fake their way in if they happen to be an identical twin. However, a random iPhone thief from the population at large has only the remotest chance of defeating the Face ID system.

There are also issues with certain methods, such as voice, which provides relatively weak security compared to other biometrics. Yes, it shows you are alive, but voice verification requires consistent conditions. A sore throat, poor microphone quality, or a COVID mask could foil correct matching. It can be difficult to obtain good control data for voice mapping, and good control data is always an important requirement for biometrics.

That’s why it’s critical to have a broad set of verification methods, so if one fails, you have other ways of verifying the user in question.
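That fallback idea can be sketched as a simple chain that tries each verification method in turn. The method names and failure modes below are hypothetical, for illustration only:

```python
# Fallback chain: if one verification method fails or is unavailable
# (e.g. broken camera), try the next. All names here are hypothetical.

def verify_user(user_id, methods):
    """Return the name of the first method that verifies the user."""
    for name, check in methods:
        try:
            if check(user_id):
                return name
        except RuntimeError:
            # Hardware or environment problem: skip to the next method.
            continue
    return None  # all methods exhausted; escalate to manual review

def selfie_check(user_id):
    raise RuntimeError("camera unavailable")  # simulated hardware failure

def document_check(user_id):
    return True  # simulated successful ID document match

result = verify_user("user-123", [("selfie", selfie_check),
                                  ("document", document_check)])
print(result)  # → document
```

Here the selfie check fails because the camera is unavailable, but the user is still verified via a document check rather than being locked out.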

How much do bias issues factor into facial recognition tech?

Biometric verification depends on AI, and AI accuracy depends on the data used to train the model. If a system rejects legitimate users who have darker skin colour, it’s likely because the training data lacked sufficient representation of the diverse user population. As a result, the model’s training is incomplete.

We’ve dug deep into the challenges of face verification. We audited our own facial verification results. With better data, and directions for users to ensure the necessary lighting, we achieved 30% higher accuracy—meaning much less bias. We continue working on data quality so the system is less and less likely to give a false non-match for anyone.

How can we solve these issues? What are the challenges involved?

While AI may never be perfect, there are ways to address the bias in facial recognition: better lighting, cameras that adjust for facial shots, and most importantly, better—more representative—training of AI systems. We can also guide users to improve lighting settings on their phone before they snap the selfie required for verification.

What is the future of biometric IDV in the financial sector?

We’re working toward the day when you can prove who you are online without needing to give away so much personal data. Our mission is to make that a reality by becoming the identity layer of the Internet. We help companies store and manage data within a secure infrastructure to limit how much sensitive personal information they have to ingest into their platforms.

Final thoughts?

In the financial industry, which includes physical and online banks as well as stock brokerages, lenders, and insurers, bad actors have almost unlimited motivation to impersonate someone else in order to gain access and steal or defraud.

And the victims aren’t just the individuals fraudsters are attacking — these attacks can damage companies’ reputations and make it hard to attract and retain customers. According to Twilio, 86% of consumers said they’d stop using a business if their account was compromised.

Identity verification is a powerful way to help protect businesses and customers — and build trust. But while biometric verification is useful, it’s not foolproof. Instead, it needs to be part of a holistic set of signals businesses use to know their customers.

At Persona, we give businesses the building blocks they need to design and build the ideal identity experience for their specific industry, use case, and customers. Organisations can easily switch up verification methods or add more friction for extra assurance. This is increasingly important to make things unpredictable for hackers, who never, ever rest. With our holistic set of verification options and ability to dynamically step-up or step-down friction based on risk signals, businesses can ultimately make identity verification painful for fraudsters and painless for customers — in essence, mitigating fraud without hurting conversion.

Author

  • Polly is a journalist, content creator and general opinion holder from North Wales. She has written for a number of publications, usually hovering around the topics of fintech, tech, lifestyle and body positivity.
