Generative AI Makes Current Biometric Security for Payments Obsolete

Financial Institutions Must Move Quickly to Counter Innovative New Fraud Techniques

Biometrics have long been viewed as the solution for securing payments, with passwords giving way to voice and facial recognition. Now, though, generative AI is enabling new techniques that may make those biometrics obsolete. The challenge for payments firms is how to ensure secure transactions when their existing systems are no longer adequate in the age of AI.

Current Security for Payments

Cards and payments have come a long way since the magnetic stripe and a signature were leading-edge security measures. The one-time password (OTP) was added a couple of decades ago as an ingenious cybersecurity innovation, and it worked for a while.

More recently, voice biometrics have been used to authenticate customers, from account opening KYC (Know Your Customer) and information access to transactions. Leading voice biometric systems even use tools such as active and passive authentication or liveness detection to verify customers.
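
To illustrate the mechanics at a high level, the sketch below shows the comparison step most voice biometric systems rely on: an embedding (voiceprint) derived from an enrolled recording is compared with an embedding from the new sample, and the caller is accepted if the similarity clears a threshold. This is a minimal illustration, not any vendor's API; the embeddings are assumed to come from a separate speaker model, the function names are made up, and the 0.8 threshold is arbitrary.

```python
# Illustrative sketch of a voiceprint comparison step, not a vendor API.
# The embedding vectors are assumed to come from a speaker-embedding model
# supplied by the biometric vendor; liveness detection happens separately.
import numpy as np

def is_same_speaker(enrolled_voiceprint: np.ndarray,
                    candidate_voiceprint: np.ndarray,
                    threshold: float = 0.8) -> bool:
    """Accept the caller if the cosine similarity of the two voiceprints
    clears the threshold (0.8 is an arbitrary illustrative value)."""
    similarity = float(
        np.dot(enrolled_voiceprint, candidate_voiceprint)
        / (np.linalg.norm(enrolled_voiceprint) * np.linalg.norm(candidate_voiceprint))
    )
    return similarity >= threshold

# Example with made-up embeddings: a genuine caller scores high,
# while a different voice scores much lower.
enrolled = np.array([0.2, 0.7, 0.1, 0.6])
genuine = np.array([0.22, 0.68, 0.12, 0.58])
impostor = np.array([0.9, 0.1, 0.4, 0.05])

print(is_same_speaker(enrolled, genuine))   # True
print(is_same_speaker(enrolled, impostor))  # False
```

The weakness discussed in this article is precisely that a sufficiently good AI-generated clone can produce an embedding close enough to the enrolled voiceprint to clear such a threshold.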

The integration of voice biometric technology is often relatively simple from a technical standpoint, and banks around the globe have used it to speed up customer verification. Some banks even promote voice identification as equivalent to fingerprint identification, said SDK.finance CEO Alex Malyshev. The fast-growing market for voice-based services is expected to reach US$3.7 billion by 2031.

Some banks have also begun to use facial recognition for account access or payments, verifying a consumer’s identity by comparing a government-issued ID with a live selfie. Fans at the Qatar World Cup used facial biometrics for payments through a Visa partnership with Pop ID, for instance, and Mastercard also partnered with the firm for its Biometric Checkout Program.

The Risks from Fraud Syndicates

Just as signatures and passwords lost their effectiveness as security measures in previous decades, biometrics are now losing their value as a secure form of authentication.

John Ainsworth, the CEO of CUSO, highlighted that SMS one-time password technology has remained essentially unchanged for the past two decades.

One-time password (OTP) bots, for instance, have rapidly gained notoriety for their ability to compromise even well-fortified systems and circumvent security measures.

As biometric authentication becomes ubiquitous in financial services, Keyless Technologies COO Fabian Eberle wrote that bad actors are finding opportunities to exploit these systems.

A growing threat, according to Speech Technology Magazine, is the evolution of synthetic voice attacks by fraudsters. Synthetic voices generated by AI have evolved to the point where they are indistinguishable from real voices and need only a minute of voice data to produce realistic results. These deepfakes can take the form of videos, images or recordings that are convincingly altered and manipulated to misrepresent someone.

Organisations that still rely on security questions, PINs, and passwords are falling behind in the battle against fraud, Nuance Communications Vice President Brett Beranek opined. “They may be in a lot of trouble.”

The New Risks with Generative AI and Biometrics

Generative AI such as ChatGPT has put these growing threats on steroids, and fraud that already had significant momentum is accelerating even faster.

“A growing percentage of what we are looking at is not authentic,” according to the US Federal Trade Commission (FTC), which added that “it is getting more difficult to tell the difference because fraudsters can use tools to generate realistic yet fake content quickly and cheaply.” AI tools that create synthetic media and generate content are becoming easier to use, and thanks to sites like Facebook, Instagram, and YouTube, there are plenty of images and audio recordings for fraudsters to find.

A Wall Street Journal reporter, for instance, used ElevenLabs to clone her voice and fooled Chase’s credit card voice biometric system into connecting her clone to a representative.

Haywood Talcove, chief executive of LexisNexis Risk Solutions’ Government Group, said that financial companies need to stop using voice-identification tools to unlock accounts, because voice alone no longer works. The same goes for facial recognition, he said, adding that the technology was at the end of its useful life.

A variety of examples show the creativity of people using generative AI to perpetrate fraud and how easy it is to use the new tools.

One example, shared by Oliver Tearle, Head of Innovation Technology at the AI Corporation, is fraudsters creating fake websites that offer free downloads of ChatGPT. The downloads act as a cover for installing malware on the victim’s device, giving fraudsters access to personal data or passwords. Fraudsters also use generative AI-powered scams to take over social media accounts, which lets them target victims’ contacts in a trusted environment and ask close friends or family for money. They may also use social engineering to extract further sensitive information, such as the passwords and secret answers used for online banking logins. Fraudsters also use generative AI to bypass verification checks by creating convincing fake identity documents, realistic enough to pass the standard checks involved in opening a bank account and then applying for credit, BNPL (Buy Now, Pay Later) or worse.

Doriel Abrams, the Head of Risk in the US at Forter, a fraud prevention provider, said he was able to use AI tools to generate software capable of producing hundreds of thousands of counterfeit credit card numbers within minutes.

And whereas fraud used to require specialised expertise, Financial IT said generative AI has led to the democratisation of fraud: anyone can now perpetrate it easily.

New Solutions

While generative AI can make fraud easier for criminals to perpetrate, legitimate organisations can also use it to fight back and prevent fraud. As one example, fintech Paymob observed, generative AI can increase the reliability of voice systems by analysing vast amounts of voice data and identifying speech patterns or trends to improve voice recognition accuracy. Generative AI can also enhance voice-activated payment processing systems with real-time fraud detection and prevention capabilities so they can authenticate users’ identities and securely process payments.

Shlomit Labin, Shield’s VP of Data Science, explained that relying solely on fraud detection measures is inadequate for risk mitigation. Financial institutions also need to enhance their capabilities to detect indications of fraud by closely examining human behaviour. Generative AI-powered behavioural biometrics are already analysing how individuals interact with their devices, Labin noted, such as the angle at which they hold them, how much pressure they apply to the screen, directional motion, surface swipes, typing rhythm and more.
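
As a rough illustration of how such behavioural signals can be turned into a fraud indicator, the hypothetical sketch below trains an anomaly detector on a user’s past sessions and flags a session whose typing and touch characteristics look very different. The feature set, the sample values and the use of scikit-learn’s IsolationForest are assumptions made for illustration; production systems rely on far richer telemetry and vendor-specific models.

```python
# Illustrative sketch of behavioural-biometric anomaly scoring.
# Feature columns and values are assumptions, not real telemetry.
import numpy as np
from sklearn.ensemble import IsolationForest

# Each row: [avg key hold time (ms), inter-key interval (ms),
#            swipe speed (px/s), touch pressure (0-1)]
user_history = np.array([
    [110, 180, 950, 0.42],
    [115, 175, 900, 0.45],
    [108, 190, 980, 0.40],
    [112, 185, 940, 0.44],
    [109, 178, 910, 0.43],
])

# Fit on the user's historical sessions, then score the current one.
model = IsolationForest(contamination="auto", random_state=0).fit(user_history)

current_session = np.array([[65, 40, 2400, 0.95]])   # very unlike the history
score = model.decision_function(current_session)[0]  # lower = more anomalous
is_outlier = model.predict(current_session)[0] == -1 # -1 means outlier

print(f"anomaly score={score:.3f}, flag for review={is_outlier}")
```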

Eberle similarly said that complementing biometric solutions with additional security measures, such as behavioural analysis that leverages device identification, geolocation or patterns in the user’s behaviour, can enhance the authentication process.
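
A simplified sketch of that layering idea is shown below: a biometric match score is combined with device, location and behavioural signals into a single risk value that drives an approve, step-up or decline decision. The signal names, weights and thresholds are illustrative assumptions, not a recommended policy or any particular vendor’s logic.

```python
# Illustrative sketch of layering extra signals on top of a biometric match.
# Weights and thresholds below are arbitrary assumptions, not a real policy.
from dataclasses import dataclass

@dataclass
class AuthSignals:
    biometric_score: float  # 0-1, from the voice/face matcher
    known_device: bool      # device fingerprint seen before
    geo_plausible: bool     # location consistent with recent activity
    behaviour_score: float  # 0-1, from a behavioural-biometric model

def risk_score(s: AuthSignals) -> float:
    """Combine signals into a single risk value between 0 (low) and 1 (high)."""
    risk = 1.0 - s.biometric_score
    risk += 0.0 if s.known_device else 0.3
    risk += 0.0 if s.geo_plausible else 0.3
    risk += (1.0 - s.behaviour_score) * 0.4
    return min(risk, 1.0)

def decide(s: AuthSignals) -> str:
    r = risk_score(s)
    if r < 0.3:
        return "approve"
    if r < 0.6:
        return "step-up"   # e.g. require an additional authentication factor
    return "decline"

print(decide(AuthSignals(0.92, True, True, 0.85)))    # approve
print(decide(AuthSignals(0.95, False, False, 0.30)))  # decline
```

The point of such layering is that a cloned voice or face alone is no longer enough: an attacker would also have to match the customer’s device, location and behaviour to pass.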

Preventing Fraud is More Complex

Generative AI is clearly creating greater risks for fraud in payments, and biometric fraud prevention technology could become outdated quickly.

Financial institutions, from small fintech startups to large multinationals, must embrace generative AI to prevent the fraud that is heading their way. Acting swiftly is crucial to protecting their interests.

Author: Richard Hartung, Associate, Singapore, Payments Consulting Network

Richard Hartung is an experienced professional with over 20 years of expertise in the payments and consumer financial services industry, specifically in the Asia Pacific region. He has held key roles at organisations such as Citibank, Mastercard, and OCBC Bank, and has established his own consultancy, Transcarta, to assist financial services companies with strategy, operations improvement, and market research.
