Do ‘Deepfakes’ Mean Real Trouble for Voice Banking?


The Walls Have Ears

Is your Alexa hanging around your home bored, listening, and resorting to the occasional midnight shenanigans to remind you that she is there? Has Alexa become shelfware? While smart speakers have been all the rage in recent years, there are signs that consumer enthusiasm towards their home-based voice assistants may be waning. Curious, considering that the banking industry is rapidly moving to voice banking and payments services designed to leverage your digital assistant and your voice to reduce friction and expedite banking interactions. But, with this dynamic on the rise, are financial institutions about to discover the human voice is frighteningly susceptible to fraud?

And, equally importantly, are digital identity-based user verification and assessment technologies really the answer?

Here’s what we know. Today, voice interfaces like Alexa, Siri, and Cortana are being adopted at a faster rate than almost any other consumer technology in history—including smartphones—according to Accenture. Astonishingly, one-half of all global online consumers have them. Think about that; by some estimates there will be 4 billion people using the internet by the end of 2019 and 50% of them (2 billion people) will use a voice assistant! And as Recode reports, up to one third of the US population already owns a smart speaker like Echo or HomePod, and the number of installs is expected to double over the next two years, to 225 million.

“The Sound of Silence”

But here’s the thing: They are not that sticky and after about a month, people don’t seem to use them as much. And understandably, there are the growing privacy concerns. But in truth, home may not be the most compelling environment for voice services—and it’s starting to show. Sure, it’s cool to order your pizza, groceries, or even dog food through your home smart speaker a few times but after that, the data shows that the home digital assistant goes silent. Today, at least 45.7 million US adults have a voice assistant-enabled smart speaker at home. But 90.1 million of them have voice assistants on their smartphones. And 77.1 million already have voice assistants in their cars. This is the key for financial institutions: while Alexa may be silent at home, she and her friends, “Siri,” “Erica,” and “Assistant,” do like to travel and assist us with our voice needs. So, it’s no surprise that financial institutions are racing to roll out voice-enabled, or “conversational,” banking and payments services. And out-of-home voice is definitely the focus of their attention.

Take This Show on the Road

To those just tuning into the voice assistant phenomenon, conversational banking and payments services are advanced voice bots that combine artificial intelligence and natural language processing capabilities to enable users to utter commands instead of entering them manually through a keyboard.

TD Bank’s latest Alexa “skill,” as these abilities are called, enables customers to place stock trades verbally. And Bank of America’s “Erica,” the AI-driven voice assistant on its mobile app, is even able to analyze customer banking patterns to proactively offer advice on controlling spending during the day. Think of her as your own traveling virtual CFO.

Still, while the ability to verbally ask for directions to the nearest ATM using your phone, smart watch, or hip new “audio sunglasses” may be cool, voice assistants increasingly want to join your carpool, too. And it’s easy to see why.

Today, the average American has a 51-minute round-trip commute each weekday, according to a new report from PYMNTS. More than 53% of commuters use voice assistants during those commutes. They also spend a whopping $230 billion per year along the way.

So it is little wonder most voice assistant manufacturers want to take this show on the road and make their applications compatible with automotive infotainment systems. But as PYMNTS reports, Google has also struck deals with Audi and Volvo to embed its AI directly into the autos’ native systems, positioning the company well for an emerging “connected car ecosystem.”

Banks and payments providers will no doubt follow suit. But they should expect to hit some speed bumps on the way.

Fraud Finds New Voice

Without a doubt, in-car eCommerce and banking could be huge, enabling food and entertainment purchases, as well as routine banking, for consumers on the move. In most cases, voice will be used as a biometric to confirm identity. So, it’s not like commuters will be shouting their personal identity information or login credentials while they’re weaving through traffic. But authenticating using voice is a whole different ball game and a potentially lucrative threat vector.

So, whether it’s at home, on the sidewalk, or out on the open road, these services can count on the fact that emerging “deepfake” technologies will soon be used to defraud banking and payments providers by helping fraudsters impersonate customers like never before. Think about it: all a fraudster needs is a small recording of your voice, captured as simply as by your answering a call from an unknown number or by the call being forwarded to your voicemail.

It’s pretty mind-blowing, actually. Using these AI-based tools, you can create audio and even audio-visual content of anyone, appearing to say or do anything—in their own voice and likeness. It’s so convincing, the Wall Street Journal is training journalists to detect deepfakes for what’s sure to be the next wave of “fake news.” But financial institutions could be facing something far worse: fake voice authentication.

Here’s the gotcha: What happens when cyberthieves use “deepfake” tech to turn Alexa, Erica, Siri, or some other digital voice assistant into an accomplice in account takeover (ATO) attacks through voice-enabled services or call-center scams?

The Voice of Reason

For financial institutions, distinguishing the real from the surreal isn’t just a matter of training staff or deploying anti-deepfake technologies to counter each new advance. As adoption of voice banking accelerates, banks will find they can still depend on digital identity-based user verification and assessment solutions that combine big data analytics, smart rules, and machine learning to accurately authenticate customers.

That’s because these solutions don’t differentiate between fraudsters and legitimate customers based solely on a password, a voice match, or even both. Thanks to an endless stream of data breaches, login credentials are a dime a dozen—and biometrics, as we’re learning, can be manufactured or pilfered. Instead, digital identity-based solutions take all of that data and analyze it in real time against hundreds of different, dynamic identity elements that can’t be faked or stolen.
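The idea above can be illustrated with a minimal sketch: instead of trusting any single credential, the decision weighs many independent identity signals at once. The signal names, weights, and threshold below are illustrative assumptions, not any vendor’s actual scoring model.

```python
# Hypothetical sketch: scoring a login attempt against multiple identity
# signals rather than a single credential or voice match.

def risk_score(signals: dict) -> float:
    """Combine independent identity signals into a single risk score (0 to 1)."""
    weights = {
        "credential_match": 0.15,    # passwords alone are weak evidence
        "voice_match": 0.15,         # biometrics can be spoofed (deepfakes)
        "known_device": 0.25,
        "typical_location": 0.25,
        "behavior_consistent": 0.20,
    }
    # Each signal is True (consistent with the legitimate user) or False.
    trust = sum(w for name, w in weights.items() if signals.get(name))
    return round(1.0 - trust, 2)  # higher score = higher risk

attempt = {
    "credential_match": True,
    "voice_match": True,        # could be a convincing deepfake...
    "known_device": False,      # ...but the device, location, and behavior
    "typical_location": False,  #    don't fit the real customer at all
    "behavior_consistent": False,
}
print(risk_score(attempt))  # 0.7 -> step up authentication or deny
```

The point of the sketch: a perfect voice match barely moves the needle when every other element of the digital identity disagrees.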

Those that combine online and offline identity information with globally shared threat intelligence are able to cut through the noise and accurately recognize a good customer even when, for example, a new mobile phone, voice biometric, or other transactional attribute has just been added to a bank or payments account for the first time. Imagine a service that can discern in real time that the Paris-based customer who checked her bank account balance two minutes ago is not the same person attempting to withdraw money from the account via an in-car, voice-based transaction from Bakersfield.
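The Paris/Bakersfield scenario is a classic “impossible travel” check: if two account events imply a travel speed no airliner could achieve, the second event is flagged. Here is a minimal sketch of that logic; the coordinates and the 1,000 km/h threshold are assumptions for illustration, not a description of any specific product.

```python
# Hypothetical "impossible travel" check: flag a second account event if
# reaching it from the first would require implausibly fast travel.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def impossible_travel(ev1, ev2, max_kmh=1000):
    """True if moving between the two events would exceed max_kmh."""
    dist = haversine_km(ev1["lat"], ev1["lon"], ev2["lat"], ev2["lon"])
    hours = abs(ev2["t"] - ev1["t"]) / 3600  # timestamps in seconds
    return hours == 0 or dist / hours > max_kmh

paris = {"lat": 48.8566, "lon": 2.3522, "t": 0}            # balance check
bakersfield = {"lat": 35.3733, "lon": -119.0187, "t": 120}  # withdrawal, 2 min later
print(impossible_travel(paris, bakersfield))  # True -> not the same person
```

In practice this is just one of the many dynamic identity elements such solutions weigh, but it shows why location and timing are so hard for a fraudster to fake convincingly.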

Talk, Fast—and Safely

Banking and payments providers racing to roll out voice services without modern, digital identity-based user authentication will want to step things up. As it stands now, even old-school voice fraud is up 350% since 2013, so it’s easy to see how deepfakes could be used to fool human call center operators or voice-based biometrics systems into granting thieves access to user accounts. With voice on the rise, nobody wants to see it silenced over fear of fraud.

To learn more about how digital identity-based user verification and assessment solutions can help prevent ever-evolving forms of mobile and online banking fraud, check out this case study.

The post Do ‘Deepfakes’ Mean Real Trouble for Voice Banking? appeared first on ThreatMetrix.
