Not sci-fi, but real life: Samsung introduced a voice-cloning feature for its new Galaxy S phones this week. Train the Bixby mobile assistant on your S23, S23+, or S23 Ultra, and it will convincingly mimic you during calls. The feature is marketed as a way to answer calls even when you can't speak: you type your responses, and the phone reads them to the other party in your voice.
But there's a flip side to this stuff of sci-fi dreams. AI-trained voice imitation can also be used for nefarious purposes, as Motherboard demonstrated on the same day as the Samsung news. Using AI-generated imitations of his own voice, writer Joseph Cox got through his bank's automated voice verification system.
Experts have warned for years about the weaknesses of voice authentication; long before AI-based tools, such systems could be bypassed with recordings. But those tricks usually involve deceiving a mark into saying the right phrases, such as manipulating them into answering "Yes" to a simple question. AI changes the game, because no interaction is needed to create an imitation. It also doesn't take much of your voice to train these services; just a few minutes is enough.
In the short term, the risk is not yet widespread. For this kind of deception to happen without your knowledge, your voice must be publicly available. It also takes some work: Motherboard initially failed to fool Lloyds Bank with the output from ElevenLabs, the tool it used. Many tests and revisions may be needed before the output passes as legitimate. And it probably has to be combined with number spoofing or something like Samsung's Bixby call feature if a bank or other sensitive service is monitoring for other signals that you are you.
But that doesn't mean you can rest easy. Motherboard's experiment is a clear sign that biometric authentication should be treated with the same care as any other type of account or device protection. In this context, biometric data is a set of physical passwords, and like any password, if that information is exposed or poorly secured, your security is weakened.
Your voice can be recorded. Your face can be photographed. Your fingerprint can be used without your knowledge. You don't have to abandon these methods to stay safe, but their convenience makes them weaker forms of defense. Think of them as the equivalent of short passwords: they'll deter some people, but others can bypass them with ease.
The smart approach is to reserve biometric authentication for low-stakes situations: a PC where you don't stay logged in to sensitive sites, a phone where important apps are protected by a PIN or password, and so on. And if you can't pair your voice or face with stronger measures? Disable it, if you can.
At the very least, if your bank uses voice verification, ask for a second authentication factor. AI tools may not nail impersonation right out of the gate, but they will keep improving, and the threat will grow with them. An ounce of prevention is worth a pound of cure, especially if you like to stream or post videos.