A Brooklyn couple got a call from relatives who were being held for ransom. Their voices, like many others these days, had been cloned.
You wake up to a call in the middle of the night. On the other end of the line, you hear a familiar voice: your mother’s, wailing and repeating the words “I can’t do it, I can’t do it.” Then another voice comes on, one you’ve never heard before, and says, “You’re not gonna call the police. You’re not gonna tell anybody. I’ve got a gun to your mom’s head, and I’m gonna blow her brains out if you don’t do exactly what I say.”

Charles Bethea tells the chilling story of a couple in Brooklyn who faced this situation, and quickly paid a ransom through Venmo, only to receive yet another shock when they realized they’d been snared in an increasingly common scam. Bethea follows the rise of A.I.-based voice cloning, which is being used in all kinds of devious ways, from political manipulation to petty theft. “We’ve now passed through the uncanny valley,” one expert explains. “I can now clone the voice of just about anybody and get them to say just about anything. And what you think would happen is exactly what’s happening.”