The talk of last week was a Forbes post reporting (via the Wall Street Journal) the story of a deepfake voice scam. The CEO of an unnamed UK-based energy firm believed he was on the phone with his boss, the chief executive of the firm’s German parent company, when he followed his orders to immediately transfer €220,000 (approx. $243,000) to the bank account of a Hungarian supplier.
The voice belonged to a fraudster using AI voice technology to spoof the German chief executive. Rüdiger Kirsch of Euler Hermes Group SA, the firm’s insurance company (which covered all of the loss), shared the information with the Wall Street Journal. He explained that the CEO recognized the subtle German accent in his boss’s voice—and moreover that it carried the man’s “melody.”
According to Kirsch, the as yet unidentified fraudster called the company three times: the first to initiate the transfer, the second to falsely claim it had been reimbursed, and a third time seeking a follow-up payment. At this point the victim grew skeptical; he could see that the purported reimbursement had not gone through, and he noticed that the call had been made from an Austrian phone number.
While he did not send a second payment, the first had already gone through; the money was moved from the Hungarian bank account to one in Mexico and then disbursed to other locations.
Kirsch told WSJ that he believes commercial software in the vein of that used by the AI startup company Dessa was used to spoof the German executive’s voice—which, if true, makes this the first known instance of AI voice mimicry being used for fraud (though it is of course possible other such instances have occurred).
The fraud in question purportedly occurred in March, two months before Dessa’s video impersonating comedian and podcaster Joe Rogan went viral. You can watch that video, which is embedded in Naked Security’s version of the story.
Because no suspects have been identified, little is known about what software was used or how the fraudster gathered the voice data necessary to spoof the German executive—but this case reveals one of the many possible ways AI can be weaponized.
From my foxhole, audio deepfakes will be a tool used for political purposes and for scamming money. Dealing with the former is quite a head scratcher. As for the more conventional deepfake audio scams, if you haven’t changed your funds-wiring process to include a callback to the authorized person at a known good number, it’s time to get that task checked off. Unless, of course, you like making headlines.
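As a rough sketch of that control, a wire-transfer request can be held until a human confirms it by calling the requester back at a number from a directory maintained out-of-band—never a number supplied in the request itself. All names, numbers, and function signatures below are hypothetical illustrations, not any firm's actual process.

```python
# Minimal sketch of a callback-verification gate for wire transfers.
# Assumption: the directory of known-good numbers is maintained
# out-of-band, never taken from the incoming request.

KNOWN_GOOD_NUMBERS = {
    # hypothetical entry for the parent company's chief executive
    "ceo@parent-company.example": "+49-89-555-0100",
}

def approve_transfer(requester_email: str, amount_eur: float,
                     callback_confirmed: bool) -> bool:
    """Release funds only if the requester appears in the directory
    and someone has confirmed the request by calling the known-good
    number on file (not the number the caller provided)."""
    if requester_email not in KNOWN_GOOD_NUMBERS:
        return False  # unknown requester: reject outright
    if not callback_confirmed:
        return False  # no callback yet: keep the transfer on hold
    return True

# A request "from the boss" stays on hold until the callback happens.
print(approve_transfer("ceo@parent-company.example", 220_000, False))
print(approve_transfer("ceo@parent-company.example", 220_000, True))
```

The point of the design is that the verification channel (the callback number) is fixed in advance, so a convincing voice on an inbound call cannot, by itself, release funds.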
Sharon D. Nelson, Esq., President, Sensei Enterprises, Inc.
3975 University Drive, Suite 225 | Fairfax, VA 22030
Email: firstname.lastname@example.org Phone: 703-359-0700
Digital Forensics/Cybersecurity/Information Technology