Deepfake technology has made headlines over the past several years due to its stunning ability to fool its intended audience with doctored photos, video and audio. The software, which can be incredibly convincing, is becoming commonplace in Hollywood movies. But recent advancements in the technology pose a substantial threat to both businesses and individuals.
In the hands of fraudsters, deepfake technology can create the illusion of a legitimate transaction. You might think you are hearing from the CEO, the CFO or the attorney handling a merger, requesting what appears to be a legitimate payment. And by the time a company realizes it has been duped, it’s often too late.
Early last year, deepfake voice technology was used in a $35 million bank heist in Hong Kong. A bank manager received a call and several emails from what appeared to be a company director he had spoken with before. The director claimed that his company was making an acquisition soon and needed a $35 million transfer to complete the process. The bank manager, recognizing the man’s voice and believing everything to be legitimate, complied and sent the money.
Of course, the person who called the bank manager and sent the emails was not who they said they were, and the money was stolen. The theft has implications for companies of all sizes, as it represents the latest step on the evolutionary scale of a familiar scam that has duped well-meaning financial professionals into transferring millions into the wrong hands.
Video and Audio Deepfakes
Deepfake technology uses artificial intelligence to combine still images of one person with video footage of another. While swapping out faces in photos has been a common practice for years through Photoshop, creating deepfake videos has been a more recent development. Over time, the technology has improved to the point where very few photos—and in some cases, just one—are needed to create a convincing video deepfake.
“All the bad actors need is a couple of images to load and train an algorithm to create the deepfake face output,” noted Beatriz Saldivar, Global Payment and Treasury Advisor for Kyriba. “There are also easily obtainable programs; all you need is a smartphone or laptop. There are even bad actors for hire, who for the right price, will create a deepfake of anyone.”
Deepfake audio, or “deep voice” technology, is also a recent development. Much like with video, the software may need as little as five seconds of audio to successfully copy a person’s voice.
Deepfake audio is particularly concerning due to the fast-paced, timebound environment that treasury and finance work in. If processes are not aligned with technology that can protect organizations from fraud, they can easily make the mistake of sending a payment based on deepfake audio instructions.
In both the $35 million bank heist and the 2019 theft of $243,000 from a UK energy company, fraudsters used deepfake audio technology to clone the voices of company executives who requested money transfers. Although the technology is new, these incidents are just the latest variation of the most common and well-known type of business email compromise (BEC) scam—the classic CEO fraud.
Both frauds were almost textbook in their execution, with the telltale signs of a classic BEC scam—urgent requests from a senior company officer to transfer money. Had either of these requests been made solely via email, they likely would have been flagged.
But because the mark in each of these scams believed they were talking to a contact they knew, both of them fell for it. And that is why this new variation of BEC is so dangerous.
After multiple warnings from the FBI’s Internet Crime Complaint Center (IC3) and prominent media coverage of these types of scams, treasury and finance departments became very familiar with the telltale signs of BEC scams. Yet despite this awareness, BEC scams have persisted; the 2021 AFP Payments Fraud and Control Survey found BEC to be the primary source of fraud last year, according to 62 percent of respondents. “While treasury and finance leaders are very aware of how widespread this type of fraud has become, they are not able to obstruct it sufficiently,” AFP explained in the report.
The one thing any expert will tell you when you receive a questionable request for a money transfer over email is that you should call your contact and verify that the request is valid. But if the scam begins with your contact reaching out to you over the phone, and it sounds like them on the other line, you’re much more likely to fall for it.
Fraud always evolves. And this latest step in BEC’s evolution may be a game changer. These two incidents, particularly the $35 million heist, prove that deepfake technology can be used successfully, and result in a massive payout for criminals.
Brad Deflin, CEO and Founder of Total Digital Security, believes this will only incentivize more criminals to try their hand at this new technique. “With this enormous payday for the perpetrators and the technology becoming more pervasive and affordable, I would have to expect that it will increase,” he said. “And it’s an element of social engineering—whatever it takes to create some level of credibility and trust. It used to be simply an email that looked good and spoke right. That would do it. Now maybe the awareness is such that that’s not the path of least resistance for criminals. And so, these things will evolve and use the advances in technology to add sophistication to their ability to socially engineer and exploit.”
But the greatest threat may be yet to come. While phone calls using deepfake audio are convincing, wouldn’t a video call over Zoom be even more so? Deepfake video may not be at the point yet where a person can realistically impersonate someone over a video call, but it stands to reason that it’s coming. Current video call technology allows semi-realistic filters and backgrounds to be applied… are we truly that far away from being able to do the same thing with a real person’s face?
Moreover, that might not even be necessary. Many people don’t use video on Zoom calls anyway, particularly in the post-COVID world. If someone hopped on a Zoom call with you and they had a convincing email address, their LinkedIn headshot on display during the Zoom call, and a familiar voice, would you question it? The $35 million heist was just the tip of the iceberg, and it’s almost a mathematical certainty that more fraud attempts are underway.
“AI is becoming mainstream, and cybercriminals are notoriously early adopters of next-gen technology,” Deflin said. “The BEC exploits of the future will look nothing like those of the past. AI will tee up hacks that humans could never conceive themselves, and it’s game over for anyone unprepared.”
Avoiding a Deepfake-out
The following tips can help treasury and finance professionals identify audio and video deepfakes.
- Never trust an incoming call. A tried-and-true method that often thwarts BEC scams has always been to pick up the phone and call your contact (with a number you already have on file for them) to verify that they actually requested money be transferred. In this new threat paradigm, the same rules should still apply. If your contact called you on your phone or invited you to a Zoom call and requested a transfer, it’s still a good idea to call them afterward on a number you know to be legitimate and clarify.
- Use a verification measure in the discussion. A simple but effective way to confirm that you are speaking with your true contact is to verify their identity during the call. Have the other person answer a series of questions or provide a password that only your contact would know. Much like two-factor authentication on your computer, this practice adds an extra layer of security.
- Use authenticating tools. Microsoft has developed a tool that analyzes photos and videos and provides a confidence score indicating whether the material has been altered. It can detect subtle signs that an image has been artificially produced, such as fading or greyscale pixels at the edges where one person’s face has been merged with another. Unfortunately, since deepfake technology is rapidly advancing, these tools might quickly become outdated.
- Streamline manual processes. Beatriz Saldivar of Kyriba recommends having strong payment controls interlinked with sound technology to prevent fraud. Any manual verification performed outside of an ERP system or TMS creates significant risk exposure and makes the organization an easier target for fraudsters.
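The in-call verification idea above can even be made systematic. As a minimal sketch (not a product recommendation, and with a hypothetical shared secret provisioned in advance over a trusted channel), a short confirmation code can be derived from the payment details themselves, so that only a counterparty who holds the secret can read back a code that matches the exact payee, amount and reference being requested:

```python
import hmac
import hashlib

# Hypothetical shared secret, agreed in person or over a trusted
# channel long before any payment request is made.
SHARED_SECRET = b"provisioned-out-of-band"

def confirmation_code(payee: str, amount_cents: int, reference: str) -> str:
    """Derive a short one-time code that binds the payment details
    to the shared secret."""
    message = f"{payee}|{amount_cents}|{reference}".encode()
    digest = hmac.new(SHARED_SECRET, message, hashlib.sha256).hexdigest()
    return digest[:8]  # short enough to read aloud on a call-back

def verify_request(payee: str, amount_cents: int,
                   reference: str, code_read_back: str) -> bool:
    """Recompute the code locally and compare in constant time.
    A caller without the secret, or any change to the payment
    details, fails the check."""
    expected = confirmation_code(payee, amount_cents, reference)
    return hmac.compare_digest(expected, code_read_back)
```

The point of the sketch is that a cloned voice alone is not enough: the code depends on a secret the fraudster never sees, and on the precise payment details, so altering the payee or the amount invalidates it.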
Ultimately, the best advice financial professionals can follow is to always apply a critical eye. While it isn’t easy to be hyperaware of the threats around us, that is exactly what we need to be in the current environment. “We need to raise awareness and drive critical thinking at the individual level,” Deflin said. “Everything we’re going to see, like deepfake technology, is something we’ve never seen before. So, we need to equip people to see things they’ve never seen before and still have some ability to question it and respond to it rather than just react.”