With Deepfake Video And Audio, Essentially, You Can’t Trust Your Own Eyes And Ears – Expert Insight

By ISBuzz Team, Writer, Information Security Buzz | Nov 20, 2019 03:17 am PST
Danny Thompson, SVP of Market and Product Strategy
November 20, 2019 11:21 am

With deepfake video and audio, essentially, you can’t trust your own eyes and ears.

We expect fraudsters to increasingly use these attacks, especially deepfake audio, to disrupt the supply chain by diverting payments to fraudulent bank accounts. The good news is that advancements in supplier portal technology are helping companies address this issue head-on.

Due to the rise of deepfakes, business email compromise, and other common types of fraud, we recommend that companies no longer accept remit-to bank account change requests from suppliers by mail, email, fax, or phone. And because of insider fraud, they should never accept supplier remit-to bank account change requests from internal sources either. Instead, companies should accept changes only through a secure supplier portal: one that supports multi-factor authentication, logs the requester's IP address, and tracks the supplier's online behavior to identify suspicious activity.
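The portal-side controls described above can be sketched in a few lines. This is a minimal illustration, not any vendor's actual implementation: the class names, statuses, and the "known IPs" heuristic are all assumptions made for the example. A real portal would layer in device fingerprinting, velocity checks, and richer behavioral signals.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ChangeRequest:
    supplier_id: str
    new_account: str       # requested remit-to account (example data only)
    source_ip: str
    mfa_verified: bool     # did the requester pass multi-factor authentication?

@dataclass
class SupplierProfile:
    supplier_id: str
    known_ips: set
    audit_log: list = field(default_factory=list)

def handle_bank_change(request: ChangeRequest, profile: SupplierProfile) -> str:
    """Accept a remit-to change only via the portal: every attempt is
    logged with its IP, MFA is mandatory, and unfamiliar IPs are routed
    to manual review instead of being applied automatically."""
    # Log the attempt regardless of outcome, for later forensics.
    profile.audit_log.append({
        "ts": datetime.now(timezone.utc).isoformat(),
        "ip": request.source_ip,
        "account": request.new_account,
    })
    if not request.mfa_verified:
        return "rejected"          # no MFA, no change
    if request.source_ip not in profile.known_ips:
        return "pending_review"    # new IP for this supplier: suspicious
    return "accepted"
```

The key design choice is that the portal fails toward human review rather than silently accepting a change from an unrecognized source.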

The rise of deepfake audio reinforces our long-standing recommendation about portals, and it requires companies to go a step further.

The traditional best-practice secondary control on bank account change requests—regardless of submission method—has been a confirmation call-back to the supplier contact on record. The rise of deepfake audio means companies can no longer rely on call-backs to verify bank account change requests.

Luckily, a new countermeasure has become available over the last year: bank account ownership validation. This new control uses real-time integration with a consortium of banks to confirm the supplier is the actual bank account holder. Bank account ownership validation provides the ultimate control on fraudulent remit-to bank account changes. This would have stopped a recent deepfake audio attack in which a fraudster impersonated a company’s CEO on the phone, resulting in a transfer to a fraudulent bank account.
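Conceptually, bank account ownership validation reduces to a name-to-account match performed against authoritative bank records. The sketch below stands in a static dictionary for what would in practice be a real-time consortium API; the lookup table, function name, and matching rule are all hypothetical, chosen only to illustrate the control.

```python
# Hypothetical stand-in for a real-time bank-consortium lookup:
# maps account number -> registered account holder name.
CONSORTIUM_RECORDS = {
    "DE00123456": "Acme Industrial GmbH",
}

def validate_account_ownership(account: str, claimed_holder: str) -> bool:
    """Return True only if bank records confirm the claimed supplier
    name is the registered holder of the account. Unknown accounts
    fail closed, so a fraudster's mule account is rejected even if
    the change request itself looked legitimate."""
    holder = CONSORTIUM_RECORDS.get(account)
    if holder is None:
        return False  # account not in bank records: reject
    return holder.strip().lower() == claimed_holder.strip().lower()
```

Under this control, the deepfake CEO call described above fails at the last step: the fraudulent account is not registered to the supplier, so the change is blocked before any payment is diverted.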
