Comments On Deepfake Security Issues

Cybercriminals are now perfecting deepfakes to impersonate people and steal money or anything else of value. The technology has advanced to the point where it is difficult to tell the difference between a fraud and a friend.

https://twitter.com/gabrielwilder/status/1202500978785705984

1 Expert Comment
Robert Capps, VP
InfoSec Expert
December 7, 2019 5:49 am

While an attacker can use deepfake techniques to convincingly emulate the likeness of an individual, it is still difficult to digitally impersonate someone's voice without fairly obvious imperfections. Deepfake audio or video cannot currently be rendered in real time without an attacker having a large volume of computing resources and a lot of high-quality audio and video source material to train the machine learning algorithms. While deepfakes can be convincing to other humans, they are unable to pass physical or passive biometric verification, so coupling strong liveness detection with the collection of passive and physical biometric signals to verify a user's identity largely mitigates the current risks presented in banking transactions.
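
The comment's point about layering liveness detection with passive and physical biometric signals can be illustrated with a minimal sketch. The sketch below is not from the source and does not describe any specific vendor's system; the signal names, weights, and thresholds are hypothetical and only show the idea of rejecting a presentation that fails liveness even when the physical match looks convincing.

```python
# Hypothetical sketch: combine liveness detection with passive and physical
# biometric signals into one verification decision. All field names, scores,
# and thresholds are illustrative assumptions, not a real product's API.
from dataclasses import dataclass


@dataclass
class BiometricSignals:
    liveness_score: float        # 0-1, output of a liveness/anti-spoofing check
    physical_match_score: float  # 0-1, e.g. face or voice template match
    passive_match_score: float   # 0-1, e.g. typing cadence, device handling


def verify_identity(signals: BiometricSignals,
                    liveness_floor: float = 0.9,
                    match_threshold: float = 0.8) -> bool:
    """Reject outright if liveness fails; otherwise require both the
    physical and passive biometric signals to clear a match threshold."""
    if signals.liveness_score < liveness_floor:
        # Likely a replayed or synthesized (deepfake) presentation.
        return False
    return (signals.physical_match_score >= match_threshold
            and signals.passive_match_score >= match_threshold)


# A convincing deepfake video may score well on the physical match yet fail
# the liveness and passive checks, so verification is denied.
print(verify_identity(BiometricSignals(0.4, 0.95, 0.2)))    # False
print(verify_identity(BiometricSignals(0.97, 0.92, 0.88)))  # True
```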
