Deepfake Videos Could ‘Spark’ Violent Social Unrest – Comment

By ISBuzz Team
Writer, Information Security Buzz | Jun 17, 2019 08:30 am PST

Yesterday, the Foreign Policy Research Institute stated that deepfake videos could spark violent outbreaks and social unrest. Commenting on this, Kelvin Murray, Senior Threat Researcher at Webroot, argues that deepfakes offer little legitimate value today, but will grow more convincing as more advanced technology becomes available.

Kelvin Murray, Senior Threat Researcher at Webroot: 

“Deepfakes create a number of very real concerns for enterprises and individuals. For example, in the cybersecurity realm, we know that this technology is now being used to create high-fidelity phishing attacks where the phishing target (financial institution, healthcare provider, auction site, email provider) is indistinguishable from the real entity. You can also imagine scenarios where a competitor creates a deepfake video of another company’s CEO making false statements, or user testimonials reporting problems with the company’s products. This takes bot product reviews to the next level.

“I’m not sure most enterprises see deepfakes as providing potential value. Maybe the entertainment industry, or advertisers, or virtual reality companies who want to create realistic content. But the very idea of deepfakes implies that most consumers will be misled by the content, and not be able to distinguish the simulation from reality.   

“Graphics Processing Units (GPUs) will continue to evolve, putting more and more image compute horsepower in the hands of bad actors. Perhaps the scariest scenario is the use of AI as a component of the production of deepfakes, where it could automatically edit out the artefacts and “glitches” that can currently be used to differentiate between real and fake. Ultimately, this casts suspicion on all digital content and damages trust.”
