FakeNews Gets Dirty

March 11, 2018

While 2018 has turned out to be a year of women's empowerment, with a growing number of female candidates entering politics around the world, it is also the year Fake News became a permanent part of the social and political reality. The malicious nature of political contests doesn't change, but the tools get better, and DeepFake is a fast-maturing weapon that creates malicious falsehoods out of real-life material.

DeepFake is software that swaps everyday photos of a person's face onto pornographic videos. The result is convincing and lifelike, and hard to disprove, because the software makes multiple attempts to create the most lifelike video, with smooth transitions and good face mapping. Most fake porn videos will pass a careful examination, and it may take forensic analysis down to the pixel level to document the fraud (Newsweek).

While DeepFake is new, it is not hard to imagine the impact of a DeepFake campaign. Picture an email claiming that someone has unearthed a sex video of a female candidate hosted on a porn website. The video states that it was anonymously uploaded to multiple websites hosted in countries outside the electoral jurisdiction. While the uploaded content does not mention the candidate by name, the candidate's face is clearly recognizable. It would be short-sighted to think this attack affects only female candidates: a male candidate could be attacked with a DeepFake video involving another man, a child, or simply another woman, to attack the integrity of a married candidate or even stir up racial agitation. (In highly religious areas, a DeepFake would be a direct incitement to assault and violence, possibly death, against the candidate.) And of course, it is in human nature to forward and share these novelties, especially in a hotly contested campaign.

What is operationally dangerous about DeepFake is that it requires no expensive software or experienced video editors to create; it can be done through an app. To create a fake porn video of a candidate, all that is needed is a standard porn video (downloadable from any porn website) and a small quantity of video and photos of the candidate. Ironically, a campaign generates a large number of photos and videos as part of its normal operations, in addition to the volume of material an incumbent generates in daily communication with their constituency and in advocating issues. Because of this low barrier to entry, DeepFake can be deployed by a campaign of any budget, and is more likely to be used by one that is low on funding or low on ideas.

Legislation designed to protect privacy and to deal with revenge porn is of little use, since DeepFake material is not authentic; the most relevant laws are those covering defamation. While some legal protections exist, identifying the creator of a DeepFake is practically impossible without serious cyber-forensic resources at one's disposal, and prosecuting the offending party in a relevant venue is harder still. Even in a perfect case, where a DeepFake is deployed to help a campaign win an election and the offending party is later convicted, the offender may suffer no more than a fine without having to vacate the elected public office.

Just as campaigns evolve to deal with new tactics and surprises as technology changes, an election in the era of Fake News and DeepFake requires preventative planning and solid reputation monitoring. While no plan survives first contact with the enemy, not having a plan ensures it is the enemy who survives.

AutoPolitic manages election campaigns with Artificial Intelligence and Social Analytics across South East Asia. We also provide pre-election strategy assessments and staff training.
