It is getting easier to make a “deepfake video”. Earlier this month, a Chinese app went viral that lets anyone put their face into a movie scene of their choice. The app requires a Chinese phone number. What might the effects be if this technology were easily accessible to everyone with a smartphone?
The technology behind deepfakes has existed for years, but it gained notoriety when it was first used to create pornographic videos. The use of deepfakes for porn is questionable, since the person depicted has usually not given consent. For that reason Reddit, the forum where it all started, has banned these controversial deepfake videos. Once the technology is easily accessible to everyone, even minors might create virtual porn, which is punishable in most countries.
The Dutch Public Prosecution Service is worried that the fake videos could be used, for example, to persuade people to do something punishable. In most countries, the maker of such a deepfake video can be prosecuted for defamation or slander.
Besides that, a deepfake video could be used for blackmail, which could lead to dangerous situations. Researchers argue that within six to twelve months it will probably be impossible to distinguish a deepfake from a real video (NOS, 2019). Currently, one giveaway is that audio is still difficult to falsify.
The question is whether prohibitions are the right way to prevent the dangers of deepfakes; after all, social media is not banned either. It is not about the technology itself, but about what you do with the deepfakes (NOS, 2019). For example, the technique could be used for educational and psychological purposes: think of virtually talking to a deceased loved one, or teaching a class with virtual historical figures. Regulatory institutions should reconsider the law to make these applications possible.
Sources
Nos.nl (2019). Nederlandse Publieke Omroep: Zorgen OM over deepfakes: ‘Risico op oplichting en afpersing’. [online] Available at: https://nos.nl/artikel/2300688-zorgen-om-over-deepfakes-risico-op-oplichting-en-afpersing.html [Accessed 17 Sep. 2019].
Wow, incredible how something that was probably intended as a cool gimmick turns out to be an easily accessible, dangerous tool. I wonder what happens if this is used during elections, when this technique can be used to create “fake news”. Do you believe Facebook will be able to detect the deepfake videos? Will it be possible to track and prosecute all the people producing these videos? Or will there be a fine, just like in Germany, where you get a fine for downloading movies illegally? Anyway, the dangers of this new technique should not be underestimated.
I agree! The dark sides of this technique could be devastating. I believe it will be possible in the near future to detect deepfake videos, considering improving technologies such as machine learning. Hopefully, Facebook will be able to do so before deepfakes can affect our society and upcoming elections, among other things.
It is shocking to me how far this technique has come and how convincing it is. Indeed, it could be used for many wrong purposes, including fabricating false evidence in court.
Perhaps people could attach some sort of trademark/stamp to ‘real’ videos which cannot be copied, although I don’t know exactly how that would work. Or, by law, all deepfake programs could be required to add a deepfake stamp to their videos so it is clear they are fake. If you lay the responsibility with the technology providers, you can be more sure it gets taken seriously than if the responsibility lies with the users.
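A minimal sketch of how such an uncopyable stamp could work in principle: instead of a visible watermark, the publisher computes a cryptographic authenticity tag over the video bytes, so any later edit invalidates the tag. Everything here is illustrative (the key, the `sign_video`/`verify_video` names, and the use of a shared secret; a real scheme would use an asymmetric signature so only the publisher can sign):

```python
import hashlib
import hmac

# Hypothetical example: a publisher "stamps" a video so tampering is detectable.
# A shared secret key stands in for a real public/private key pair.
SECRET_KEY = b"publisher-secret-key"

def sign_video(video_bytes: bytes) -> str:
    """Compute an authenticity tag over the raw video bytes."""
    return hmac.new(SECRET_KEY, video_bytes, hashlib.sha256).hexdigest()

def verify_video(video_bytes: bytes, tag: str) -> bool:
    """Return True only if the bytes are unchanged since they were signed."""
    return hmac.compare_digest(sign_video(video_bytes), tag)

original = b"...raw video bytes..."
tag = sign_video(original)

print(verify_video(original, tag))         # unchanged video: True
print(verify_video(original + b"!", tag))  # any edit breaks the tag: False
```

The point of the sketch is the responsibility model from the comment above: the stamp proves a specific publisher vouched for this exact file, rather than trying to detect fakery in the pixels themselves.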
@Josje Diepeveen I like the idea of a secure trademark/stamp on videos that have or have not been edited, and there are already detection mechanisms and tools. In a way it is comparable to the rise of Photoshop and movie-editing techniques. Society needs to keep a certain distance from videos and images that are uploaded and published online and not rely on vague sources. Trusted sources will need to implement checks on the (video) material they publish in order to remain reliable.