The world of deep fakes
There is an ongoing discussion about the risks and dangers of software in our daily lives. Especially since the release of the documentary ‘The Social Dilemma’ on Netflix, this discussion has become a hot topic again. The film shows the dark side of the AI programs behind social media platforms, search engines, news websites, and so on.
One specific and potentially dangerous AI-based innovation is deep fakes. Deep fakes are essentially videos in which one person impersonates someone else using software that changes that person’s appearance and voice. Deep fakes have been around for a while, and it is getting increasingly hard to spot one. In 2018, Jordan Peele created a fake video of President Obama to demonstrate how easy it is to put words in someone else’s mouth. Although not everyone bought it, people were confused by the video, to say the least. The technology behind deep fakes is rapidly improving, even as worries increase about its potential to do harm.
Deep fakes, or realistic-looking fake videos and audio, first gained popularity as a means of inserting famous actresses into porn scenes. They are named for the deep-learning AI algorithms that make these videos possible. The process works by feeding real audio or video of a specific person into the software; the more material, the better. The software learns to recognize patterns in that person’s speech and movement, and when a new element such as someone else’s face or voice is introduced, a deep fake is born.
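For readers who want a rough feel for how this works under the hood, below is a minimal, purely illustrative sketch of the ‘shared encoder, two decoders’ idea behind early face-swap tools. The network sizes, the random stand-in data, and the tiny training loop are assumptions made for illustration only, not a real deep fake pipeline.

```python
# Conceptual sketch of the classic face-swap setup: one shared encoder, one
# decoder per person. Shapes and the stand-in data are placeholders.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    # Compresses a 64x64 RGB face crop into a latent vector that captures
    # pose, expression, and lighting rather than identity-specific detail.
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),    # 64 -> 32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),   # 32 -> 16
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # 16 -> 8
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    # Reconstructs a face from the latent vector; one decoder is trained per person.
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 128 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),   # 8 -> 16
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),    # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),  # 32 -> 64
        )

    def forward(self, z):
        h = self.fc(z).view(-1, 128, 8, 8)
        return self.net(h)

encoder = Encoder()
decoder_a = Decoder()  # trained only on faces of person A
decoder_b = Decoder()  # trained only on faces of person B

# Training (sketched): both decoders share the encoder, so the latent space
# learns person-independent structure that either decoder can render.
loss_fn = nn.MSELoss()
optimizer = torch.optim.Adam(
    list(encoder.parameters()) + list(decoder_a.parameters()) + list(decoder_b.parameters()),
    lr=1e-4,
)
faces_a = torch.rand(8, 3, 64, 64)  # stand-in for real face crops of person A
faces_b = torch.rand(8, 3, 64, 64)  # stand-in for real face crops of person B
for _ in range(10):  # a real run trains for many epochs on thousands of frames
    loss = (loss_fn(decoder_a(encoder(faces_a)), faces_a)
            + loss_fn(decoder_b(encoder(faces_b)), faces_b))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# The "swap": encode a frame of person A, but decode it with B's decoder,
# producing B's face with A's pose and expression.
with torch.no_grad():
    fake_b = decoder_b(encoder(faces_a))
```

The key trick is the last step: a frame of one person is encoded but decoded with the other person’s decoder, so the second face comes out wearing the first person’s pose and expression.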
Jeremy Kahn, a tech reporter for Bloomberg, says that nowadays it is extremely easy to make one of these videos. Breakthroughs from academic researchers working with this particular kind of machine learning have drastically reduced the amount of input the software needs. However, with increasing capability comes increasing concern. Kahn even calls deep fake videos ‘fake news on steroids’. In a world where fakes are easy to create, authenticity also becomes easier to deny: people caught doing genuinely objectionable things can claim the evidence against them is fake.
Deep fakes do, however, have some positive potential. Take CereProc, a firm that creates digital voices for people who lose theirs to disease. Or take the harmless deep fake program that turns as many movies as possible into ‘Nicolas Cage movies’. To conclude, deep fakes are becoming more and more popular on the internet. So don’t believe everything you see online!
- Bloomberg: It’s getting harder to spot deep fakes https://www.youtube.com/watch?v=gLoI9hAX9dw
- Jordan Peele: You won’t believe what Obama says in this video https://www.youtube.com/watch?v=cQ54GDm1eL0
- CereProc website: https://www.cereproc.com
- Forbes: Deepfakes are going to wreak havoc on society and we are not prepared https://www.forbes.com/sites/robtoews/2020/05/25/deepfakes-are-going-to-wreak-havoc-on-society-we-are-not-prepared/#11b027937494
I’d totally agree with you! While there are practical uses for deep fakes, and I’d say they can be very entertaining, the ultimate propaganda tool is there as well. Even your human senses cannot tell what’s real from what’s fake. And how do we protect ourselves against these deep fake tools? At this moment our digital platforms aren’t even capable of (or willing to) efficiently filtering the fakes from the real. Concerning! And very interesting to see how humanity will deal with these modern propaganda tools. Thanks for sharing Tristan!
A very interesting topic! It is great that both advantages and disadvantages are included, as this topic normally appears only with very negative connotations in news stories, painting very dystopian images of the future. However, the technology in itself is not evil; it can create tremendous opportunities, as seen above. On the other hand, one very negative, more macro-level point is the effect this can have on trust in society and in the internet as a whole. As deep fakes become more widespread, they have the potential to erode trust in nearly all material that is shared online.