Can justice be deleted by algorithms?

9 October 2020


"Put up your hands," shouts a gunman at ten captives kneeling in front of him. He hesitates for a moment, then fires. Fighters in war zones often post such atrocities online to showcase their deeds and spread fear around the globe. In 2017, the International Criminal Court issued its first arrest warrant relying largely on such footage, identifying Mahmoud al-Werfalli, a Libyan commander, as the gunman in one of these videos. Prosecutors long avoided footage posted on social media because of the possibility of deepfakes and other manipulation. Over the last decade, however, they have come to rely increasingly on video evidence to identify war criminals and terrorists.

Social media platforms have worked strenuously to protect users from such horrific content. But the scale is daunting: more than 500 hours of video are uploaded to YouTube every minute. To cope, platforms deploy algorithms that immediately flag content that appears to break the rules. At first, these algorithms mainly flagged content for human moderators to assess; now Facebook says that more than 98% of content violating its rules on extremism is flagged and taken down automatically, before any member of the public sees it.

Both Facebook and YouTube are notoriously opaque about how they train the algorithms responsible for deleting content. Algorithms are also known to be unreliable at construing the context of a video, so they often remove content that has nothing to do with war crimes. As a result, there is little oversight of what is flagged, what is deleted, and what must be preserved to bring perpetrators to justice. Human rights groups in the Middle East now estimate that around 21% of the nearly 1.75m YouTube videos they had archived as evidence are no longer available.

A growing number of institutions now argue for regulation of what happens to removed content: deleted material, they say, should be stored and passed on to third-party digital archives. But the real question remains: in order to bring such people to justice, must social media platforms hand over all their deleted content to third-party archives? Or would that be a breach of their users' privacy?

References:

https://www.economist.com/middle-east-and-africa/2020/09/26/social-media-platforms-are-destroying-evidence-of-war-crimes

https://www.economist.com/international/2020/09/21/social-media-platforms-are-destroying-evidence-of-war-crimes

https://time.com/5798001/facebook-youtube-algorithms-extremism/

Transparency_MacCarthy_Feb_2020.pdf
