Is Apple Allowed To Scan Its ‘Rotten’ Part?

9 October 2021


At what point does it become acceptable to violate someone’s privacy? For some people, the answer may be never. They will argue that privacy is a human right and point to Article 12 of the Universal Declaration of Human Rights, which states that no one shall be subjected to arbitrary interference with their privacy (United Nations, n.d.). Most people, however, will agree that it is acceptable to breach someone’s privacy if they pose a big enough threat to society. This would explain why there is no public outcry every time the police listen in on the phone calls of suspected criminals or terrorists. But would it still be acceptable if everybody’s privacy were violated to increase safety and stop criminals?

For Apple, the most valuable company in the world, the answer to this question is yes. Their reasoning: to make their products safer for children. In August, the company revealed plans to scan photos uploaded to everybody’s iCloud storage for known child sexual abuse material (CSAM) (Whittaker, 2021).
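
At its core, the proposal is a fingerprint comparison rather than a content inspection: a fingerprint (hash) of each photo is checked against a database of hashes of already-known abuse imagery. The sketch below illustrates that idea in Python. It is a deliberately simplified illustration under my own assumptions, not Apple’s implementation: Apple’s actual design relies on a perceptual hash called NeuralHash and an on-device private set intersection protocol, whereas this sketch uses a plain SHA-256 digest, a made-up hash list, and hypothetical names such as flag_uploads and KNOWN_BAD_HASHES.

```python
# Simplified sketch of hash-list matching. NOT Apple's system: Apple's proposal
# used a perceptual hash ("NeuralHash") and an on-device private set
# intersection protocol. Here a plain SHA-256 digest of the file bytes stands
# in, purely to show the "compare fingerprints, not content" idea.
import hashlib
from pathlib import Path

# Hypothetical database of fingerprints of known abuse imagery, supplied by a
# child-safety organization. In reality this would be a large, vetted set.
KNOWN_BAD_HASHES = {
    "placeholder-hash-value",  # illustrative entry, not a real hash
}


def fingerprint(path: Path) -> str:
    """Return a hex digest of the file's bytes (stand-in for a perceptual hash)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def flag_uploads(photo_dir: Path) -> list[Path]:
    """Return the photos whose fingerprints appear in the known-bad set.

    Only the digests are compared; the image content itself is never inspected.
    """
    return [
        photo
        for photo in photo_dir.glob("*.jpg")
        if fingerprint(photo) in KNOWN_BAD_HASHES
    ]


if __name__ == "__main__":
    matches = flag_uploads(Path("./icloud_uploads"))
    print(f"{len(matches)} file(s) matched the known-hash list")
```

The important caveat is that a cryptographic digest like SHA-256 only matches exact byte-for-byte copies, while a perceptual hash such as NeuralHash is designed to survive resizing and re-encoding; that flexibility is precisely where the false-positive concerns discussed below come from.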

This decision led to a lot of backlash against Apple. Everyone agrees that this is a serious topic and that companies should make their products and services safe for children. But some experts see this move as a Pandora’s box and fear that governments will one day use the technology as a surveillance tool. Others worry about false positives, where innocent people could be flagged as predators (Barrett & Hay Newman, 2021).

In September, Apple announced that the project was being paused. The company will take the coming months to improve the system but is not abandoning it (Barrett & Hay Newman, 2021). I would love to hear what you think about this topic in the comments! Should Apple be able to scan everybody’s files to detect CSAM, or is this violation of privacy too big? And is the fear of this being used as a surveillance tool realistic, given Apple’s track record of not cooperating with governments?

References

Barrett, B., & Hay Newman, L. (2021, September 3). Apple Backs Down on Its Controversial Photo-Scanning Plans. Wired. https://www.wired.com/story/apple-icloud-photo-scan-csam-pause-backlash/

United Nations. (n.d.). Universal Declaration of Human Rights. Retrieved October 9, 2021, from https://www.un.org/en/about-us/universal-declaration-of-human-rights

Whittaker, Z. (2021, August 5). Apple confirms it will begin scanning iCloud Photos for child abuse images. TechCrunch. https://techcrunch.com/2021/08/05/apple-icloud-photos-scanning/


2 thoughts on “Is Apple Allowed To Scan Its ‘Rotten’ Part?”

  1. Thanks for your interesting post. I’ll share my take on this topic.

    I think Apple should be able to scan everybody’s files on iCloud because of the way they are planning to do so. If Apple goes through with this plan, they will not actually look at the content of the files that are in the cloud. Child-safety organizations supply a list of digital fingerprints (hashes) of files containing known sexual abuse material, and Apple compares that list against the fingerprints of its users’ files in the cloud. Apple is therefore only looking at fingerprints, and not at the content of the files themselves.

  2. It is of course always interesting to watch the debate on privacy versus public safety. In this case as well, I think that if you asked people whether they would like to prevent child abuse and catch child predators, everyone’s answer would be yes, yet if you suggested actual methods like the ones Apple is proposing, many people would be hesitant. Personally, I’m all for Apple’s idea to scan for CSAM, and I think that if you use Apple products you should accept their privacy standards as well.
