Apple’s next move: one step too far?

29 September 2021


The global pandemic has affected the whole world for the past two years, causing significant harm to economies, welfare and the general wellbeing of people. In countries such as the United Kingdom and the United States, the percentage of people who experience symptoms of depression and anxiety has reached new heights, especially in the United States, where 42% of surveyed adults stated that they feel some form of anxiety or depression (Abbott, 2021). The rise in mental health issues will become the focal point of already exhausted healthcare industries across the world.

Apple sees an opportunity to indirectly aid the industry: according to the Wall Street Journal (2021), the company is working on iPhone features capable of detecting mental-health concerns. In collaboration with UCLA and Biogen, Apple is researching whether it can expand its health-detection capabilities from the physical aspect already implemented in its smartwatches to the mental aspect. Sensors collect data on mobility, physical activity, sleep patterns and typing behaviour to create an algorithm that should be capable of reliably capturing digital signals that could indicate mental health issues.

For the research, data from the iPhone's video camera, keyboard and audio sensors will be tracked. The data will include analysis of facial and vocal expressions, typing speed, the frequency of typos and the type of content typed (Winkler, 2021). It is uncertain whether, and in what form, this feature will be implemented, but it is certain that a large part of privacy would be sacrificed. Although previous research has illustrated that people with certain mental health issues use their phones differently, it remains uncertain whether reliable algorithms can be created.

If Apple is successful, it could affect millions of people worldwide, which in turn raises privacy issues. Earlier this year, it was reported that Apple will warn authorities if it detects any form of child pornography on iPhones (McMillan, 2021). Although this time for widely accepted reasons, it does illustrate that if the company deems it necessary to share certain information with third parties, it will.

Therefore, the question is whether we as customers should accept this innovation. On the one hand, it can have a positive influence, as it can potentially help millions of people detect and perhaps prevent mental health issues. On the other hand, if we accept it, we allow a device that already knows so much about us to gain even more power. Should we be willing to share the most sensitive type of information with a company whose core objective is to be profitable?

References

Abbott, A., 2021. COVID’s mental-health toll: how scientists are tracking a surge in depression. [Online]
Available at: https://www.nature.com/articles/d41586-021-00175-z
[Accessed 29 September 2021].

McMillan, R., 2021. Apple Plans to Have iPhones Detect Child Pornography, Fueling Privacy Debate. [Online]
Available at: https://www.wsj.com/articles/apple-plans-to-have-iphones-detect-child-pornography-fueling-privacy-debate-11628190971
[Accessed 29 September 2021].

Winkler, R., 2021. Apple Is Working on iPhone Features to Help Detect Depression, Cognitive Decline. [Online]
Available at: https://www.wsj.com/articles/apple-wants-iphones-to-help-detect-depression-cognitive-decline-sources-say-11632216601
[Accessed 29 September 2021].


3 thoughts on “Apple’s next move: one step too far?”

  1. Hi Anne,

    I think this is a very interesting subject.
    What I’m personally worried about, besides the privacy matter, is what would happen once mental health issues are established. These people might fall victim to firms that take advantage of their situation.
    Besides, what if a person is mistakenly labeled as someone with a mental health issue?
    This person might receive so much information about this mental health issue that a placebo effect could occur and actual mental health issues would start to develop.

  2. Hi Anne,

    Very interesting read! As a fan of their products and services, I have to say that this is one step too far for me. As you rightly pointed out, Apple’s core objective is to be profitable. Therefore I do not think we can trust them to use this sensitive data to our benefit at all times.

    If they could profit from this data without facing consequences (e.g. us knowing), I have no doubt that they would. In my opinion, mental health issues certainly need to be identified and addressed, but surely there must be another way than sacrificing our privacy.

  3. Very interesting read! I share your opinion on this. Apple has put a lot of effort into building a brand around privacy. While the objective to learn more about mental illnesses is quite noble, the data that has to be gathered is extremely personal. In every case, the user should be in full control of the data that he/she is sharing. That is what Apple did before with their Fitness studies as well. Apple’s ambition to scan the iCloud photo library goes in a similar direction. It seems like Apple’s brand and its connection to privacy is being challenged by the recent developments.
