Tristan Harris has a message for you

19 September 2019


Ever found yourself falling down a deep YouTube rabbit hole? Ever compared yourself to influencers on Instagram and Pinterest? Ever been overwhelmed on Twitter by election campaigns? Then Tristan Harris has a story to tell you.

 

Tristan Harris, a former Google employee who has been called ‘the conscience of Silicon Valley’ and the man behind the ‘Time Well Spent’ movement (Harris, 2019), has co-founded a new non-profit organisation called the Center for Humane Technology (CHT). You might ask: ‘why is this important to me?’ Let me tell you.

 

Harris worked as a Design Ethicist at Google, where he realised how much power Big Tech companies hold, since their business models are built to capture human attention (Johnson, 2019). More alarmingly, he recognised that these companies have the power to shape the minds of millions of people, yet, according to Harris, they are not taking enough moral responsibility for this. Harris explains that technology manipulates our instincts through what he calls “the race to the bottom of the brain stem”, and that the biggest problem driving this is the “attention economy” (Thompson, 2019; Rouse and Wigmore, 2019b).

 

See it this way: an abundance of information has created a scarcity of attention, and any resource that is scarce is worth money (Newton, 2019). Companies like Google, Facebook, Twitter and YouTube are therefore built to compete in a commercial race for people’s attention (Johnson, 2019). Take YouTube’s business model as an example: the longer it keeps you watching videos, the more views you generate. More views mean more ads seen, so the longer your attention is captured, the more advertisers are willing to pay to run their ads. This is how most Big Tech companies make their money (Johnson, 2019).

 

Not only is our attention worth money; it is also what steers politics, builds relationships, decides elections and creates culture. What Harris is trying to make clear is that if Big Tech companies are directing what we pay attention to, don’t they then, in effect, dictate our culture? This is one of the big issues Harris is trying to bring to light: “Tech addiction, polarization, outrage-ification of culture, the rise in vanities and micro-celebrity culture are all, in fact, symptoms of a larger disease: the race to capture human attention by giants” (Johnson, 2019).

 

These symptoms all contribute to a phenomenon called ‘human downgrading’, a term coined by Harris and his colleagues at the CHT (Rouse and Wigmore, 2019a). Human downgrading refers to the combined negative effects of digital technology on people and society (Thompson, 2019). Harris explains that while our data was being used to upgrade machines, it has downgraded people’s civility, decency, democracy, mental health, relationships, attention and more. So even though Big Tech is working hard to make technology smarter, it is indirectly making all of us dumber, meaner and more alienated (Johnson, 2019). Harris has described human downgrading as the climate change of culture (Center for Humane Technology, 2019). Like climate change, it can be catastrophic; the difference is that only a few companies need to change to alter its trajectory, namely the companies creating the technologies that cause these issues: the artificial social environments and the overpowering AIs and algorithms that sense and exploit our vulnerabilities (Johnson, 2019).

 

So how does Harris plan to solve this ‘human downgrading’? Back in May, he discussed this on an episode of Vox’s podcast Recode Decode (Johnson, 2019). The short answer: design and regulation. However, it is more nuanced than that. Harris starts by explaining that the answer is not as simple as just turning technology off. People spend almost a fourth of their lives in artificial social systems, and these digital environments have become an important daily habitat for almost 2 billion people worldwide (Johnson, 2019; How a handful of tech companies control billions of minds every day | Tristan Harris, 2017). Even those who don’t use social media platforms have to deal with the consequences; think of the 2016 US elections. So Harris poses the question: if people spend this much time in these digital social environments, shouldn’t they be regulated?

 

A big problem is that the data needed to assess the impact and effects of human downgrading is guarded by companies like Facebook, since they own that data (Johnson, 2019). Harris therefore calls on Big Tech companies, especially Google and Apple, which he sees as “the central banks of the attention economy”, to change their ways (Thompson, 2019). He wants them to start a race to the top that focuses on changing tech design to “help people focus, find common ground, promote healthy childhoods, and bolster our democracy” (Newton, 2019). This is why he co-founded the Center for Humane Technology: to create a common language, infuse that vocabulary into the minds of Silicon Valley, and start the conversation from a shared understanding (Center for Humane Technology, 2019). The organisation has promised to provide a guide for companies on how to adopt more humane design. It has also started a podcast, Your Undivided Attention, as a platform to speak about these topics (Apple Podcasts, 2019). Lastly, the CHT will hold a conference in 2020 to bring the right minds together to figure out how to design social systems that encourage healthy dialogue and civility and bring out the best in human nature. As Aza Raskin, the other co-founder of the CHT, put it: “We need to move away from just human-centered design to human-protection design” (Thompson, 2019).

 

As with the last wave of digital wellness awareness, it is difficult to predict whether Harris’ new Team Humanity movement will catch on. Even though digital wellness is becoming more of a trend, and initiatives like Apple’s Screen Time and Google’s Digital Wellbeing are a step in the right direction (Pardes, 2018), we are far from where we need to be.

 

Do you agree with Harris and think Big Tech needs to take responsibility for human downgrading? Or do you think he is underestimating people’s ability to control their own technology use? Will you join Team Humanity?

 

Leave your thoughts and comments below!

 

 

Bibliography

Apple Podcasts. (2019). Your Undivided Attention on Apple Podcasts. [online] Available at: https://podcasts.apple.com/us/podcast/your-undivided-attention/id1460030305 [Accessed 18 Sep. 2019].

 

Center for Humane Technology. (2019). Center for Humane Technology: Realigning Technology with Humanity. [online] Available at: https://humanetech.com/ [Accessed 18 Sep. 2019].

 

Harris, T. (2019). Tristan Harris. [online] Tristanharris.com. Available at: https://www.tristanharris.com/ [Accessed 18 Sep. 2019].

 

How a handful of tech companies control billions of minds every day | Tristan Harris. (2017). [video] YouTube: TED.

 

Johnson, E. (2019). Tristan Harris says tech is “downgrading” humanity — but we can fix it. [online] Vox. Available at: https://www.vox.com/recode/2019/5/6/18530860/tristan-harris-human-downgrading-time-well-spent-kara-swisher-recode-decode-podcast-interview [Accessed 18 Sep. 2019].

 

Newton, C. (2019). The leader of the Time Well Spent movement has a new crusade. [online] The Verge. Available at: https://www.theverge.com/interface/2019/4/24/18513450/tristan-harris-downgrading-center-humane-tech [Accessed 18 Sep. 2019].

 

Pardes, A. (2018). Quality Time, Brought to You by Big Tech. [online] Wired. Available at: https://www.wired.com/story/how-big-tech-co-opted-time-well-spent/ [Accessed 18 Sep. 2019].

 

Rouse, M. and Wigmore, I. (2019a). What is human downgrading? [online] WhatIs.com. Available at: https://whatis.techtarget.com/definition/human-downgrading [Accessed 18 Sep. 2019].

 

Rouse, M. and Wigmore, I. (2019b). What is an attention economy? [online] WhatIs.com. Available at: https://whatis.techtarget.com/definition/attention-economy [Accessed 18 Sep. 2019].

 

Thompson, N. (2019). Tristan Harris: Tech Is ‘Downgrading Humans.’ It’s Time to Fight Back. [online] Wired. Available at: https://www.wired.com/story/tristan-harris-tech-is-downgrading-humans-time-to-fight-back/ [Accessed 18 Sep. 2019].


1 thought on “Tristan Harris has a message for you”

  1. Interesting post! As Harris has pointed out, this seems like a pretty complicated issue to tackle; it’s not as easy as just not using technology.

    While I believe regulations could be a good start in saying what companies can or cannot do, I think that only addresses the edge of the problem. For firms like Apple, Google or Facebook to become more ethical, much more has to change. Twitter and Facebook already have AI to find and delete fake accounts, “troll” accounts, or bots, but these are not always effective or accurate. Accounts like these fuel hatred among people who cannot identify them as “fake”. Even worse are the real accounts that fuel hatred to push their agenda. Nowadays, especially with the US elections coming up relatively soon (and all the campaigns being run), you see a big problem in the United States population which, due to social media, is slowly spreading to other cultures: partisanship. Since Trump became president of the US, American politics has been filled with hatred, from right-wing people who despise the left and left-wing people who despise the right. How can big tech companies prevent this? If they shut down the dialogue, they will be accused of restricting freedom of speech, yet it is often these threads under big political figures (as an example) that lead people away from finding common ground. With children growing up in these environments, it might become worse in the future. These companies definitely have a moral obligation to do something, but it is a very difficult problem to address, one that I believe stems more from what users want than from what the companies provide. And although the companies should change how they provide what consumers want (and I’m sure this is the larger part of the discussion), they may simply not want to lose their power by limiting or withdrawing their services. After all, people like to use Twitter because they can share their opinions (or others’ opinions, posts, etc.) for others to see. If this feature is limited, why would people use Twitter, and what would prevent competitors from doing the same?

    I think it’s an interesting debate, and as massive corporations, these companies can do so much more. But there are many sides to this argument that need to be taken into account. I believe the discussion about splitting up these massive corporations is a good start (although I do not necessarily agree with the approach), but there is still much to be agreed on.
