How Salesforce provides Value through Acquisitions

8 October 2021


Salesforce (SFDC), another Silicon Valley-based venture, has become increasingly relevant in today’s cloud-based world. The firm, led by Marc Benioff, has grown immensely and is listed on the New York Stock Exchange. For Salesforce shareholders, it has proven to be a pretty good investment over the past few months. Their growth strategy, however, can be considered quite aggressive. Where only a couple of years ago they acquired smaller companies like Radian6 and Buddy Media, they eventually acquired ExactTarget (which had itself recently acquired Pardot) to build a holistic Marketing Suite product and compete with players like Marketo and HubSpot. This was only the start of their aggressive acquisition and growth strategy. Since then, they have completed some really major purchases with MuleSoft and Tableau. Even this year, SFDC completed their biggest acquisition ever: Slack, with an astonishing price tag of US$ 27.7 billion.

Image 1: The Salesforce Customer 360 with recently acquired Slack.

The question that remains is: why is Salesforce acquiring so many companies and adding their products to its offering? The answer is shown above. Salesforce has a Customer 360 vision. In short, they aim to provide their customers with all the tools to engage with their own customers in a highly personalised manner and create value across the entire value chain. They really want to be the one-stop shop for all your customer engagements. Whether it is Sales, Service, Marketing, Social Media, Communities, eCommerce, or Communications; Salesforce wants to have it. To provide some academic background: Salesforce is now an Ecosystem Driver according to the Weill & Woerner (2015) model. They provide value to their customers by offering a complete package and extend even beyond that with third-party products (available via the AppExchange) and free learning tools such as Trailhead and the Trailblazer Community.

It can be argued that this acquisition strategy may not be sustainable in the long term. Will customers continue to see the value? Based on the stock price, I would say: YES! What are your thoughts?

References:

  • https://www.cnbc.com/2020/12/01/salesforce-buys-slack-for-27point7-billion-in-cloud-companys-largest-deal.html
  • https://www.salesforce.com/nl/blog/2019/07/alles-wat-je-moet-weten-over-Salesforce-Customer-360.html
  • https://www.salesforceben.com/the-drip/a-brief-history-of-salesforce-marketing-cloud-and-pardot/
  • Weill, P.; Woerner, S.L., 2015. Thriving in an increasingly digital ecosystem. MIT Sloan Management Review. Available at: https://sloanreview.mit.edu/article/thriving-in-an-increasingly-digital-ecosystem/ [Accessed October 6, 2021].


Author: Kwint Jansen

Hi all, my name is Kwint. I recently started the MSc Business Information Management at RSM. Previously, I worked with fintech start-ups and cloud computing companies after finishing my BSc IBA at the VU in Amsterdam. My interests lie in the fields of digitalisation, financial technologies and diversity & inclusion, as well as personal things like travelling and cooking!

Tristan Harris has a message for you

19 September 2019

Ever found yourself falling down a deep YouTube rabbit hole? Ever compared yourself to influencers on Instagram and Pinterest? Ever been overwhelmed on Twitter by election campaigns? Then Harris has a story to tell you.

Tristan Harris, a former Google employee, ‘the conscience of Silicon Valley’ and the man behind the ‘Time Well Spent’ movement (Harris, 2019), has co-founded a new non-profit organisation called the Center for Humane Technology (CHT). You might ask: ‘Why is this important to me?’ Let me tell you.

Harris is a former Design Ethicist at Google, where he realised how much power Big Tech companies hold, as their business models are built to capture human attention (Johnson, 2019). More alarmingly, he recognised that these companies have the power to shape millions of people’s minds, yet, according to Harris, they are not taking enough moral responsibility for this. Harris explains that technology is manipulating our instincts through what he calls “the race to the bottom of the brain stem”, and that the biggest problem underlying this is our “attention economy” (Thompson, 2019; Rouse and Wigmore, 2019b).

See it this way: an abundance of information has created a scarcity of attention, and any resource that is scarce is worth money (Newton, 2019). So companies like Google, Facebook, Twitter and YouTube are built to compete in a commercial race for people’s attention (Johnson, 2019). Take YouTube’s business model as an example: the longer they can keep you watching videos, the more views you generate. More views mean more ads are seen, so the longer they capture your attention, the more advertisers are willing to pay to run their ads. This is how most Big Tech companies make their money (Johnson, 2019).

Not only is our attention worth money; attention is also what steers politics, builds relationships, decides elections and creates culture. What Harris is trying to clarify is that if Big Tech companies are directing what we pay attention to, don’t they then, in effect, dictate our culture? This is one of the big issues that Harris is trying to bring to light: “Tech addiction, polarization, outrage-ification of culture, the rise in vanities and micro-celebrity culture are all, in fact, symptoms of a larger disease: the race to capture human attention by giants” (Johnson, 2019).

These symptoms all contribute to a phenomenon called ‘human downgrading’, a term coined by Harris and his co-workers at the CHT (Rouse and Wigmore, 2019a). Human downgrading refers to the combined negative effects of digital technology on people and society (Thompson, 2019). Harris explains that while our data was being used to upgrade machines, it has downgraded people’s civility, decency, democracy, mental health, relationships, attention and more. So even though Big Tech is working hard on making technology smarter, they are indirectly making all of us dumber, meaner and more alienated (Johnson, 2019). Harris has described human downgrading as the social climate change of culture (Center for Humane Technology, 2019). Like climate change, it can be catastrophic; the difference is that only a few companies need to change to alter its trajectory, namely those companies creating the technologies that are causing these issues: the artificial social environments, the overpowering AIs and algorithms that sense and exploit our vulnerabilities (Johnson, 2019).

So how does Harris plan on solving this ‘human downgrading’? Back in May, he discussed this on an episode of Vox’s podcast Recode Decode (Johnson, 2019). The short answer: design and regulation. However, it is more sophisticated than that. Harris starts by explaining that the answer is not as simple as just turning technology off. Since people spend almost a fourth of their lives in artificial social systems, these digital environments have become an important daily habitat for almost 2 billion people worldwide (Johnson, 2019; How a handful of tech companies control billions of minds every day | Tristan Harris, 2017). And even those who don’t participate on social media platforms have to deal with the consequences; think of the 2016 US elections. So Harris poses the question: if people spend this much time in these digital social environments, shouldn’t they be regulated?

A big problem is that the data needed to assess the impact and effects of human downgrading is guarded by companies like Facebook, since they own that data (Johnson, 2019). Therefore, Harris calls on the Big Tech companies, especially Google and Apple, as “the central banks of the attention economy”, to change their ways (Thompson, 2019). He wants them to start a race to the top, one that focuses on changing tech design to “help people focus, find common ground, promote healthy childhoods, and bolster our democracy” (Newton, 2019). This is why he created the Center for Humane Technology: to create a common language, infuse that vocabulary into the minds of Silicon Valley, and start the conversation around a shared understanding (Center for Humane Technology, 2019). His organisation has promised to provide a guide for organisations on how to promote more humane designs. They also started a podcast on the topic, Your Undivided Attention, to provide a platform to speak about these issues (Apple Podcasts, 2019). Lastly, they will hold a conference in 2020 to bring the right minds together and figure out how to design social systems that encourage healthy dialogue and civility and bring out the best in human nature. As Raskin, the other co-founder of the CHT, put it: “We need to move away from just human-centered design to human-protection design” (Thompson, 2019).

As with the last wave of digital wellness awareness, it is difficult to predict whether Harris’ new Team Humanity movement will catch on. Even though digital wellness is becoming more of a trend and initiatives like Apple’s Screen Time and Google’s Digital Wellbeing are a step in the right direction (Pardes, 2018), we are far from where we need to be.

Do you agree with Harris and think Big Tech needs to take responsibility for human downgrading? Or do you think Harris is underestimating the capability of humans to control their own technology use? Will you join Team Humanity?

Leave your thoughts and comments below!

Bibliography

Apple Podcasts. (2019). Your Undivided Attention on Apple Podcasts. [online] Available at: https://podcasts.apple.com/us/podcast/your-undivided-attention/id1460030305 [Accessed 18 Sep. 2019].

Center for Humane Technology. (2019). Center for Humane Technology: Realigning Technology with Humanity. [online] Available at: https://humanetech.com/ [Accessed 18 Sep. 2019].

Harris, T. (2019). Tristan Harris. [online] Tristanharris.com. Available at: https://www.tristanharris.com/ [Accessed 18 Sep. 2019].

How a handful of tech companies control billions of minds every day | Tristan Harris. (2017). YouTube: TED.

Johnson, E. (2019). Tristan Harris says tech is “downgrading” humanity — but we can fix it. [online] Vox. Available at: https://www.vox.com/recode/2019/5/6/18530860/tristan-harris-human-downgrading-time-well-spent-kara-swisher-recode-decode-podcast-interview [Accessed 18 Sep. 2019].

Newton, C. (2019). The leader of the Time Well Spent movement has a new crusade. [online] The Verge. Available at: https://www.theverge.com/interface/2019/4/24/18513450/tristan-harris-downgrading-center-humane-tech [Accessed 18 Sep. 2019].

Pardes, A. (2018). Quality Time, Brought to You by Big Tech. [online] Wired. Available at: https://www.wired.com/story/how-big-tech-co-opted-time-well-spent/ [Accessed 18 Sep. 2019].

Rouse, M. and Wigmore, I. (2019a). What is human downgrading?. [online] WhatIs.com. Available at: https://whatis.techtarget.com/definition/human-downgrading [Accessed 18 Sep. 2019].

Rouse, M. and Wigmore, I. (2019b). What is an attention economy?. [online] WhatIs.com. Available at: https://whatis.techtarget.com/definition/attention-economy [Accessed 18 Sep. 2019].

Thompson, N. (2019). Tristan Harris: Tech Is ‘Downgrading Humans.’ It’s Time to Fight Back. [online] Wired. Available at: https://www.wired.com/story/tristan-harris-tech-is-downgrading-humans-time-to-fight-back/ [Accessed 18 Sep. 2019].
