Being Human in the Age of Black Box Algorithms and Subjective Truths

17 October 2019


Photo by Esther Jiao on Unsplash

Algorithms are everywhere and play an important role in our daily lives. They decide what we see on our social media feeds, which ads target us and which route we take to get places.

The problem is that many algorithms are black boxes: complex systems that shape our world, whose internal workings are hidden or not easily understood (Oxford English Dictionary Online, 2011). With these algorithms, it is unclear how the output or conclusions were reached. With historically little oversight or accountability regarding their design, this problem has a profound effect on society, as our day-to-day lives and our personal decisions are increasingly controlled by algorithms (Carey, 2018; Illing, 2018). Most of us have no idea what algorithms are or how exactly we are being influenced by them. And how could we, if we cannot look ‘under the hood’? And even if we could, how should we understand these systems when sometimes even the coders who built them do not know how a conclusion was reached (Carey, 2018)?
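To make the idea concrete, here is a minimal sketch in Python (assuming scikit-learn is available; the data is synthetic and the model is deliberately small). The network learns to classify toy data and produces confident answers, yet all it can show us in return is a pile of numeric weights, with no human-readable “because” attached:

```python
# Minimal illustration of a "black box": the model predicts accurately,
# but its internals are just weight matrices, not explanations.
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier

# Synthetic data standing in for any real-world decision problem.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)

model = MLPClassifier(hidden_layer_sizes=(32, 32), max_iter=1000, random_state=0)
model.fit(X, y)

print("Prediction for first sample:", model.predict(X[:1]))
# The only "reasoning" we can inspect is thousands of raw numbers:
print("Shape of the first weight matrix:", model.coefs_[0].shape)  # (10, 32)
```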

Does this mean that we can no longer trust algorithms? Hannah Fry, an Associate Professor in Mathematics at University College London and author of the book “Hello World: Being Human in the Age of Algorithms”, explains in an interview with Sean Illing that our attitude towards algorithms tends towards extremes (Illing, 2018). On the one hand, we have very high expectations of algorithms and trust them blindly. On the other hand, as soon as we see that an algorithm or its outcomes are somewhat inaccurate, we no longer trust them and disregard them. Fry thinks the right attitude is somewhere in the middle: “we should not blindly trust algorithms, but we also should not dismiss them altogether” (Illing, 2018).

Subjective Truths
A larger concern with algorithms is that they often contain the biases of the people who create them, and that they reinforce biases and stereotypes we may inherently hold but might not be aware of (Li, 2019). As Bill and Melinda Gates (2019) describe, this can even be the result of non-existent or sexist data. This is especially dangerous with black-box algorithms, which do not explain their results to their programmers – let alone to the end users.
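A deliberately simplified, hypothetical sketch of “garbage in, garbage out”: the hiring records below are invented, and the “model” is nothing more than a frequency count, but it shows how a system trained on skewed historical decisions will echo that skew back as a recommendation:

```python
# Hypothetical example: biased data in, biased recommendation out.
from collections import Counter

# Invented historical hiring records, skewed by past human decisions.
historical_hires = ["male"] * 90 + ["female"] * 10

def naive_recommendation(records):
    """Recommend whichever group was hired most often in the past."""
    return Counter(records).most_common(1)[0][0]

# Prints "male" -- not because of any property of the candidates,
# but purely because of the data the system was given.
print(naive_recommendation(historical_hires))
```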

And what if information is deliberately misrepresented, or differs depending on who you are or where you are from? Take Google Maps, for example. Google claims to be objective in marking disputed regions in various parts of the world (Boorstin, 2009). Yet depending on the country from which you access Google Maps, you will see Crimea portrayed as part of Ukraine or as part of Russia (Usborne, 2016). If you consider that at least 124 countries are involved in a territorial dispute, there is a lot of potential for subjective truths (Galka, n.d.; Metrocosm, 2015). Another example is Apple: if you are in Hong Kong or Macau, from iOS 13.1.1 onwards you will no longer find the Taiwanese flag 🇹🇼 on the emoji keyboard (Peters & Statt, 2019). Generally, as a user, you are not made aware of these intentional differences, but they do shape our perception of reality.
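The mechanism behind such region-dependent “truths” is mundane. The sketch below is entirely hypothetical (the data structure and function names are invented, and this is not how Google or Apple implement it), but it shows how trivially a service can vary a label based on the viewer’s country:

```python
# Hypothetical sketch: serving different "truths" per viewer country.
DISPUTED_LABELS = {
    "crimea": {"RU": "Russia", "UA": "Ukraine", "default": "disputed"},
}

def label_for(region, viewer_country):
    """Return the border label shown to a viewer in a given country."""
    variants = DISPUTED_LABELS.get(region, {})
    return variants.get(viewer_country, variants.get("default"))

print(label_for("crimea", "RU"))  # -> "Russia"
print(label_for("crimea", "UA"))  # -> "Ukraine"
print(label_for("crimea", "US"))  # -> "disputed"
```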

Conclusion
When it comes to algorithms, the people behind them, or really anything in life, you should not blindly trust the information that is presented to you. Besides, as Fry argues, we should not think of algorithms themselves as either good or bad; we should rather focus on the people behind the scenes who create them (Illing, 2018). Although algorithms are not perfect and are often biased, they are still extremely effective and have made our lives easier.

Whereas endings are inevitable, the direction of technological progress is not. We have to ensure that technological progress remains aligned with humanity’s best interests. There might be unintended or undesired consequences, but as the French philosopher Paul Virilio said:

“When you invent the ship, you also invent the shipwreck; when you invent the plane you also invent the plane crash; and when you invent electricity, you invent electrocution… Every technology carries its own negativity, which is invented at the same time as technical progress.” (Virilio, Petit, & Lotringer, 1999)

 

References:
Black box. (2011). In Oxford English Dictionary Online. Retrieved 12 October 2019, from https://www-oed-com.eur.idm.oclc.org/view/Entry/282116
Boorstin, B. (2009, December 4). When sources disagree: borders and place names in Google Earth and Maps. Retrieved from https://publicpolicy.googleblog.com/2009/12/when-sources-disagree-borders-and-place.html
Carey, S. (2018). How IBM is leading the fight against black box algorithms. Retrieved 16 October 2019, from https://www.computerworld.com/article/3427845/how-ibm-is-leading-the-fight-against-black-box-algorithms.html
Galka, M. (n.d.). Every Disputed Territory in the World [Interactive Map]. Retrieved 16 October 2019, from http://metrocosm.com/disputed-territories-map.html
Gates, B., & Gates, M. (2019, February 12). Our 2019 Annual Letter. Retrieved from https://www.gatesnotes.com/2019-Annual-Letter#ALChapter4
Illing, S. (2018, October 1). How algorithms are controlling your life. Retrieved from https://www.vox.com/technology/2018/10/1/17882340/how-algorithms-control-your-life-hannah-fry
Li, M. (2019, May 13). Addressing the Biases Plaguing Algorithms. Retrieved from https://hbr.org/2019/05/addressing-the-biases-plaguing-algorithms
Metrocosm. (2015, November 20). Mapping Every Disputed Territory in the World. Retrieved from http://metrocosm.com/mapping-every-disputed-territory-in-the-world/
Peters, J., & Statt, N. (2019, October 7). Apple is hiding Taiwan’s flag emoji if you’re in Hong Kong or Macau. Retrieved from https://www.theverge.com/2019/10/7/20903613/apple-hiding-taiwan-flag-emoji-hong-kong-macau-china
Usborne, S. (2016, August 10). Disputed territories: where Google Maps draws the line. Retrieved from https://www.theguardian.com/technology/shortcuts/2016/aug/10/google-maps-disputed-territories-palestineishere
Virilio, P., Petit, P., & Lotringer, S. (1999). Politics of the very worst. New York: Semiotext(e).
