Help! The Computer is Racist!

24 October 2017


Artificial Intelligence (AI) has made great advances in the last decade — advances we encounter daily, sometimes knowingly and sometimes not, such as ridesharing apps that optimally match you with other passengers to minimize detours, or Facebook's ability to recognize faces in the pictures you upload (Narula 2017).

Machine Learning (ML) is a subfield of AI and forms its statistical arm. Its main focus is programming algorithms so that computers can learn from data, complete tasks and make predictions (Schmidt 2016). Through machine learning we teach computers to recognize patterns in data independently. It is tempting to assume that the patterns a computer finds on its own are free of human prejudice.

A report from the Human Rights Data Analysis Group, however, contradicts this. The report shows that selection bias in ML systems can have serious, detrimental consequences for society. Police forces in the United States use software that predicts crime from historical patterns. Certain neighborhoods with a higher proportion of people of color had higher recorded crime rates, so police patrolled there more often, which led to more arrests of people of color. This created a self-reinforcing feedback loop, through which the software became ever more prejudiced against these neighborhoods. (Buranyi 2017)
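To make that feedback loop concrete, here is a minimal, entirely hypothetical simulation — the district names, crime rate and patrol numbers are invented for illustration, and this is of course not the actual predictive-policing software. Two districts share the same true crime rate, but one starts with slightly more recorded arrests, and patrols are always sent where past arrests are highest:

```python
import random

random.seed(42)

# Hypothetical illustration of the feedback loop described above.
# Two districts share the SAME underlying crime rate, but district A
# starts with one extra recorded arrest in the historical data.
true_crime_rate = 0.1            # identical in both districts
arrests = {"A": 11, "B": 10}     # small initial imbalance in the data
patrols_per_day = 100

for day in range(100):
    # the "predictive" step: patrol the district with most past arrests
    target = max(arrests, key=arrests.get)
    # each patrol records a crime with the same probability everywhere
    arrests[target] += sum(
        1 for _ in range(patrols_per_day) if random.random() < true_crime_rate
    )

print(arrests)  # district A's record runs away; B's never changes
```

Even though the districts are identical, the tiny initial imbalance sends every patrol to district A, so only district A accumulates new arrests — which in turn justifies patrolling it again. The data bias amplifies itself.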

Google urges users and developers to be aware that something based on data is not automatically neutral (Van Noort 2017). Gupta, research director of a Google ML lab, explains that an ML system learns much like a child or a dog: feed it prejudices and it will give you prejudices back (Van Noort 2017). A well-known example is Google's own photo service, whose auto-tagging feature labeled pictures of people of color as "gorillas" (Kasperkevic 2015). Another output that caused discontent was that a Google image search for "CEO" showed almost exclusively pictures of old white men (Hellman 2017). Such a system is not explicitly programmed by people; it derives its decisions from large amounts of data. Outcomes like these are anything but desirable: the selection bias of the system feeds existing prejudices, and the computer becomes racist.

It is important to find a way to integrate fairness and integrity into ML systems. Is it possible to teach ML systems this? How do we ensure fairness in our own judgments? Is this knowledge transferable, or is it tacit and stuck in Polanyi's Paradox? What do you think? Should more ethics specialists be involved in the ML adventure?


References


Buranyi, S. (2017). Rise of the racist robots – how AI is learning all our worst impulses. [online] the Guardian. Available at: https://www.theguardian.com/inequality/2017/aug/08/rise-of-the-racist-robots-how-ai-is-learning-all-our-worst-impulses [Accessed 2 Oct. 2017].

Hellman, G. (2017). Truth in Pictures: What Google Image Searches Tell Us About Inequality at Work. [online] Code Like A Girl. Available at: https://code.likeagirl.io/truth-in-pictures-what-google-image-searches-tell-us-about-inequality-at-work-554583cfe99d [Accessed 3 Oct. 2017].

Kasperkevic, J. (2015). Google says sorry for racist auto-tag in photo app. [online] the Guardian. Available at: https://www.theguardian.com/technology/2015/jul/01/google-sorry-racist-auto-tag-photo-app [Accessed 2 Oct. 2017].

Narula, G. (2017). Everyday Examples of Artificial Intelligence and Machine Learning. [online] TechEmergence. Available at: https://www.techemergence.com/everyday-examples-of-ai/ [Accessed 3 Oct. 2017].

Schmidt, M. (2016). Clarifying the uses of artificial intelligence in the enterprise. [online] TechCrunch. Available at: https://techcrunch.com/2016/05/12/clarifying-the-uses-of-artificial-intelligence-in-the-enterprise/ [Accessed 2 Oct. 2017].

van Noort, W. (2017). De Computer is Racistisch. NRC. [online] Available at: https://www.nrc.nl/nieuws/2017/09/19/de-computer-is-racistisch-13070987-a1573906 [Accessed 1 Oct. 2017].



Beaming Balloons to the Rescue

12 October 2017


Nowadays, everything around us seems to be connected. Information can readily be shared through the Internet, and keeping in touch with friends and loved ones on the other side of the world has never been so easy. The horrific hurricanes Irma and Maria, however, left thousands of people in Puerto Rico homeless, without food or water, and unreachable. The storms disabled almost all of the island's cell towers, cutting the inhabitants off not only from the outside world but also from each other. Many people went through a terrible time of not knowing whether their loved ones and family members were still alive, or where they were. (Roof 2017)

Project Loon, led by Alphabet, the parent company of Google, has designed a new technology to help make Puerto Rico reachable again: floating cell towers (Hruska 2017). These are large polyethylene balloons, roughly the size of a tennis court (Hale 2017). Filled with helium, they rise into the Earth's stratosphere, where they float at around 20 kilometers above the ground. From there, each balloon can beam cellular service down onto Puerto Rico, covering about 5,000 square kilometers of land. High-speed Internet is transmitted up to the balloons from telecommunication partners on the ground, relayed across the balloon network, and then transmitted down to users. The balloons can deliver connection speeds of up to 10 Mbps directly to people's LTE phones, providing connectivity where it is much needed, regardless of the conditions on land.
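As a back-of-the-envelope check on these numbers — the figures below are rough approximations, and Puerto Rico's land area of about 9,100 km² is an assumption added here for illustration — the quoted 5,000 km² footprint corresponds to a circle of roughly 40 km radius, far smaller than what a balloon at 20 km altitude could theoretically "see":

```python
import math

coverage_km2 = 5_000      # coverage per balloon, as quoted above
altitude_km = 20          # approximate float altitude in the stratosphere

# If the 5,000 km^2 footprint were a circle, its radius would be:
radius_km = math.sqrt(coverage_km2 / math.pi)
print(f"Equivalent footprint radius: {radius_km:.1f} km")

# Line-of-sight distance to the horizon from 20 km up,
# using the standard approximation sqrt(2 * R_earth * h):
horizon_km = math.sqrt(2 * 6_371 * altitude_km)
print(f"Horizon distance from 20 km altitude: {horizon_km:.0f} km")

# Rough balloon count to cover Puerto Rico (~9,100 km^2 land area),
# ignoring the overlap needed for hand-off between balloons:
puerto_rico_km2 = 9_100
balloons_needed = math.ceil(puerto_rico_km2 / coverage_km2)
print(f"Balloons needed, ignoring overlap: {balloons_needed}")
```

The horizon from 20 km up is around 500 km away, so the ~40 km service radius is limited by the radio link to ordinary LTE phones, not by line of sight — and on paper only a handful of balloons could blanket the whole island.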

Project Loon will provide Puerto Ricans with much-needed access to cellular service, so that they can connect with loved ones and get access to life-saving information. Worth mentioning is that half of the world's population still does not have access to the Internet. Project Loon aims to launch and maintain a fleet of these balloons to provide access to people in rural and remote areas across the globe (X.company 2017).

To learn more about their technology, see this interactive overview: https://x.company/loon/technology/

References

Hale, T. (2017). Internet-Beaming Balloons Will Deliver Communications to Puerto Rico. [online] IFLScience. Available at: http://www.iflscience.com/technology/google-s-internet-beaming-balloons-will-deliver-communications-to-puetro-rico/ [Accessed 10 Oct. 2017].

Hruska, J. (2017). Project Loon Gets Green Light to Deploy Over Puerto Rico – ExtremeTech. [online] ExtremeTech. Available at: https://www.extremetech.com/internet/257200-google-project-loon-gets-fcc-green-light-deploy-puerto-rico [Accessed 11 Oct. 2017].

Roof, K. (2017). Google parent Alphabet looks to restore cell service in Puerto Rico with Project Loon balloons. [online] TechCrunch. Available at: https://techcrunch.com/2017/10/07/google-parent-alphabet-looks-to-restore-cell-service-in-puerto-rico-with-project-loon-balloons/ [Accessed 9 Oct. 2017].

X.company. (2017). Project Loon. [online] Available at: https://x.company/loon/ [Accessed 11 Oct. 2017].
