How to teach ethics to a computer?

12 October 2019

While companies are interested in the use cases of artificial intelligence (AI), they should also be aware of its hurdles.

A 2018 Deloitte survey found that 32% of the 1,400 participants, all U.S. executives knowledgeable about AI, ranked ethical issues among their top three risks of AI (Forbes Insight Team, 2019). Considering that AI is still in its infancy, this figure is surprisingly high.

AI gone bad: an example

In 2016, Microsoft released a chatbot that learned and adapted from public conversations on Twitter. Sadly, it took internet trolls less than 24 hours to get it spewing racist, offensive tweets (Li, 2019).

In 2017, Google’s translation algorithm was reported to produce sexist output. Many languages mark gender differently than English does. Turkish, for instance, has a single gender-neutral third-person pronoun, “o”, and the algorithm would pick a gender by itself when translating it. This led to translations like “he is a doctor” and “she is a nurse” (Li, 2019).
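One way such bias can be surfaced is with a simple audit: feed a model gender-neutral sentences and count which gendered pronouns come back. A minimal sketch in Python, where `translate` is a hypothetical stand-in (not the real Google Translate API) that mimics the reported behavior:

```python
def translate(sentence_tr: str) -> str:
    """Hypothetical translator mimicking the 2017 behavior:
    it guesses a gender for the gender-neutral Turkish 'o'."""
    biased = {
        "o bir doktor": "he is a doctor",
        "o bir hemsire": "she is a nurse",
    }
    return biased.get(sentence_tr, sentence_tr)

def gender_counts(sentences):
    """Count which gendered pronouns the translator chooses
    for inputs that are gender-neutral in the source language."""
    counts = {"he": 0, "she": 0}
    for s in sentences:
        first_word = translate(s).split()[0]
        if first_word in counts:
            counts[first_word] += 1
    return counts

# Same pronoun in Turkish, yet the chosen gender differs by occupation.
print(gender_counts(["o bir doktor", "o bir hemsire"]))
```

If the counts diverge systematically by occupation, the model has absorbed a societal stereotype from its training data rather than anything present in the input.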

An industry that needs a savior

Social networks work by leading people to click on ads; whether the ad is for a shoe or a job opening, the algorithm is calibrated to earn the most profit for its company (Forbes Insight Team, 2019). Journalism has become whatever the algorithms spit out: whether the news is reliable is of no importance, as the algorithm boosts sensational stories to draw more clicks (Smith, 2016). The problem is that we have put the power in the hands of a few: the creators.

Managerial implications

With legislation lagging behind the facts (Forbes Insight Team, 2019), there is an opportunity for companies to make a difference by using ethics as a business metric (Fjord, 2019). First, we must understand that AI risks are business risks (Li, 2019). While learning, AI can pick up trends that we are not even aware of, including the biases and blind spots of its creators (Compton, 2019). To mitigate these risks, companies can introduce anti-bias training alongside their AI and machine learning training, while actively correcting any bias they spot. They can also offer a non-technical course on AI possibilities and risks (Li, 2019). Two groups should receive this training: AI engineers and business leaders. This goes hand in hand with a step away from the traditional top-down business structure, towards more agile methods in which feedback loops and iteration determine the development of the system (Compton, 2019) and managers and engineers work together.

When AI engineers and business leaders work together, they can create AI guided by a sound ethical rulebook instead of misguided capitalism.

Does this sound familiar to you?

 

References:

Compton J. (2019). Ethics And Inclusion In AI: Designing For The AI Future We Want To Live In. Forbes. Available at: https://www.forbes.com/sites/intelai/2019/03/27/ethics-and-inclusion-designing-for-the-ai-future-we-want-to-live-in/#b589dae255f5

Fjord (2019). The ethics economy. Trends. Available at: https://trends18.fjordnet.com/the-ethics-economy/

Forbes Insight Team (2019). Rise Of The Chief Ethics Officer. Forbes. Available at: https://www.forbes.com/sites/insights-intelai/2019/03/27/rise-of-the-chief-ethics-officer/#75b2cdfc5aba

Forbes Insight Team (2019). 4 Industries That Feel The Urgency Of AI Ethics. Forbes. Available at: https://www.forbes.com/sites/insights-intelai/2019/03/27/4-industries-that-feel-the-urgency-of-ai-ethics/#1718de5472be

Li M. (2019). Addressing the Biases Plaguing Algorithms. Harvard Business Review. Available at: https://hbr.org/2019/05/addressing-the-biases-plaguing-algorithms

Smith A. (2016). The pedlars of fake news are corroding democracy. The Guardian. Available at: https://www.theguardian.com/commentisfree/2016/nov/25/pedlars-fake-news-corroding-democracy-social-networks
