Technology of the week: Electronic markets in the recruitment industry (group 61)

15 October 2017


Online marketplaces for recruitment are platforms that bring employers and job seekers together: recruiters can post their job offers and job seekers can upload their resumes, with the platform acting as an electronic broker. This gives both parties a much larger reach than traditional methods, opening up a whole new world of additional opportunities and better matches. Over the last couple of years, the number of these online marketplaces has grown, and search engines such as Indeed now bring together the listings of all the different platforms.

There is one online marketplace, however, with a different strategy: LinkedIn. LinkedIn is more than just an online marketplace for employers and job seekers; it functions as a professional social media platform. With over 300 million active users, LinkedIn has become a collection of professional digital business cards that maps out a large part of the working population. One of LinkedIn's strongest assets is its very extensive user database. Network effects play a big role: the more employers and job seekers are on the platform, the better it works and the more value it creates. This is where LinkedIn has an advantage over other online recruitment marketplaces, as LinkedIn also contains passive candidates. Research shows that twelve percent of the workforce is actively looking for a job, yet almost 75% of the working population, while not actively searching, considers itself 'approachable' – meaning they would be willing to negotiate when approached for a new position. This is a massive disruption: recruiters never had access to this group of people before and were never able to search for the perfect candidate in such a targeted way. It is therefore of great importance for the industry.

Recruitment has changed dramatically over the past decades due to the Internet. Traditional recruitment sources, such as newspaper advertising, job fairs, word-of-mouth or traditional recruitment agencies, are considered outdated, old-fashioned and not effective enough. The main task of a company's human resource department is to keep employees satisfied, attract new people and retain them within the company. Recruitment, however, focuses only on attracting potential employees: actively searching for job seekers, organizing events, conducting interviews and working on the employer brand and company image.

The communication effect played a big role in the disruption of the recruitment industry (Malone et al., 1987). This effect implies that recruiters can reach a much larger pool of candidates around the world in considerably less time. It allows HR departments to drastically cut coordination costs, which include the costs of gathering information from job seekers and selecting among them.

However, electronic markets have their downsides. They give rise to two types of information asymmetry: product uncertainty and seller uncertainty (Dimoka et al., 2012). In recruitment, seller uncertainty arises from an applicant's unwillingness to truthfully disclose his or her characteristics and capabilities. Product uncertainty, on the other hand, refers to an applicant's inability to describe his or her capabilities, partly because of unawareness of those capabilities. Neither form of information asymmetry is a new phenomenon within recruitment; it has always been one of the industry's biggest challenges.

The online recruitment industry has set some revolutionary long-term goals centred around one concept: transparency. These goals all come down to developing something called the economic graph, which aims to create total transparency of the global workforce by mapping out the world economy and all the connections within it. Firms' view of the economy will improve as transparency increases. The growing richness of data will contribute to the overall efficiency of economies, given that high unemployment rates in Western countries are often explained by a mismatch of expertise and location. People can gain more insight into where pools of expertise are, and businesses can choose locations more precisely.

LinkedIn has a distinctive business model, since it combines an electronic marketplace with a social media platform. With this combination, LinkedIn clearly distinguishes itself from other online recruitment marketplaces, mainly because its users make up a large part of the workforce, as opposed to recruiters and job seekers only. The economic graph is clearly a revolutionary goal, but we think LinkedIn is going to be a huge contributor to it and thus to the future of the recruitment industry.

Sources

Malone, T.W., Yates, J., and Benjamin, R.I. 1987. Electronic Markets and Electronic Hierarchies. Communications of the ACM 30(6) 484-497.

Dimoka, A., Hong, Y., and Pavlou, P.A. 2012. On Product Uncertainty in Online Markets: Theory and Evidence. MIS Quarterly 36(2) 395-426.

Lu, Y., Gupta, A., Ketter, W., and van Heck, E. 2016. Exploring Bidder Heterogeneity in Multi-channel Sequential B2B Auctions: Evidence from the Dutch Flower Auctions. MIS Quarterly 40(3) 645-662.

 


Forget your friends – Google knows you best

7 October 2017



 

Entering a word or a short sentence into the Google search bar is something almost everyone does on a daily basis. Each query may seem like an innocent string of words that reveals your thoughts only a little, but when all of your search records are combined, the true you is exposed.

Using the privacy of their keyboard, people confess their weirdest thoughts (Stephens-Davidowitz, 2017). After the search 'is it normal to want to', the word 'kill' follows most often, and when the word 'kill' is added, the search engine suggests 'my family?'. Stephens-Davidowitz shows the sinister side of human beings.

Apparently, reality differs from how people like to present themselves. In questionnaires, even though they are often anonymous, people still feel the urge to present themselves in a better light, especially on sensitive topics. On Google, however, people generally feel safe to enter any question that pops into their mind. If you're into racist jokes, you ask Google to provide them. You may be in denial about depression, but your searches for 'crying for no reason' and 'how to fall asleep easier' may suggest otherwise.

But Google is not limited to the words you enter into its search engine – it can also combine that data with your location. In 2015, Google actually helped people view their own activity by creating a hub called 'My Account', where you can see the information Google is collecting and change your settings. If you click on 'My Activity', you can see almost everything you do that is related to your Google account (Thompson, 2016).

Although it is quite scary that Google has all this information about you, it also improves your experience with Google's services. After reading this, will you think twice before typing your next question into the Google search engine? Are you going to change your privacy settings, or do you enjoy the benefits you get when Google tracks your search history?

 

Sources

Stephens-Davidowitz, S. (2017) Everybody Lies: Big Data, New Data, and What the Internet Can Tell Us About Who We Really Are. Harper Collins, p. 338.

Thompson, C. (2016) 'How to see everything Google knows about you', Business Insider, http://www.businessinsider.com/how-to-see-everything-google-knows-about-you-2016-6?international=true&r=US&IR=T


Are computers becoming racist?

25 September 2017


Recently, Google issued a warning about machine learning (ML), a subfield of artificial intelligence that gives computers the ability to learn without being explicitly programmed. Google shows that machine learning teaches computers to independently discover patterns in data. It is tempting to assume that this does not come with any prejudices, but the fact that something is based on data does not make it neutral. ML learns the same way a child or a dog learns, and when you teach them prejudices, this will eventually come back to you (Devlin, 2017).
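To make 'learning without being explicitly programmed' concrete, here is a minimal sketch (not taken from any of the cited sources): a classifier infers a decision rule purely from labelled examples. The feature names and the tiny toy dataset are invented for illustration.

```python
# Minimal illustration: no rule is hand-coded; the model infers one from examples.
# The features, labels and numbers below are made up purely for illustration.
from sklearn.tree import DecisionTreeClassifier

# Each row: [years_of_experience, hours_of_training]; label 1 = passed an assessment
X = [[0, 2], [1, 4], [2, 10], [5, 1], [6, 3], [8, 12]]
y = [0, 0, 1, 0, 1, 1]

model = DecisionTreeClassifier(random_state=0)
model.fit(X, y)                  # the pattern is discovered from data, not programmed

print(model.predict([[7, 8]]))   # the learned rule generalises to an unseen example
```

Whatever regularities sit in those six examples, sensible or prejudiced, are exactly what the model will reproduce.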

ML makes independent decisions based on patterns in huge amounts of data, which causes patterns from the past to be continued or even reinforced. For example, when you enter 'CEO' into Google Images, the majority of the results are white middle-aged men (Devlin, 2017). Companies that use ML realize the severity of this problem, but have not yet been able to find a proper solution.

Google attributes these problems to biases that occur in machine learning (Van Noort, 2017). Users make the system biased, which is known as interaction bias. For example, in an experiment Google asked users to draw shoes: most people drew men's shoes, and as a result the machine learning system did not recognize women's shoes as shoes. Selection bias occurs when the data used to train a machine learning system contains a disproportionate number of people from a specific group, which teaches the system to be better at recognizing that group. This has huge implications for predictive pooling, as recruitment and selection using ML is becoming more and more common.
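As a rough illustration of the selection-bias mechanism, consider the sketch below. It uses entirely made-up data (it is not Google's shoe experiment): one group dominates the training set, so the classifier learns that group's version of the positive class and largely fails to recognize the same class for the under-represented group.

```python
# A minimal sketch of selection bias with invented data: group A dominates the
# training set, so the model learns group A's version of the positive class and
# misses most positives from the under-represented group B.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)

def make_group(n_pos, n_neg, pos_center):
    """Positive examples cluster around pos_center; negatives around the origin."""
    X_pos = rng.normal(loc=pos_center, scale=0.5, size=(n_pos, 2))
    X_neg = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(n_neg, 2))
    X = np.vstack([X_pos, X_neg])
    y = np.array([1] * n_pos + [0] * n_neg)
    return X, y

# 90% of the training data comes from group A; the two groups express the
# positive class in different regions of the feature space.
X_a, y_a = make_group(450, 450, [2.0, 2.0])     # over-represented group A
X_b, y_b = make_group(50, 50, [-2.0, -2.0])     # under-represented group B

model = LogisticRegression()
model.fit(np.vstack([X_a, X_b]), np.concatenate([y_a, y_b]))

# Evaluate on fresh samples: accuracy is high for group A, close to chance for group B.
X_a_test, y_a_test = make_group(200, 200, [2.0, 2.0])
X_b_test, y_b_test = make_group(200, 200, [-2.0, -2.0])
print("accuracy on group A:", model.score(X_a_test, y_a_test))
print("accuracy on group B:", model.score(X_b_test, y_b_test))
```

Rebalancing or reweighting the under-represented group before training would shrink the gap in this toy example, but real-world training data is rarely this easy to correct.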

Does this mean that machine learning is an unfair system based on biases? Not necessarily: when you leave decisions to humans, they are far from perfect as well. Companies are currently trying to develop fairer systems for self-learning computers. A 'Partnership on Artificial Intelligence' has been set up in which big ML companies such as Google, Apple and Facebook unite to deal with problems related to ML and artificial intelligence (Van Noort, 2017). However, many of these systems are developed inside big technology companies that are not completely transparent, and although they say they take ethics seriously, they often do not employ people who specialize in ethics.

To conclude, machine learning can be very useful in analyzing data and its use should not be underestimated. However, it is always good to keep a human eye on the results.

Sources

Devlin, H. (2017) 'AI programs exhibit racial and gender biases, research reveals', The Guardian, 13 April 2017, https://www.theguardian.com/technology/2017/apr/13/ai-programs-exhibit-racist-and-sexist-biases-research-reveals

Van Noort, W. (2017) 'De computer is racistisch' [The computer is racist], NRC, 19 September 2017, https://www.nrc.nl/nieuws/2017/09/19/de-computer-is-racistisch-13070987-a1573906

 
