Home Assignment: Crowdsourcing and Crowdfunding

1 November 2012


In this blog post I am going to talk about my home assignment on crowdsourcing and crowdfunding. I will start with the five articles, all related to that week's topic.

Jeff Howe (2006) described crowdsourcing as 'the act of taking a job traditionally performed by a designated employee and outsourcing it to an undefined, generally large group of people in the form of an open call.' He divides crowdsourcing into five different kinds of platforms: crowdvoting, wisdom of crowds, crowdfunding, microwork and inducement prize contests.

Malone, Laubacher & Dellarocas (2012) map out different collective intelligence systems along four questions: 'Who is doing it? What is being done? Why are they doing it? How is it being done?' The authors provide a framework that identifies the underlying building blocks, 'the genes', at the center of collective intelligence systems.

Pisano & Verganti (2008) discuss four basic modes of collaboration. They present them in a framework built on two questions: 'Should membership in a network be open or closed?' (participation) and 'Should the network's governance structure for selecting problems and solutions be flat or hierarchical?' (governance).

Afuah & Tucci (2012) gives reasons why it makes sense that a ‘seeker’ will use crowdfunding. Crowdsourcing transforms distant search into local search. For a seeker organization is a distant search costly and risky. Crowdsourcing however is easy to broadcast to a large crowd. The knowledge required to solve a problem mostly falls outside the seeker’s knowledge environment.

Jeppesen & Lakhani (2010) describes in ‘Marginality and Problem Solving Effectiveness in Broadcast Search’ several findings about the submissions of ‘crowdsourced’ open innovation projects. This article gives an interesting perspective as it gives a perspective of the solver instead of the seeker in the other articles.

Amazon Mechanical Turk (MTurk) is a popular crowdsourcing marketplace for small tasks that cannot easily be automated by computers. It uses a network of humans to perform tasks that computers are ill-suited for but easy for humans to do. Employers, known as 'requesters', post tasks called Human Intelligence Tasks (HITs). These are picked up by online users, 'the workers', who complete them in exchange for a small payment, usually a few cents per HIT (Ipeirotis, 2010). A short sketch of how a requester might post a HIT through the API follows below.

Strengths: it is very low cost, and it is anonymous; the MTurk service does not allow requesters to ask for identifying information.

Weaknesses: HITs are very simple, repetitive, boring tasks, and workers are often paid only a few cents to complete them.
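To make the requester and worker mechanics concrete, here is a minimal sketch of how a requester might post a HIT programmatically. It uses the MTurk client from boto3 (the AWS SDK for Python) against the requester sandbox; the title, reward and yes/no question are hypothetical placeholders I made up for illustration, not taken from any of the articles above.

```python
# Minimal sketch: posting one HIT to the MTurk requester sandbox with boto3.
# Assumes AWS credentials are already configured; all task details are invented.
import boto3

client = boto3.client(
    "mturk",
    region_name="us-east-1",
    # Sandbox endpoint, so no real money is spent while testing.
    endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com",
)

# A HIT needs a question definition; HTMLQuestion wraps a plain HTML form
# that posts the worker's answer back to MTurk via externalSubmit.
QUESTION_XML = """<HTMLQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2011-11-11/HTMLQuestion.xsd">
  <HTMLContent><![CDATA[
    <!DOCTYPE html>
    <html><body>
      <form action="https://www.mturk.com/mturk/externalSubmit" method="post">
        <p>Does this image show a cat? (hypothetical task)</p>
        <input type="hidden" name="assignmentId" value="">
        <label><input type="radio" name="answer" value="yes"> yes</label>
        <label><input type="radio" name="answer" value="no"> no</label>
        <input type="submit">
      </form>
    </body></html>
  ]]></HTMLContent>
  <FrameHeight>300</FrameHeight>
</HTMLQuestion>"""

response = client.create_hit(
    Title="Answer one yes/no question about an image",
    Description="Look at a single image and answer yes or no.",
    Keywords="image, labeling, quick",
    Reward="0.02",                    # a few cents per HIT, as noted above
    MaxAssignments=3,                 # let three different workers answer
    LifetimeInSeconds=86400,          # HIT stays listed for one day
    AssignmentDurationInSeconds=300,  # a worker gets five minutes per HIT
    Question=QUESTION_XML,
)
print("Created HIT:", response["HIT"]["HITId"])
```

Setting MaxAssignments above one means several workers answer the same HIT, which is a common way for requesters to add redundancy and vote away sloppy answers.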

CrowdSpring is a crowdsourcing platform that allows buyers to run competitions for company logos, website designs, t-shirts and the like. For the buyer it means a wider choice of designs at a fraction of the cost. For aspiring designers, it means a shot at stealing work from entrenched design firms (Steiner, 2009).

Strengths: CrowdSpring reduces the cost compared to having an established design firm create your logo.

Weaknesses: CrowdSpring is not recommended for complex jobs. In addition, submitted designs are not always checked for copyright, so entrants can pass off stolen work as their own.
