Wikipedia is one of the most classic and widely known forms of peer production. It is an online encyclopedia in which volunteers collectively and collaboratively create, share, and classify articles that anyone can edit. The main criticism mentioned in the assigned HBC is the accuracy and reliability of article content on Wikipedia, including the lack of expertise in the content-production process. I was interested in Wikipedia's reliability issue, and while researching it I came across an interesting article.
The article reports that robots are writing more articles for Wikipedia than any single individual in the world! An increasing number of entries on Wikipedia are being authored by automated software, or bots, that pull raw information from databases and then use algorithms to generate text in standardised templates. The bot writes and posts the Wikipedia articles automatically. The most prolific wikibot mentioned in the article is called "Lsjbot". It is a single bot programme created by a Swedish university administrator, Sverker Johansson, and it has written a total of 2.7 million articles, at a rate of up to 10,000 Wikipedia articles per day. It is responsible for 8.5% of the articles on the site!
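To make the template-based approach concrete, here is a minimal sketch in Python of how such a bot might work, assuming a hypothetical database record and a hypothetical article template (the article does not show Lsjbot's actual data sources or code, so every name below is illustrative):

# Minimal sketch of a template-based "wikibot": pull structured records
# from a database, then fill a standardised prose template.
# The records and template are illustrative assumptions, not Lsjbot's real data.

SPECIES_TEMPLATE = (
    "{name} is a species of {group} in the family {family}. "
    "It was first described by {author} in {year}."
)

# Hypothetical rows as they might come from a taxonomic database.
records = [
    {"name": "Example speciosa", "group": "beetle", "family": "Examplidae",
     "author": "A. Author", "year": 1901},
    {"name": "Sample communis", "group": "moth", "family": "Samplidae",
     "author": "B. Writer", "year": 1887},
]

def generate_article(record: dict) -> str:
    """Render one stub article from a single database record."""
    return SPECIES_TEMPLATE.format(**record)

if __name__ == "__main__":
    for record in records:
        print(generate_article(record))

Because each article is just a template filled in from one database row, a run over a large database could plausibly produce thousands of stub articles per day, which helps explain the output figures quoted above.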
In my opinion, the effectiveness of using bots to create Wikipedia articles can be disputed. On one hand, because wikibots gather information from different databases, it can be argued that they bring higher accuracy and reliability to the information presented. They also produce articles much faster than human contributors, with lower chances of human errors such as spelling and grammar mistakes. However, I feel that this undermines Wikipedia's original operating model. As a peer production system, the idea of Wikipedia is to harness the collaborative power of individuals to collectively gather knowledge and contribute articles. With wikibots, this is no longer the case: robots are beginning to contribute more than humans do and could be shaping the sources of information we establish. It also raises the question of whether it is right to let robots design the information from which we humans "learn".
Would you agree with Johansson, the creator of Lsjbot, that having robots write Wikipedia articles can increase the reliability and accuracy of article content? Do you support this practice? Do you feel that wikibots are removing the benefits of peer production and will soon pull Wikipedia away from a pure peer production and open source framework? Is it a question of short-term versus long-term benefit? Can you think of any other possible consequences of having robots write Wikipedia articles?
Sources:
http://www.popsci.com/article/science/bot-has-written-more-wikipedia-articles-anybody
http://news.discovery.com/tech/robotics/wikipedia-bot-writes-10000-articles-a-day-140715.htm
It amazes me that so many articles on Wikipedia are written by robots. I think robots work effectively, quickly and reliably, but I also think that a robot cannot work very precisely. I think a person can write a text more precisely than a robot; bots still make spelling and grammatical errors. So I only partly agree with Johansson.
In addition, I do think that the goal of Wikipedia is lost because of the bots. People will be less inclined to write an article because they think it is already being done by robots. I definitely think the bots mean that Wikipedia can no longer pursue its idea of harnessing the collaborative power of individuals to collectively gather knowledge and contribute articles.
Furthermore, bots cannot explain why they have written an article. So if an article is discussed in the "articles-for-deletion" process, the author cannot defend the article by contributing arguments.
I believe Wikipedia could use the robots as a supplement, but it should not become totally dependent on their work.