“Chatbots have turned to crime, using ever-slicker methods to steal cash or identities – and these cheating algorithms are passing the Turing test every day.” – Peter Nowak
Increasingly, chatbots are being programmed to scour social networking platforms for people’s personal information and then reach out to those people in an attempt to coax even more sensitive details out of them. These chatbots are fairly ubiquitous on social networks, messaging apps, and webmail; they go by different names across these media, but they ultimately operate the same way, drawing on a fixed set of conversation strings and responses keyed to categories such as the victim’s profession and interests. Each chatbot tries to develop a relationship with its mark and build trust, then solicits cash or attempts identity theft, depending on its programmed goal.
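To make that mechanism concrete, here is a minimal sketch, in Python, of the kind of canned-response logic described above: scripted conversation strings grouped by categories such as profession and interests, chosen to match whatever the bot has scraped from a public profile. All names and data in the sketch are hypothetical; it is an illustration of the general technique, not any real system.

```python
# Illustrative sketch only: a toy rule-based bot that selects canned
# conversation strings keyed to categories (profession, interests) taken
# from a mark's public profile. Everything here is hypothetical.
import random

# Scripted openers organized by category.
CANNED_OPENERS = {
    "finance": [
        "I noticed you work in finance -- I could really use some investment advice.",
    ],
    "photography": [
        "Your landscape shots are stunning! What camera do you use?",
        "I'm just getting into photography myself. Any tips?",
    ],
    "default": [
        "Hi! Your profile caught my eye. How has your week been?",
    ],
}

def pick_opener(profile: dict) -> str:
    """Pick a scripted opener matching the mark's profession or interests."""
    categories = [profile.get("profession", "")] + profile.get("interests", [])
    for category in categories:
        if category in CANNED_OPENERS:
            return random.choice(CANNED_OPENERS[category])
    return random.choice(CANNED_OPENERS["default"])

if __name__ == "__main__":
    # A hypothetical profile scraped from a public social-network page.
    mark = {"name": "Alex", "profession": "finance", "interests": ["photography", "hiking"]}
    print(pick_opener(mark))
```

The point of the sketch is how little machinery is required: a dictionary of scripted lines and a lookup against scraped profile fields is enough to make an opening message feel personal.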
Chatbots have fooled their victims not just through sexual gambits, but also by feigning interest in a person’s hobbies and passions. Some chatbots even assume stolen identities on social media platforms to trick you into divulging your personal information. Anyone can easily become a victim of these enchanting, criminal chatbots. For instance, psychologist Robert Epstein, a former director of the Loebner Prize, carried on an email correspondence with a criminal chatbot for over two months, despite being a chatbot expert. The Loebner Prize was established in 1990 to recognize significant achievements in chatbot development and is awarded based on a chatbot’s ability to pass the Turing test. To pass that test, a chatbot must fool a number of people into believing it is human in virtual conversations.
Besides the fact that an unknown number of criminal chatbots are lurking out there, preying on innocent, unsuspecting victims, what is truly troubling is that these bots are effectively passing the Turing test: the more sophisticated ones can be very difficult to identify. Unfortunately, the Internet has enhanced chatbots’ abilities, too. Instead of pre-programming certain lines and phrases into a chatbot, creators can now build chatbots with self-learning mechanisms. Thus, while chatbot technology has been used to beef up security, it can just as easily be co-opted by criminals. Indeed, it is estimated that more than 80 percent of online crimes are perpetrated by chatbots. It has also become increasingly difficult for law enforcement agencies to track down these ever more sophisticated chatbots because of their international reach.
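The shift from pre-programmed scripts to self-learning is easy to illustrate. The sketch below, again purely hypothetical, shows one simple way such a mechanism could work: the bot keeps a running score for each line and reinforces the ones that keep the mark replying, rather than relying on a fixed script.

```python
# Minimal, hypothetical sketch of a "self-learning" response picker: the bot
# up-weights lines that elicit replies and decays those that do not.
from collections import defaultdict
import random

class SelfTuningResponder:
    def __init__(self, responses):
        self.responses = responses
        self.scores = defaultdict(lambda: 1.0)  # every line starts with equal weight

    def choose(self) -> str:
        # Sample a response with probability proportional to its learned score.
        weights = [self.scores[r] for r in self.responses]
        return random.choices(self.responses, weights=weights, k=1)[0]

    def feedback(self, response: str, got_reply: bool) -> None:
        # Reinforce lines that keep the conversation going; decay the rest.
        self.scores[response] *= 1.2 if got_reply else 0.9

if __name__ == "__main__":
    bot = SelfTuningResponder(["How was your day?", "Could you help me out with something?"])
    line = bot.choose()
    bot.feedback(line, got_reply=True)
    print(line, round(bot.scores[line], 2))
```

Even this crude feedback loop means the bot’s behaviour drifts over time, which is part of why the more sophisticated examples are so hard to fingerprint and identify.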
This raises two questions: how many individuals fall prey to nefarious chatbots each year, and what can security departments do to track down the chatbots’ sources across international jurisdictions?
Nowak, P. (2012, June 20). Silicon sirens: The naughty bots out to seduce you. New Scientist. Retrieved from https://www.newscientist.com/article/mg21428705.900-silicon-sirens-the-naughty-bots-out-to-seduce-you/