Would you build a system to kill?

14 October 2018


The Pentagon issued a $10 billion contract, known as JEDI, to build cloud services for the Department of Defense. The contract is enormous in size and shrouded in secrecy. The department’s CMO explained its impact as follows: “We need to be very clear. This program is truly about increasing the lethality of our department.”

Employees of some of the few companies able to bid on the project have expressed their concerns about it. Google executives recently stated that they will not use AI “for weapons, illegal surveillance, and technologies that cause ‘overall harm.’” More recently, employees of Microsoft wrote an open letter to their management, asking them not to bid on the JEDI project.

The few tech companies that can even consider bidding on the project find themselves in conflict with their own principles. Google was interested in the bid and its $10 billion price tag, but withdrew only after a large number of employees opposed it. Afterwards, Google made the world believe that it withdrew because the project would not align with its AI principles.

The question now is whether other companies will also back out of the bid because of their principles not to harm, or kill, other people, or whether they will set those principles aside for the $10 billion bone held in front of their noses. Microsoft is likely the first company that will have to decide, since its employees do not want to build systems intended to kill people. Only last year, Microsoft defined six core principles for AI. Is it willing to violate them, which it will if it accepts the contract, for a large bag of gold? The future will show which company cares less about other people.

https://medium.com/s/story/an-open-letter-to-microsoft-dont-bid-on-the-us-military-s-project-jedi-7279338b7132

https://techcrunch.com/2018/09/26/putting-the-pentagon-10b-jedi-cloud-contract-into-proper-perspective/?guccounter=1

 


Solving Crimes With Data Mining

4 October 2018


What do the Unabomber and Marc Dutroux have in common? Your first thought might be that they both killed multiple people. That is true, of course! The other similarity is the fact that in both cases the police took a very long time to identify and arrest the right person. New technological developments could help the police to solve future cases faster.

 

In case your first thought was “Unabomber? Marc who?”, here is a short description of them. Ted Kaczynski, known as the Unabomber, conducted a bombing campaign targeting people involved in implementing modern technology, killing three people and severely injuring 23 others. Marc Dutroux abducted, molested, and murdered several young children in Belgium in the 90’s.

 

Solving crimes is a very complex task that requires a lot of work and experience. Well, maybe not always, judging by headlines like “Florida man uses wanted poster as Facebook profile picture” and “Woman arrested after trying to test her drugs for Ebola”. In most cases, though, it is a time-consuming and costly process. Therefore, new ways of solving crimes are desirable.

 

Data mining could help the police solve crimes faster, as it can be used to model crime detection problems. Data mining captures years of human experience in solving crimes in computer models, and the resulting algorithms then help identify crime patterns. Researchers are now trying to integrate machine learning to further increase the power of data mining.
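As a minimal sketch of the pattern-identification idea, the toy example below counts which attribute pairs co-occur across incident records; the incidents and attributes are entirely invented for illustration, and real crime data mining systems are of course far more sophisticated.

```python
from collections import Counter
from itertools import combinations

# Hypothetical incident records (invented for illustration):
# each incident is a set of descriptive attributes.
incidents = [
    {"mail bomb", "academic target", "no fingerprints"},
    {"mail bomb", "academic target", "wooden parts"},
    {"mail bomb", "airline target", "wooden parts"},
    {"burglary", "night", "forced entry"},
    {"burglary", "night", "no fingerprints"},
]

def frequent_pairs(records, min_support=2):
    """Return attribute pairs that co-occur in at least
    `min_support` incidents -- a toy form of pattern mining."""
    counts = Counter()
    for record in records:
        for pair in combinations(sorted(record), 2):
            counts[pair] += 1
    return {pair: n for pair, n in counts.items() if n >= min_support}

patterns = frequent_pairs(incidents)
# Recurring pairs such as ("academic target", "mail bomb")
# hint that those incidents may share a perpetrator.
```

Linking incidents through shared attributes like this is one simple way a model can surface a series of related crimes that a human investigator might otherwise have to spot by hand.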

 

Maybe Ted Kaczynski knew this was coming, and tried to save himself, along with his fellow murderers, by bombing people who were involved with modern technology. We should all be thankful that the bombings did not stop the world from implementing new technologies, because they can assist us in many fields, such as solving crimes.

 

source: https://link.springer.com/chapter/10.1007/978-1-4020-6264-3_70
