Will tomorrow’s smart living room be a spy in your house?

16 October 2019


Nowadays, smart home devices are becoming more and more popular in households. Market research projects that around 30 million U.S. households will add smart home technology in the coming years (Mordorintelligence, 2019). Furthermore, growth rates of the smart home markets in North America and Europe are expected to remain high. Tech giants like Google and Amazon have brought various smart home devices to market, most of them voice-controlled; examples are Google Home and Amazon Echo (Alexa). These devices are driven by the Internet of Things (IoT), in which the smart home device is interconnected with all kinds of other devices in your home. Today, smart home devices connect to your TV, music system, thermostat, and doorbell, and in the near future they will connect to your fridge, oven, and kettle (Luimstra, 2019). All of these will be controllable with simple questions and commands from the customer.

Obviously, the implementation of a smart home device can give the customer various advantages. First, the customer gains convenience in and outside the house: many household devices can be controlled more easily, often with short voice commands. Whether you want to know the best route to your destination when leaving the house, arrive in a home that is already heated, or find out which movie to watch after the series you are about to finish: it is all possible with a smart home device (Marr, 2019). Secondly, since many voice-assisted home devices are connected to your energy regulator, smart home devices are an ideal way to save on future energy expenses. Temperatures and lighting schedules can be preprogrammed according to your preferences and remain easy to adjust afterwards. Lastly, smart home devices tend to increase the safety of your house: smart doorbells can livestream whoever is at your door, giving the customer more information about people with possibly bad intentions (Luimstra, 2019).
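To make the energy-saving point a bit more concrete, here is a minimal sketch of what such a preprogrammed heating schedule could look like in Python. The `HEATING_SCHEDULE` values and the `target_temperature` helper are made up for illustration; they are not the configuration format or API of any real smart home product.

```python
from datetime import time

# Hypothetical heating schedule: (start time, target temperature in °C).
# The values are illustrative, not defaults of any real device.
HEATING_SCHEDULE = [
    (time(6, 30), 20.0),   # warm up before the household wakes
    (time(8, 30), 16.0),   # lower while everyone is out
    (time(17, 0), 21.0),   # comfortable for the evening
    (time(23, 0), 15.0),   # save energy overnight
]

def target_temperature(now: time) -> float:
    """Return the scheduled temperature for the given time of day."""
    # The last entry also covers the wrap-around past midnight.
    target = HEATING_SCHEDULE[-1][1]
    for start, temperature in HEATING_SCHEDULE:
        if now >= start:
            target = temperature
    return target

if __name__ == "__main__":
    print(target_temperature(time(7, 0)))   # 20.0
    print(target_temperature(time(12, 0)))  # 16.0
```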

However, the safety of smart home devices (and mainly the voice-controlled ones) has been criticised lately. All information that smart home devices need is saved in the cloud (Marr, 2019). While this is convenient, it also creates an easy point of abuse of your personal information. Voice-assistant devices are activated by a so-called ‘wake word’, like ‘Alexa’ for Amazon’s voice assistant (Karch, 2019). Once activated, everything that follows is recorded and saved in the cloud, and is therefore also accessible to hackers or other wrongdoers. Furthermore, the voices of smart home devices increasingly sound like a normal human voice (Weinberger, 2019). Since we attach emotional value to voices, this could become a problem when a voice-controlled smart home device talks to outsiders or family members. So, while the positive aspects of smart home devices are obviously present, the threats related to security and trust in these systems must not be neglected. Will voice-controlled smart home devices become almost 100% safe? Will we be able to distinguish the voice of a smart home device from the voice of one of our relatives? These questions are ripe for future discussion.
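The wake-word mechanism described above roughly follows this pattern: the device listens continuously, but only records and uploads audio once the wake word is heard. The sketch below illustrates that pattern in Python; `listen_for_phrase` and `upload_to_cloud` are hypothetical stand-ins, not functions of any real assistant’s SDK.

```python
WAKE_WORD = "alexa"

def listen_for_phrase() -> str:
    """Stand-in for the device's always-on microphone (stubbed with keyboard input)."""
    return input("say something (empty line to stop): ")

def upload_to_cloud(audio: str) -> None:
    """Stand-in for sending the recorded request to the vendor's cloud."""
    print(f"uploaded to cloud: {audio!r}")

def run_assistant() -> None:
    while True:
        phrase = listen_for_phrase()
        if not phrase:
            break  # end the sketch on empty input
        # Nothing leaves the device until the wake word is heard...
        if WAKE_WORD in phrase.lower():
            # ...but everything after that is recorded and stored remotely,
            # which is exactly the attack surface discussed above.
            upload_to_cloud(phrase)

if __name__ == "__main__":
    run_assistant()
```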

References:
Karch, M. (2019). Is Your Smart Device Spying on You? How Can You Stop It?. [online] Lifewire. Available at: https://www.lifewire.com/is-your-smart-device-spying-on-you-4141166 [Accessed 15 Oct. 2019].

Luimstra, J. (2019). De slimme IoT-huiskamer: groot goed, of potentiële spionage? [The smart IoT living room: a great asset, or potential espionage?]. [online] Sprout. Available at: https://www.sprout.nl/artikel/technologie/de-slimme-iot-huiskamer-groot-goed-potentiele-spionage [Accessed 15 Oct. 2019].

Marr, B. (2019). The 7 Most Dangerous Technology Trends In 2020 Everyone Should Know About. [online] Forbes.com. Available at: https://www.forbes.com/sites/bernardmarr/2019/09/23/the-7-most-dangerous-technology-trends-in-2020-everyone-should-know-about/#16a928687780 [Accessed 15 Oct. 2019].

Mordorintelligence. (2019). Smart Homes Market | Growth, Trends, and Forecast (2019 – 2024). [online] Available at: https://www.mordorintelligence.com/industry-reports/global-smart-homes-market-industry [Accessed 15 Oct. 2019].

Weinberger, D. (2019). Can We Trust Machines that Sound Too Much Like Us?. [online] Harvard Business Review. Available at: https://hbr.org/2019/05/can-we-trust-machines-that-sound-too-much-like-us [Accessed 15 Oct. 2019].


The danger of technology development – Deepfakes

11 September 2019


Nowadays, technology trends are everywhere. While new and innovative technologies can make life easier and more convenient, the dangerous sides of technology development may also harm people in extreme ways. One example is the rise of deepfake technology. This AI-based technology is used to manipulate reality, in the form of images and videos of real people saying or doing things they have never said or done. In recent years, many famous people like Obama, Mark Zuckerberg, and Kit Harington have appeared in deepfake videos. Advances in machine learning techniques help editors make deepfake content ever more realistic, and thereby harder to distinguish from reality. Several professors and the Dutch Public Prosecution Service are growing increasingly concerned about these developments, which may escalate to dangerous levels. But why is deepfake technology so dangerous for certain people, and maybe even for society as a whole?

First, the advantages of deepfake technology should not be neglected. For example, it can be used for educational purposes, presenting information to students in more innovative ways. Other benefits are more psychological: deepfake technology may help people with certain disabilities to experience pornographic or video-game-related content in a richer and more autonomous way. However, the enormous risks that deepfake technology brings often outweigh its possible benefits.

Above all, manipulated photos and videos of famous and/or influential people can be used to spread misinformation or damage their reputation. This has an impact not only in the business world (where CEOs may be depicted in a negative light) but also in politics and society (where influential politicians may appear to say or do things that undermine democracy). On top of the advances in AI and machine learning, the tools and the edited videos themselves are becoming easier to access and distribute. Most worrying of all, the creation of deepfake content cannot be prevented, since the technology and access to it keep evolving. There is no clear solution yet, but researchers propose building detection systems that can distinguish deepfake videos from real ones, as sketched below; this will be a tough task.
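As a rough illustration of that detection idea, the sketch below aggregates hypothetical per-frame “fakeness” scores into a single video-level verdict. The scores, the threshold, and the `is_probably_deepfake` helper are assumptions for illustration only; producing reliable per-frame scores in the first place is exactly the hard part researchers are still working on.

```python
from statistics import mean
from typing import Callable, Sequence

def is_probably_deepfake(
    frame_scores: Sequence[float],
    aggregate: Callable[[Sequence[float]], float] = mean,
    threshold: float = 0.5,
) -> bool:
    """Flag a video when the aggregated per-frame fakeness score crosses a threshold."""
    return aggregate(frame_scores) >= threshold

if __name__ == "__main__":
    # Hypothetical per-frame scores in [0, 1]; higher means more likely manipulated.
    suspicious_clip = [0.82, 0.74, 0.91, 0.68]
    ordinary_clip = [0.12, 0.08, 0.21, 0.15]
    print(is_probably_deepfake(suspicious_clip))  # True
    print(is_probably_deepfake(ordinary_clip))    # False
```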

Sources:
Chesney, R. and Citron, D.K. (2018). Deep Fakes: A Looming Challenge for Privacy, Democracy, and National Security. California Law Review, 107 (2019, forthcoming); U of Texas Law, Public Law Research Paper No. 692; U of Maryland Legal Studies Research Paper No. 2018-21. Available at SSRN: https://ssrn.com/abstract=3213954 or http://dx.doi.org/10.2139/ssrn.3213954

Eadicicco, L. (2019). There’s a terrifying trend on the internet that could be used to ruin your reputation, and no one knows how to stop it. [online] Business Insider Nederland. Available at: https://www.businessinsider.nl/dangerous-deepfake-technology-spreading-cannot-be-stopped-2019-7?international=true&r=US [Accessed 11 Sep. 2019].

Nu.nl. (2019). Openbaar Ministerie uit zorgen over mogelijke afpersing via deepfakes [Public Prosecution Service expresses concerns about possible extortion via deepfakes]. [online] Available at: https://www.nu.nl/tech/5989399/openbaar-ministerie-uit-zorgen-over-mogelijke-afpersing-via-deepfakes.html [Accessed 11 Sep. 2019].
