Wearable, yet invisible, technology: smart clothes

3 October 2022


One can claim that wearable technologies originate from the 16th century, with the introduction of pocket watches. However, in my opinion, one of the first real wearable technologies was the small wearable computer that mathematics professor Edward Thorp invented in the 1960s. A number of devices helped popularize and advance wearable technology over the following two decades. The first well-known one was the calculator wristwatch, introduced by Casio. This watch can be seen in the picture below (Reliance Digital, n.d.).

Nowadays, wearable technologies are becoming more and more widely adopted: smartwatches, fitness trackers, headphones, smart glasses, etc. Moreover, recent developments by researchers at Imperial College London could be a game changer. They designed a conductive thread suitable for embedding sensors into pieces of clothing. The new thread, called PECOTEX, is revolutionary because it is affordable, compatible with existing sewing machines, and fully waterproof and therefore machine-washable (Evans, 2022).

When thinking about the possibilities this technology brings, you are only limited by your imagination. The first applications I could think of were in terms of tracking: during fitness, when sleeping, but also for medical monitoring. That way, patients no longer have to wear a visible machine on their body; they can just put on a t-shirt. Lastly, if GPS functionality can be embedded, parents could use these clothes to protect their kids in an emergency (Evans, 2022).
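To make the tracking idea slightly more concrete, below is a minimal, purely hypothetical Python sketch of how readings from a heart-rate sensor sewn into a shirt might be monitored. The sensor interface (`read_heart_rate`) and the alert thresholds are my own assumptions for illustration; they are not part of the PECOTEX research.

```python
import random
import time

# Hypothetical sketch: monitoring a heart-rate sensor stitched into a shirt.
# The sensor interface and thresholds below are illustrative assumptions only.

LOW_BPM = 40    # below this, raise an alert
HIGH_BPM = 150  # above this (at rest), raise an alert

def read_heart_rate() -> int:
    """Stand-in for reading one sample from the embedded sensor."""
    return random.randint(35, 160)  # simulated value for this sketch

def monitor(samples: int = 10, interval_s: float = 1.0) -> None:
    """Poll the sensor and print an alert when a reading looks abnormal."""
    for _ in range(samples):
        bpm = read_heart_rate()
        if bpm < LOW_BPM or bpm > HIGH_BPM:
            print(f"ALERT: abnormal heart rate of {bpm} bpm detected")
        else:
            print(f"Heart rate OK: {bpm} bpm")
        time.sleep(interval_s)

if __name__ == "__main__":
    monitor()
```

In a real garment, the simulated readings would of course be replaced by data coming from the sensors embedded in the conductive thread.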

Other developers in the field have presented similar research (Mulko, 2021), showing the potential of ‘smart clothes’:

  • Smart clothes could cool you down when it is hot and keep you warm when it is cold.
  • Smart clothes could harvest energy: a human body radiates heat, which could be captured and might in the future even charge your phone on the go (a rough estimate follows after this list).
  • Smart clothes could clean, or at least disinfect, themselves. This becomes possible once the smart fibers are capable of emitting UV-C radiation.
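To put the energy-harvesting idea in perspective, here is a rough back-of-envelope estimate in Python. The numbers (roughly 100 W of resting body heat, a Carnot-limited conversion across a skin-to-room temperature difference, and an assumed practical efficiency of one tenth of that limit) are my own assumptions, not figures from the cited articles.

```python
# Rough, back-of-envelope estimate (my own assumptions, not from the cited
# articles) of how much power body heat could realistically deliver.

body_heat_w = 100.0    # a resting human body dissipates roughly 100 W of heat
t_skin_k = 306.0       # skin temperature, about 33 degrees Celsius
t_ambient_k = 293.0    # room temperature, about 20 degrees Celsius

# The Carnot limit caps the fraction of the heat flow convertible to electricity.
carnot_efficiency = 1.0 - t_ambient_k / t_skin_k      # roughly 4 %
practical_efficiency = 0.1 * carnot_efficiency        # assumed: real devices reach only a fraction

max_power_w = body_heat_w * carnot_efficiency
realistic_power_w = body_heat_w * practical_efficiency

phone_charge_w = 5.0   # a slow phone charger draws about 5 W

print(f"Theoretical ceiling: {max_power_w:.1f} W")
print(f"More realistic harvest: {realistic_power_w:.2f} W")
print(f"Fraction of a 5 W charger: {realistic_power_w / phone_charge_w:.0%}")
```

Under these assumptions, body heat yields only a fraction of a watt, so trickle-charging a phone on the go seems more plausible than fully charging it.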

References:

Evans, S. (2022, September 28). Researchers Design Conductive Thread to Embed Sensors Into Clothing. IoT World Today. https://www.iotworldtoday.com/2022/09/28/researchers-design-conductive-thread-to-embed-sensors-into-clothing/

Mulko, M. (2021, December 16). What is Smart Clothing Technology and How Does it Work? Interesting Engineering. https://interestingengineering.com/innovation/what-is-smart-clothing-technology-and-how-does-it-work

Reliance Digital. (n.d.). Wearable Technology – Then & Now | Resource Centre by Reliance Digital. Retrieved 3 October 2022, from https://www.reliancedigital.in/solutionbox/the-amazing-evolution-of-wearable-technology/


Who is responsible for decisions made by algorithms?

9 September 2022


The number of processes being taken over by Artificial Intelligence (AI) is rapidly increasing. Moreover, the results of these processes no longer merely assist humans in their decision-making: more often than not, the results yielded by the algorithm contain the decision itself. In these cases, if a human being is involved in the process, he or she usually simply has to adhere to the results of the algorithm (Bader & Kaiser, 2019).

This brings up the accountability question: in case the algorithm misperforms, who is accountable for the consequences? The most logical options are either the designers/creators of the algorithm or its users. However, neither option has an obvious preference over the other, since both raise a lot of potential difficulties.

Firstly, placing accountability with the designers or creators of the algorithm raises concerns. One of the first scientists to be concerned about accountability in the use of computerized systems was Helen Nissenbaum. In 1996, much ahead of her time, she wrote a paper in which she described four barriers that obscure accountability in a computerized society. These four barriers are rather self-explanatory: many hands, bugs, the computer as scapegoat, and ownership without liability (Nissenbaum, 1996). To this day, these four barriers illustrate very well the difficulty of designating accountability when a process is aided by (or even fulfilled by) an algorithm (Cooper et al., 2022).

Secondly, placing responsibility on the user is difficult, as, in a significant proportion of cases, the user has little to no influence on the content of the algorithm. Also, as stated before, users are sometimes obliged to adhere to the outcome presented to them by the algorithm (Bader & Kaiser, 2019).

Currently, most case studies show that the creators of algorithms sign off their accountability to the users during the acquisition of the product containing the algorithm. For example, when buying a Tesla with ‘Full Self-Driving Capability’, Tesla simply states that these capabilities are solely included to assist the driver and that, therefore, the driver is responsible at all times (Tesla, 2022; Ferrara, 2016).

In my opinion, it would be wise to explore what can be done to sign off accountability to the users of the algorithm not only legally (as Tesla does), but also morally, perhaps already during the design phase of the algorithm. A proposed research question could be stated as follows:

“What can be done in the design of an artificially intelligent algorithmic system to maintain accountability on the user side?”
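To illustrate one direction such a design-phase measure could take, here is a small, purely hypothetical Python sketch in which the system only issues recommendations, and a named user must explicitly confirm or reject each one, leaving an auditable record. All class, field, and function names are my own assumptions, not an existing framework.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Purely illustrative sketch of one possible design-phase measure:
# the system never acts on its own output; a named user must explicitly
# confirm (or reject) each recommendation, and that choice is logged.

@dataclass
class Recommendation:
    case_id: str
    suggestion: str
    model_version: str

@dataclass
class Decision:
    recommendation: Recommendation
    accepted: bool
    decided_by: str     # the accountable human user
    decided_at: str

audit_log: list[Decision] = []

def confirm(rec: Recommendation, user: str, accept: bool) -> Decision:
    """Record an explicit, attributable human decision about a recommendation."""
    decision = Decision(
        recommendation=rec,
        accepted=accept,
        decided_by=user,
        decided_at=datetime.now(timezone.utc).isoformat(),
    )
    audit_log.append(decision)
    return decision

# Example: the algorithm only suggests; the user decides and is on record.
rec = Recommendation(case_id="A-42", suggestion="approve loan", model_version="1.3")
confirm(rec, user="j.doe", accept=False)
print(audit_log)
```

The design choice here is that the algorithm's output is never acted upon automatically: accountability stays on the user side because every decision is explicitly and traceably made by a person.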

References

  1. Bader, V., & Kaiser, S. (2019). Algorithmic decision-making? The user interface and its role for human involvement in decisions supported by artificial intelligence. Organization, 26(5), 655–672. https://doi.org/10.1177/1350508419855714
  2. Cooper, A. F., Laufer, B., Moss, E., & Nissenbaum, H. (2022). Accountability in an Algorithmic Society: Relationality, Responsibility, and Robustness in Machine Learning. arXiv:2202.05338 [cs]. http://arxiv.org/abs/2202.05338
  3. Ferrara, D. (2016). Self-Driving Cars: Whose Fault Is It? Georgetown Law Technology Review, 1, 182.
  4. Nissenbaum, H. (1996). Accountability in a computerized society. Science and Engineering Ethics, 2(1), 25–42. https://doi.org/10.1007/BF02639315
  5. Tesla. (2022). Autopilot and Full Self-Driving Capability. https://www.tesla.com/support/autopilot
