Accelerating growth in technology: The pacing problem and transhumanism as a potential solution

5 October 2022


We all know that technology can be extremely beneficial and help us with various things. Especially in the BIM programme, many students likely have a thorough understanding of the capabilities of new (information) technologies. However, little attention is often directed to the potential existential threats that can come from technological innovation. I will briefly discuss how accelerating growth in technology results in the pacing problem and how this affects humanity.

Technological innovation shows accelerating growth. As the graph below depicts, new technologies follow previous ones at a very fast pace, arguably at an exponential rate. While this generally results in great benefits for society, there are also serious risks associated with it. While technologies and their associated capabilities are becoming increasingly complex at a fast pace, our human understanding of these technologies and their unforeseen consequences does not increase at a similar rate, resulting in the pacing problem (Downes, 2009). In other words, there is a growing gap between actual technological capabilities and our understanding of these technologies and their associated (unforeseen) risks. Sagan (1994) worded this nicely: “many of the dangers we face indeed arise from science and technology, but, more fundamentally, because we have become powerful without becoming commensurately wise. The world-altering powers that technology has delivered into our hands now require a degree of consideration and foresight that has never before been asked of us.”

The pacing problem affects humanity in numerous ways, and it has already affected us in the past. Technological innovations in the area of nuclear energy created serious existential threats through the devastating powers that nuclear technology brings and our limited understanding of it. Even technological developments during the Industrial Revolution have resulted in dramatic consequences years later: only once our understanding of the previously unforeseen effects of increased greenhouse gas emissions grew did we realise how serious the threat actually is. What exactly are the (unforeseen) effects of rapidly evolving technologies such as Artificial General Intelligence, nanotechnology or bioengineering? Can we even understand these unforeseen consequences? Indeed, as Toby Ord pointed out, technological progress has been at the root of some recent major existential threats, and there is strong reason to believe that existential risk will increase this century as technological progress continues (Ord, 2020).

If we believe that the pacing problem is indeed a serious threat to humanity, three potential solutions can be identified. I am curious to hear which solution you like best. First, ‘renouncement’ is the idea that we can reduce the risks from new technologies by stepping away from them, for example by banning certain technologies. Second, ‘continuation’ means that we carry on with our current norms: we continue developing world-engineering technologies at a fast pace to shape our world, and if a problem arises from an unforeseen effect, we develop and implement new technologies that will hopefully solve it. And third, ‘transhumanism’ means that we allow not only world-engineering but also person-engineering. In other words, we do not only alter the world around us with new technologies, but also alter ourselves: humans merge with technology so that we enhance human capabilities and are better equipped against existential threats. What are your thoughts on the following statement? “Although it may be dangerous to pursue the idea of transhumanism, it may be more dangerous not to pursue it.” I am curious to hear your thoughts!

References

Ord, T., 2020. The Precipice. New York: Hachette Books.

Downes, L., 2009. The Laws of Disruption. New York: Basic Books.

Sagan, C., 1994. Cosmos. London: Warner Books.


1 thought on “Accelerating growth in technology: The pacing problem and transhumanism as a potential solution”

  1. Nice post. I am curious how you would determine which technologies should be renounced, since this is difficult to determine. A technology would have to have a proven negative effect so severe that it should never be developed until there are solutions to the problem. I feel like the second solution you propose aligns with the current way of thinking. Especially when looking at AI, there is very little regulation controlling its development. Transhumanism seems like the idea Elon Musk has mentioned, namely putting microchips in our brains as an extension of ourselves (the same as a phone; the only problem is the bandwidth and interface issue with microchips). My idea would be to enforce harsh regulations on the development of AI so as to make the development process more regulated. Since AI surpasses human thinking (not in all ways, but in most), it is essential to be able to control it. Good post!
