Will Quantum Computing finally take off?

13 October 2022


But first, what is quantum computing?

Quantum computing is a new computing technology that harnesses the laws of quantum mechanics to solve complex problems substantially faster than classical computers. The main difference between a quantum computer and a classical computer is that a quantum computer runs multidimensional quantum algorithms on qubits instead of running its operations on bits. While a bit can only hold either the value 0 or 1, a qubit can be in a superposition, which allows it to be in any proportion of both states at once. With 4 bits, for example, there are 2^4 = 16 combinations, of which you can use only one at a time. 4 qubits in superposition, however, can be in all of those combinations at once, and 20 qubits in superposition can already store over a million values simultaneously. At the moment, though, it is still under debate whether quantum computing is just a very specialized tool or a revolution for mankind.
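To make that scaling concrete, here is a minimal Python sketch (plain counting, no quantum hardware or SDK involved) that reproduces the numbers above: a classical register can use only one of its 2^n values at a time, while n qubits in superposition are described by all 2^n values at once.

```python
# Minimal sketch (plain Python, no quantum SDK): counting how many values
# a classical n-bit register vs. n qubits in superposition can address.

def classical_values(n_bits: int) -> int:
    """An n-bit register can take 2**n values, but only one at a time."""
    return 2 ** n_bits

def superposition_values(n_qubits: int) -> int:
    """n qubits in superposition are described by 2**n values at once."""
    return 2 ** n_qubits

for n in (4, 20):
    print(f"{n} bits   -> {classical_values(n):,} possible values, one usable at a time")
    print(f"{n} qubits -> {superposition_values(n):,} values held simultaneously")
```

Running it prints 16 for 4 (qu)bits and 1,048,576 for 20 qubits, matching the figures in the paragraph above.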

Quantum computing has many applications, and in recent years many companies have been trying to work it into their digital strategy. ExxonMobil, for example, has partnered with IBM and uses quantum computing to find optimal routes for its shipping fleet while accounting for many uncertainties such as weather and demand (ExxonMobil Strives to Solve Complex Energy Challenges, n.d.). Mercedes-Benz, meanwhile, is partnering with IBM and Google to explore quantum computing and its ability to simulate batteries accurately at a molecular level of detail, which involves enormous computational complexity. With these simulations, Mercedes-Benz can explore new materials to create more efficient batteries (Mercedes-Benz Group, n.d.).

Many companies in this field aim to build the quantum computer with the most superconducting qubits in order to reach quantum supremacy. Quantum supremacy is reached when a quantum device can solve a problem that no classical computer can solve in a feasible amount of time. Google claimed to have reached quantum supremacy back in 2019 by performing a series of operations in 200 seconds that would take a supercomputer about 10,000 years (Waters & Murgia, 2019). In 2020, a group at the University of Science and Technology of China claimed quantum supremacy by generating samples that took their quantum computer about 20 seconds, while an ordinary supercomputer would need an estimated 600 million years of computing (Zhong et al., 2020).

References

ExxonMobil strives to solve complex energy challenges. (n.d.). IBM. Retrieved October 13, 2022, from https://www.ibm.com/case-studies/exxonmobil/

Mercedes-Benz Group. (n.d.). The Art of Quantum Simulation. Retrieved October 13, 2022, from https://group.mercedes-benz.com/innovation/partnerships/collaboration/quantum-computing-google.html

Waters, R., & Murgia, M. (2019, September 20). Google claims to have reached quantum supremacy. Financial Times. Retrieved October 13, 2022, from https://www.ft.com/content/b9bb4e54-dbc1-11e9-8f9b-77216ebe1f17

Zhong, H., Wang, H., Deng, Y., Peng, L., Luo, Y., Qin, J., & Wu, D. (2020, December). Quantum computational advantage using photons. Science, 370(6523), 1460–1463. https://doi.org/10.1126/science.abe8770


Q-Day and the Fall of the Internet

7 October 2022


To those who have never heard the term Q-Day, it may sound mysterious, as if it were a major event from a sci-fi novel that changed the fate of all humanity. That description is not far off the truth, as the technology hiding behind the “Q” is quantum computing, a concept which for decades was confined to such novels. So, what is “Q-Day” then? It is the day on which quantum computers become stable enough to operate for a prolonged period of time. But don’t we have operational quantum computers right now? Much like the physics behind the concept, the answer is not straightforward. To understand it, we first have to understand the difference between quantum and semiconductor-based computers (duh, physics).

Regular computers operate on bits – electrical signals which can take the value 0 or 1. They are processed by the CPU, a device consisting of billions of transistors etched onto a silicon chip – for example, the CPU in the iPhone 14 Pro has around 16 billion transistors (Ganti, 2022). Those transistors are organized into logic gates, which execute operations according to predefined programs (Gayde, 2019). Quantum computers operate using qubits, which can also take the value 0 or 1. However, contrary to regular bits, they can be in a state of superposition between 0 and 1 (Nielsen & Chuang, 2010). They can be treated as being 0 and 1 at the same time (a bit of an oversimplification, but a detailed explanation is outside the scope of this article). This means that with every added qubit the amount of information in the system doubles: describing n qubits requires 2^n values, so 2 qubits already span 4 combinations, while 1,000 qubits span a number of combinations with roughly 300 digits. Their theoretical power therefore vastly outperforms that of standard computers.

So, you may ask, what is the problem with quantum computers, and why has Q-Day not arrived yet? The main issue is maintaining the state of superposition. It requires the qubits to be fully isolated from their surroundings – they have to be kept at temperatures close to absolute zero (Jones, 2013) and shielded from any outside interactions, since things as minuscule as cosmic radiation can break the quantum state of superposition (Vepsäläinen et al., 2020). To illustrate how big a hurdle this is: on 30 September 2022, researchers from the University of New South Wales announced a breakthrough – they had managed to maintain a quantum state of superposition for a staggering 2 milliseconds, 100 times longer than the previous record (For the Longest Time: Quantum Computing Engineers Set New Standard in Silicon Chip Performance, 2022).

Despite being operational for such fleeting periods of time, quantum computers have already shown immense power. In 2019 a team of scientists from Google and NASA achieved so-called “quantum supremacy”: the quantum computer they developed carried out calculations for which the most powerful traditional supercomputer, Summit, would need an estimated 3 million years (Liu et al., 2021). There is no official definition of Q-Day, but try to imagine that the very same computer could operate for 2 minutes. Then surely a point of no return will be reached.
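A rough way to feel this exponential growth is to ask how much classical memory would be needed just to write down the state of n qubits. The back-of-the-envelope Python sketch below assumes 16 bytes per complex amplitude (a common double-precision representation); the exact constant matters far less than the doubling with every added qubit.

```python
# Back-of-the-envelope sketch: classical memory needed to store the full state
# vector of n qubits (2**n complex amplitudes, assumed 16 bytes each).

BYTES_PER_AMPLITUDE = 16  # one double-precision complex number

def state_vector_gib(n_qubits: int) -> float:
    """Gibibytes required to hold all 2**n_qubits amplitudes."""
    return (2 ** n_qubits) * BYTES_PER_AMPLITUDE / 2 ** 30

for n in (10, 30, 50, 100):
    print(f"{n:>3} qubits -> about {state_vector_gib(n):.3g} GiB just to hold the state")
```

Somewhere around 50 qubits the state no longer fits in any real machine’s memory, which is roughly the regime in which the supremacy experiments mentioned above operate.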

But how will Q-Day contribute to the fall of the Internet? It all boils down to cryptography and how digital information is secured. Nowadays, the vast majority of online data is protected via TLS/SSL protocols, whose most common key-exchange schemes ultimately rest on the multiplication of prime numbers. To give an example, 2048-bit RSA encryption means that the server publishes, visible to everyone, a 2048-bit number (roughly 617 decimal digits) which is the product of two secret primes. Anyone can encrypt messages with that public number, but only whoever knows the two primes can decrypt them. Trying to find the two prime factors of such a number by brute force is virtually impossible – according to some estimates, it would take a standard computer around 300 trillion years to break this encryption. In that case, how is it even possible that you can log in to your bank account without waiting for the heat death of the universe? The server holds the two primes as its private key, so for the legitimate parties decryption boils down to ordinary modular arithmetic, which can be done in milliseconds.

How does this compare to quantum computers? A quantum computer with roughly 4,099 stable, error-corrected qubits could break 2048-bit encryption in about 10 seconds. As a sign of how fast the hardware is moving, Quantinuum has already announced a quantum volume of 4,096, although that is a different metric from the raw qubit count (Rolston-Duce, 2022). It means that someone with a quantum computer able to maintain quantum superposition for long enough could gain access to almost anything on the internet – bank accounts or government secrets; nothing would withstand the unbelievable power of a stable quantum computer.

Does that mean the world will have to go back to the pre-digital era, since nothing can be safely encrypted ever again? Fortunately, the major players in the encryption business have recognized the problem. In 2016 the US National Institute of Standards and Technology (NIST) asked scientists to submit proposals for encryption algorithms ready for a post-quantum future. The results of the contest were announced this year, with the winner in the public-key encryption category being the CRYSTALS-Kyber method (Bos et al., 2018; NIST, 2022). Unfortunately, despite my best efforts I cannot do its inner workings justice here – it relies on the hardness of certain lattice problems, which are believed to resist quantum attacks – and it makes sense that a complex problem requires a complex solution.

Even though solutions to the problem exist today, companies are reluctant to implement them. When it comes to post-quantum (PQ) encryption, they face a dynamic similar to climate change: implementation is costly, offers no immediate benefits, and the pay-off lies in the future. There is also little awareness of the problem, so companies face little pressure from consumers to improve the security of their encryption. Thus, the question remains: will the internet as we know it succumb to the unimaginable power of future quantum computers? Or will we be able to prepare ourselves for the inevitable emergence of the quantum monster?
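To make the asymmetry tangible, here is a toy RSA sketch in Python with deliberately tiny primes (purely illustrative: real deployments use 2048-bit moduli and vetted cryptographic libraries, never hand-rolled code like this). It shows that whoever knows the two primes can decrypt with a couple of modular operations, while an attacker is left with the factoring problem that Shor’s algorithm on a large, stable quantum computer would, in principle, trivialize.

```python
# Toy RSA sketch with tiny primes -- illustrative only, never use in practice.
p, q = 61, 53                      # the two secret primes
n = p * q                          # public modulus (3233)
e = 17                             # public exponent
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)                # private exponent: easy *only if* p and q are known

message = 42
ciphertext = pow(message, e, n)    # anyone can encrypt with the public pair (n, e)
decrypted = pow(ciphertext, d, n)  # only the key holder can decrypt quickly
assert decrypted == message

# An attacker without p and q must factor n. Trial division is instant for 3233,
# but scales hopelessly for a 2048-bit modulus -- the gap a stable quantum
# computer running Shor's algorithm would erase.
def factor_by_trial_division(n):
    f = 2
    while n % f:
        f += 1
    return f, n // f

print(factor_by_trial_division(3233))  # (53, 61)
```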

References:

Bos, J., Ducas, L., Kiltz, E., Lepoint, T., Lyubashevsky, V., Schanck, J. M., Schwabe, P., Seiler, G., & Stehle, D. (2018). CRYSTALS – Kyber: A CCA-Secure Module-Lattice-Based KEM. Proceedings – 3rd IEEE European Symposium on Security and Privacy, EURO S and P 2018, 353–367. https://doi.org/10.1109/EUROSP.2018.00032

For the longest time: Quantum computing engineers set new standard in silicon chip performance. (2022). https://archive.ph/HikMD

Ganti, A. (2022). Apple A16 Bionic announced for the iPhone 14 Pro and iPhone 14 Pro Max – NotebookCheck.net News. https://www.notebookcheck.net/Apple-A16-Bionic-announced-for-the-iPhone-14-Pro-and-iPhone-14-Pro-Max.647967.0.html

Gayde, W. (2019). How CPUs are Designed and Built, Part 2: CPU Design Process | TechSpot. https://www.techspot.com/article/1830-how-cpus-are-designed-and-built-part-2/

Jones, N. (2013). Computing: The quantum company. Nature, 498(7454), 286–288. https://doi.org/10.1038/498286A

Liu, Y. A., Liu, X. L., Li, F. N., Fu, H., Yang, Y., Song, J., Zhao, P., Wang, Z., Peng, D., Chen, H., Guo, C., Huang, H., Wu, W., & Chen, D. (2021). Closing the “quantum supremacy” gap: Achieving real-time simulation of a random quantum circuit using a new Sunway supercomputer. International Conference for High Performance Computing, Networking, Storage and Analysis, SC. https://doi.org/10.1145/3458817.3487399

Nielsen, M. A., & Chuang, I. L. (2010). Quantum Computation and Quantum Information. www.cambridge.org

NIST. (2022). Post-Quantum Cryptography | CSRC. https://csrc.nist.gov/Projects/post-quantum-cryptography/selected-algorithms-2022

Rolston-Duce, K. (2022). Quantinuum Announces Quantum Volume 4096 Achievement. https://www.quantinuum.com/pressrelease/quantinuum-announces-quantum-volume-4096-achievement

Vepsäläinen, A. P., Karamlou, A. H., Orrell, J. L., Dogra, A. S., Loer, B., Vasconcelos, F., Kim, D. K., Melville, A. J., Niedzielski, B. M., Yoder, J. L., Gustavsson, S., Formaggio, J. A., VanDevender, B. A., & Oliver, W. D. (2020). Impact of ionizing radiation on superconducting qubit coherence. Nature, 584(7822), 551–556. https://doi.org/10.1038/s41586-020-2619-8


Quantum Computing: The Computer of the Future?

20 September 2021


Hello fellow students! To start off, I would recommend watching the video below, where quantum computers are introduced and explained.

Quantum computing harnesses the phenomena of quantum mechanics to deliver a huge leap forward in computation for certain problems (IBM). Up until now, organizations have relied on supercomputers to solve complex problems; however, some problems are so complex that even supercomputers cannot solve them. This is where quantum computers come in.

Regular computers use bits, and bits can be in one of two states – 0 or 1. Quantum computers use qubits, or quantum bits. These can be in a so-called superposition, which means that they can be both 0 and 1 at the same time.

So what is so special about quantum computers? Quantum computers can create vast multidimensional spaces in which to represent very large problems (IBM). Algorithms that employ quantum wave interference are then used to find solutions in this space and translate them back into forms we can use and understand (IBM). This is especially beneficial for combinatorial optimization problems, such as calculating the most cost-effective routes for a logistics company or the risk an investment company's portfolio faces. These are problems where the calculations are complex, but quantum computers could also tackle other problems: pharmaceutical companies could simulate molecules to understand drug interactions, and hackers could break widely used encryption methods to access vulnerable data. In a nutshell, quantum computers can solve problems we never thought would be possible to solve.
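As a tiny illustration of why such routing problems get hard, here is a brute-force Python sketch (the stop names and coordinates are invented for the example) that simply tries every possible visiting order. With 4 stops that is only 24 routes, but the count grows factorially, which is exactly the explosion that quantum and other specialized optimizers are hoped to help with.

```python
# Illustrative brute-force route optimization: with n stops there are (n-1)!
# candidate round trips, which quickly becomes intractable.
from itertools import permutations
from math import dist, factorial

stops = {"depot": (0, 0), "A": (2, 3), "B": (5, 1), "C": (6, 4), "D": (1, 5)}

def route_length(order):
    """Total length of the round trip depot -> order of stops -> depot."""
    path = ["depot", *order, "depot"]
    return sum(dist(stops[a], stops[b]) for a, b in zip(path, path[1:]))

best = min(permutations([s for s in stops if s != "depot"]), key=route_length)
print("best route:", best, f"length {route_length(best):.2f}")
print("routes checked:", factorial(len(stops) - 1))  # 24 here, ~10^18 for 21 stops
```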

And speaking of impossible problems, the one I am personally looking forward to seeing solved is weather forecasting. Simulating nature is one of the hardest problems for humanity, and quantum computers could potentially crack it. Imagine checking the weather app and being sure that the forecast is accurate for the next week or even month.

Have you heard about quantum computing before? Are there any problems you are excited to see solved?

References

IBM. What is quantum computing? https://www.ibm.com/quantum-computing/what-is-quantum-computing/

Quantum Computing Is Coming. What Can It Do? (hbr.org)

Wikipedia. Quantum computing. https://en.wikipedia.org/wiki/Quantum_computing


On the edge of something new

7 October 2020

We are entering an era where two new computing technologies are becoming more and more crucial. These two types of computing, quantum and edge, will have a major impact on computing power and will increase processing capabilities enormously.

I already briefly mentioned quantum computing in my other article about the DARQ technologies (see here), and in this article I want to dive deeper into what quantum computing is, its benefits, and how it differs from edge computing, since the two are sometimes seen as similar, which they aren't.


The most important points about quantum computing
While explanations of quantum computing and how it works can fill books, I will try to keep it short and point out the most important aspects. Basically, quantum computers are able to solve problems that 'traditional' computers cannot, mainly because traditional computers can only process information represented as 1s and 0s. The ability of quantum computers to solve more difficult problems derives from the fact that their qubits can exist in both states at once, so that a pair of qubits can hold all four combinations at the same time: 00, 01, 10, and 11. That way, a quantum computer can perform computations in parallel, crucially increasing its computing power and therefore its efficiency in comparison to 'traditional' computers.
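Here is a minimal sketch of that idea, using plain NumPy rather than any quantum SDK: applying a Hadamard gate to each of two qubits puts them into an equal superposition, so all four combinations 00, 01, 10, and 11 carry amplitude at once.

```python
# Minimal NumPy sketch: two qubits in equal superposition span all four
# combinations 00, 01, 10, 11 simultaneously.
import numpy as np

ket0 = np.array([1.0, 0.0])                      # the |0> state
hadamard = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

plus = hadamard @ ket0                           # (|0> + |1>) / sqrt(2)
two_qubits = np.kron(plus, plus)                 # tensor product of two such qubits

for label, amplitude in zip(["00", "01", "10", "11"], two_qubits):
    print(label, round(float(amplitude), 3), "-> probability", round(float(amplitude) ** 2, 3))
```

Each of the four outcomes ends up with probability 0.25, which is what "holding all four values at once" amounts to before a measurement collapses the state.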

However, to perform such computations, quantum computers require special quantum algorithms, and only a handful of them are known so far. Researchers have been working on this for decades and have yet to find a way to make quantum computing usable at large scale.


The most important points about edge computing
With the constant development and improvement of technologies like XR, autonomous vehicles, and IoT, the demand for instant calculations and minimal latency in data exchange is increasing. Most applications do not 'have time' to wait for their requests to travel across networks, reach a computing core, be processed, and then be sent back. The computation needs to be performed either closer to the device or, ideally, within it, in order to reduce latency.

To meet this need, edge computing is on the rise. The idea of edge computing is to perform computations either near or right at the source of the data, reducing the latency that cloud computing cannot avoid by running fewer processes in the cloud. However, edge computing is not there to replace cloud computing but rather to work alongside it. A clear division between computations that need immediate feedback and processes that can tolerate a certain latency will drastically increase the speed and efficiency of processing*.
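As a purely illustrative sketch of that division of labour (the names and the 20 ms threshold are invented for the example, not taken from any real framework): latency-critical requests are handled on the device at the edge, while everything that can tolerate a network round trip is sent to the cloud.

```python
# Hypothetical dispatcher illustrating the edge/cloud split described above.
# All names (Task, dispatch, the latency budget) are made up for illustration.
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    max_latency_ms: float   # how long the caller can wait for a result

EDGE_LATENCY_BUDGET_MS = 20.0   # assumed round-trip cost of going to the cloud

def dispatch(task: Task) -> str:
    if task.max_latency_ms < EDGE_LATENCY_BUDGET_MS:
        return f"{task.name}: processed on the edge device"
    return f"{task.name}: sent to the cloud"

for t in (Task("brake-decision", 5), Task("nightly-analytics", 60_000)):
    print(dispatch(t))
```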


Why are the technologies crucial?
Both technologies have a direct impact on several other technological advances, like the DARQ technologies mentioned in my other article, increasing speed, efficiency and security, but also on technologies used in the healthcare and automotive industries, for example.

The necessity and potential of both computing technologies can be seen in the increased research efforts by big companies like Google, Amazon and Verizon. In 2019, Google set a new benchmark for computational speed with a new kind of processor, and Verizon and Amazon introduced a 5G edge cloud computing partnership to launch IoT devices and applications at the edge.


With the constant increase in the amount of data collected and in the requests being computed by processors, the need for technological advances is clearly there. Both technologies create ample opportunities within their industries to succeed and to drive innovation and change. However, as usual, the big tech companies are at the forefront of exploring and developing them.


What’s your pick?
Will smaller companies be able to shape and use these technologies soon, or will they need to wait until the bigger companies make them available at large scale?


_____________________________________________

*Please note: when we talk about 'immediate' feedback to computational requests, the difference between edge computing and cloud computing is a matter of microseconds. However, this difference could become crucial in several situations, for example in the avoidance of traffic accidents by autonomous vehicles, which is why it is mentioned at this point.


Sources
https://futuretodayinstitute.com/trend/quantum-and-edge/
https://www.keyinfo.com/ai-quantum-computing-and-other-trends/
https://www.upgrad.com/blog/trending-technologies-in-2020/
