A large part of our lives depend on computing, and quantum computing has the potential to change the coding we depend on, as well as our fields of scientific study. This post explores this exciting and completely new form of computing and how it can solve a variety of currently unsolvable problems.

## I. Classical Computing: Where Are We Today?

First, we should take a quick look at "classical computing." Classical computing covers every computer we interact with today, from our laptops to our smartphones. The history of classical computing is a story of human ingenuity, as we have used whatever was at our fingertips to count and to speed up calculation.

And fingers are the perfect place to start. Our number system is base-10, which means we use ten digits (0-9) before we need a second digit to describe the next number (10). We take counting by tens for granted, but it doesn't have to be that way. (We likely count by tens because we have ten fingers; with four fingers on each hand, another base might have felt just as natural.) Classical computers, for example, work in base-2 because they can only recognize 0 and 1. Once they go beyond 0 and 1, they need a second digit: a classical computer writes the number two as "10."
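A short Python sketch makes the base-10 to base-2 conversion above concrete:

```python
# Print a few base-10 numbers alongside their base-2 representations,
# the form in which a classical computer stores them.
for n in [0, 1, 2, 5, 10]:
    print(f"{n} in base-10 is {bin(n)[2:]} in base-2")
# Note that 2 comes out as "10", exactly as described above.
```

Going the other direction, `int("10", 2)` converts the binary string `"10"` back to the number 2.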

Our hands were our first computers. Anyone who has helped a child learn to add or subtract has seen them resort to counting on their fingers, and numbers over ten usually require a helper to put out a few fingers of their own. How you count on your fingers can even give away where you grew up (in Korea, the chisanbop method lets your hands count all the way to 99), but everyone around the world looks down at their hands to do basic arithmetic.

Eventually, we found other materials with which to calculate. We can, of course, always use things other than fingers to represent numbers. The abacus dates back to roughly 2700 BC and let us manipulate objects to work out a result more quickly.

By the early 19th century, Charles Babbage was theorizing about how to build and use a mechanical device for calculation. By the middle of the century, he had joined forces with Ada Lovelace to envision the first general-purpose mechanical computer capable of manipulating symbolic logic, the Analytical Engine. Once we understood electromagnetism, the vacuum tubes and transistors of the 20th century brought Babbage and Lovelace's ideas to life and gave us the classical computers that are ubiquitous in our lives today.

The history of computing is the history of using everything around us to speed up and improve information processing.

## II. Quantum Mechanics

Quantum computing builds on this history of computing by taking advantage of the very peculiar properties of the microscopic universe (i.e. atoms and subatomic particles). Quantum mechanics is the study of the physics of this microscopic universe, and the way things work on this scale is unexpected and strange. Here is just a sample of how weird it is:

- **Uncertainty principle:** The better we know a particle's position, the less we know its momentum (and vice versa).
- **Superposition:** The uncertainty principle means we cannot determine the properties of a subatomic particle with certainty before observation; we can only determine the *probability* that a given property will be observed. Prior to observation, a particle's properties exist in a "superposition" of all possible states, and a particular state emerges only once the particle is measured.
- **Quantum entanglement:** Particles that interact with each other can become so "entangled" that they cannot be described separately from their counterparts. Even particles separated by vast distances remain entangled; Albert Einstein referred to this property as "spooky action at a distance."
- **Wave-particle duality:** Particles exhibit both wave-like and particle-like properties. Because particles are also waves, they can be subject to interference.

A colleague at Holland & Knight once told me that when she was studying quantum mechanics in college, the textbook made her "sick" to read because the subject was so weird. She is not alone. Here is how some of the 20th century's brightest minds talked about the field (ordered by when they won the Nobel Prize):

- “Anyone who hasn’t been shocked by quantum mechanics hasn’t understood it yet.” – Niels Bohr (Nobel Prize, 1922)
- "Not only is the universe stranger than we think, it is stranger than we can think." – Werner Heisenberg (Nobel Prize, 1932)
- “I don’t like it, and I’m sorry I had nothing to do with it.” – Erwin Schrödinger (Nobel Prize, 1933)
- “I think I can safely say that no one understands quantum mechanics.” – Richard Feynman (Nobel Prize, 1965)
- “Quantum mechanics makes absolutely no sense.” – Roger Penrose (Nobel Prize, 2020)

Quantum mechanics is full of paradoxes and contradictions, but as strange as these facts of the universe are, they are like anything else around us: something we can use for counting and computation.

## III. Quantum Computing and Quantum Algorithms

As early as 1980, Paul Benioff proposed a quantum model of computing. Now, more than forty years later, the field has begun to produce actual quantum computers that can solve problems classical computers cannot.

The way quantum computers work is strange because it reflects the strange properties of the microscopic universe. Scientists must first isolate and contain subatomic particles to serve as "quantum bits" (or "qubits"), the basic units of computation. The classical-computing analogue of the qubit is the bit, the 0 or 1 represented by a transistor, but qubits wouldn't be creatures of quantum mechanics unless they were considerably more exotic.

Prior to observation, qubits are in a state of superposition that effectively gives them the values 0 and 1 *at the same time*. Some experts compare a qubit's superposition to a flipped coin in the air: potentially heads or tails until it lands (the observation event). Qubit superposition scales beautifully for performing massive computations in parallel. Instead of waiting for a classical computer to try every possible combination of 0s and 1s over a long period of time, a quantum computer can, in effect, work through all of those possibilities simultaneously to compute an answer.
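The coin-flip analogy can be sketched with a toy model. This is only an illustration of probabilistic measurement, not a real quantum simulation, and the amplitude values are made up for the example (real qubits use complex amplitudes):

```python
import random

# Toy model of a single qubit with made-up real amplitudes for |0> and |1>.
alpha, beta = 0.6, 0.8  # squared amplitudes sum to 1: 0.36 + 0.64 == 1.0

def measure():
    """Observation collapses the superposition to a definite 0 or 1,
    with probabilities given by the squared amplitudes."""
    return 0 if random.random() < alpha ** 2 else 1

counts = {0: 0, 1: 0}
for _ in range(10_000):
    counts[measure()] += 1

# Before measurement the qubit is "both" 0 and 1; each measurement
# yields one definite value, roughly 36% zeros and 64% ones over many runs.
print(counts)
```

Each individual measurement gives only a single bit; the art of quantum algorithm design is arranging the superposition so that the answer you want is the one you are likely to observe.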

If you ever lose the combination to a bike lock, you can do what I tried in the early '90s: dial each four-digit combination until the lock opens. With a four-digit bicycle lock, there are 10,000 possible combinations, because the solution is a number between 0000 and 9999 (10^4 = 10,000). Ten thousand tries is a lot of effort to save a $5 bike lock, and that was honestly enough to make me give up. (The hope, of course, is that would-be thieves give up, too.) A quantum computer can effectively try every possible lock combination simultaneously to find the correct answer.
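The classical brute-force approach is easy to sketch. The secret combination below is a hypothetical value chosen for illustration:

```python
import itertools

SECRET = "4821"  # hypothetical lock combination, for illustration only

def crack(secret):
    """Classical brute force: dial each 4-digit combination in order,
    from 0000 to 9999, until the lock opens."""
    for tries, digits in enumerate(itertools.product("0123456789", repeat=4),
                                   start=1):
        guess = "".join(digits)
        if guess == secret:
            return guess, tries
    return None, 10 ** 4

combo, tries = crack(SECRET)
print(f"found {combo} after {tries} of 10,000 tries")  # 4822 tries here
```

In the worst case the loop runs all 10,000 times; a lock with ten digits instead of four would need 10^10 tries, which is where brute force stops being practical.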

Problems like the "bike lock problem," but with vastly larger sets of possible solutions, are where quantum computers will shine in the coming years. It is something of an open secret in computer science that there are problems computers cannot practically solve in a reasonable amount of time. When faced with these problems, developers usually build shortcuts, or "heuristics," that give a good answer, if not the best one.

However, there are no shortcuts for some important problems. One of these is "prime factorization," where a computer is asked to determine which prime numbers can be multiplied together to produce a very, very large number. The larger the number, the harder prime factorization becomes. This problem is the basis of modern encryption precisely because even the most powerful classical computers are not up to the task. If you can solve for prime factors, almost all encryption, including the encryption that enables cryptocurrencies, is in danger. Quantum computers are well suited to this kind of problem and can, in theory, solve it; in fact, it has already been solved for very small numbers.
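A minimal classical sketch shows why factoring is easy for toy numbers yet hopeless at cryptographic scale, where the numbers involved run to hundreds of digits:

```python
def prime_factors(n):
    """Naive trial division. Fine for small numbers, but the running
    time explodes as n grows, which is what modern encryption relies on."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)  # whatever remains is itself prime
    return factors

print(prime_factors(15))    # [3, 5] -- 15 is among the tiny numbers
                            # quantum computers have already factored
print(prime_factors(2021))  # [43, 47]
```

Multiplying the factors back together is instant; recovering them from the product is the hard direction, and that asymmetry is exactly what quantum factoring algorithms threaten.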

Problems that are impossible for classical computers are everywhere. Chemistry, biology, materials science, meteorology, artificial intelligence, economics, and other fields all face problems of exponential size that quantum computing could tackle.

This field is still in its infancy, but quantum computing will lead to exciting technological and scientific advances as it progresses to a more mature state. These developments will emerge quickly once quantum computing reaches an inflection point. Lawyers, especially those who advise clients on data privacy and IT security issues, should monitor quantum computing as it evolves. Future posts on IP/Decode will look to address the implications of quantum computing on these important legal issues.
