September 20, 2022

Exploring Quantum Computing

Holland & Knight IP/Decode Blog
Jacob W. S. Schneider

So much of our lives relies upon computing, and quantum computing has the potential to upend the encryption we depend on, as well as to transform entire fields of scientific study. This post explores this exciting, entirely new form of computing and how it stands to solve a variety of problems that are unsolvable today.

I. Classical Computing: Where We Are Today

First, we should take a quick look back at "classical computing." Classical computing covers every computer we interact with today, from our laptops to our smartphones. The history of classical computing is a story of human ingenuity, where we use anything at our fingertips to count and speed calculations.

And fingers are the perfect place to start. Our numerical system is Base-10, which means we use ten numerals (0-9) before we need a second digit to describe the next number (10). We take it for granted that we count by tens, but it did not have to be this way. Classical computers, for example, operate using Base-2 because they can only recognize and use 0 and 1. Once they move past the numbers 0 and 1, classical computers need another digit. (Classical computers notate 2 with "10," and you can play with these conversions here.) We count by tens simply because we have ten fingers (even though four fingers on each hand might have worked just as well).
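
To make the conversion concrete, here is a minimal Python sketch (the language choice is ours, purely for illustration) that counts in both bases:

    # Counting in Base-10 (our fingers) versus Base-2 (a computer's 0s and 1s).
    for n in range(5):
        print(n, "in binary is", format(n, "b"))
    # Prints: 0 -> 0, 1 -> 1, 2 -> 10, 3 -> 11, 4 -> 100

    # Reading it back: interpret the string "10" as a Base-2 numeral.
    print(int("10", 2))  # 2 -- the notation described above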

Our hands are our first computers. Anyone who has helped a child learn to add or subtract has seen them resort to their fingers, and sums past ten usually require you to put up a few fingers to help out. How you count with your fingers can even give away where you grew up (in Korea, the Chisanbop method gets your two hands up to 99), but everyone around the world is looking at their hands to perform some basic math.

Eventually, we found other materials with which to calculate, using objects other than fingers to represent numbers. The abacus dates back to 2700 BCE and allowed us to manipulate beads to compute results more rapidly (you can learn how to use one here).

By the early 1800s, Charles Babbage was theorizing how to build and use a mechanical device to perform calculations. By the mid-1800s, he had joined forces with Ada Lovelace to conceive of the Analytical Engine, the first design for a mechanical, general-purpose computer that manipulated symbolic logic. Once we understood electromagnetism, the vacuum tubes and transistors of the 20th century brought Babbage's and Lovelace's ideas to life and gave us the classical computers that are omnipresent in our lives today.

The history of computing is a history of using whatever surrounds us to speed and improve processing of information.

II. Quantum Mechanics

Quantum computing builds on that history of computing by leveraging the very, very weird properties of the microscopic universe (i.e., atoms and subatomic particles). Quantum Mechanics is the study of the physics of that microscopic universe, and the way things work at that scale is counterintuitive and strange. Below is just a sample of how unusual it is:

  • Uncertainty Principle: The more precisely we measure a particle's location, the less precisely we can know its momentum (and vice versa).
  • Superposition: The Uncertainty Principle means that we cannot determine the properties of a subatomic particle with certainty before observation, but we can determine the probability that each property will be observed. Before observation, a particle's properties are considered to be in a "superposition" of all possible states: We can calculate the odds of each outcome in advance, but which state the particle actually takes is revealed only when it is observed. (The short sketch after this list makes the idea concrete.)
  • Quantum Entanglement: Particles that interact with one another can become "entangled," such that neither can be fully described independently of the other. Even particles very far from one another remain entangled, and Albert Einstein referred to this property as "spooky action at a distance."
  • Wave-Particle Duality: Particles exhibit both wave-like and particle-like properties at the same time. Because particles are waves, they can be affected by interference.
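
For readers who want to see the math of superposition, here is a minimal Python sketch (a classical illustration of the probability rule, not a real quantum system): each possible outcome carries an "amplitude," and the probability of observing an outcome is the squared magnitude of its amplitude.

    import random

    # A 50/50 superposition: each outcome has amplitude 1/sqrt(2).
    amplitudes = {"heads": 2 ** -0.5, "tails": 2 ** -0.5}

    # Born rule: probability = |amplitude| squared.
    probabilities = {k: abs(a) ** 2 for k, a in amplitudes.items()}
    print(probabilities)  # {'heads': 0.5, 'tails': 0.5} (within rounding)

    # Observation "collapses" the superposition to a single outcome.
    outcome = random.choices(list(probabilities),
                             weights=list(probabilities.values()))[0]
    print("observed:", outcome)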

A colleague at Holland & Knight once told me that when she was studying quantum mechanics in college, the textbook made her "nauseous" to read because the subject was so bizarre. She was not alone. This is how some of the brightest minds of the 20th century talked about the field (organized by when they won their Nobel Prize):

  • "Anyone not shocked by quantum mechanics has not yet understood it." – Niels Bohr (Nobel Prize, 1922)
  • "Not only is the Universe stranger than we think, it is stranger than we can think." – Werner Heisenberg (Nobel Prize, 1932)
  • "I do not like it, and I am sorry I ever had anything to do with it." – Erwin Schrödinger (Nobel Prize, 1933)
  • "I think I can safely say that nobody understands quantum mechanics." – Richard Feynman (Nobel Prize, 1965)
  • "Quantum mechanics makes absolutely no sense." – Roger Penrose (Nobel Prize, 2020)

Quantum Mechanics is filled with paradoxes and contradictions, but strange as they are, these realities of the universe are like anything else at our fingertips: something we can use to count and calculate.

III. Quantum Computing and Quantum Algorithms

As early as 1980, Paul Benioff proposed a quantum-mechanical model of computing. Now, more than forty years later, the field is beginning to yield actual quantum computers capable of solving problems that classical computers cannot.

The way quantum computers work is strange because it reflects the strange properties of the microscopic universe. Scientists must first isolate and control quantum systems, such as trapped ions or superconducting circuits, that act as "quantum bits" (or "qubits"), the basic units of quantum computation. The classical computing analog to the qubit is the 0-or-1 bit represented by a transistor, but qubits would not be creatures of quantum mechanics unless they were substantially more bizarre.

Prior to observation, qubits exist in a state of superposition that effectively lets them carry the values 0 and 1 at the same time. Some experts compare a qubit's superposition to a flipped coin in flight – equally likely to land heads or tails before it hits the ground (the observation event). And superposition scales: each additional qubit doubles the number of combinations of 0s and 1s the machine can represent simultaneously. In effect, instead of waiting for a classical computer to try possible combinations one at a time over some long period of time, a quantum computer can operate on all of those possibilities at once and use interference to make the correct answer the one most likely to be measured.
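
A rough way to appreciate that scaling is to simulate the bookkeeping a quantum state requires. The sketch below (a classical NumPy simulation, not real quantum hardware) puts ten simulated qubits into an equal superposition: the state must track 2^10 = 1,024 amplitudes at once, and a measurement returns just one ten-bit string.

    import numpy as np

    n = 10        # simulated qubits
    dim = 2 ** n  # 1,024 possible bit strings

    # Equal superposition: every bit string gets amplitude 1/sqrt(2**n),
    # so the 1,024 squared amplitudes (probabilities) sum to 1.
    state = np.full(dim, 1 / np.sqrt(dim))
    print(np.sum(np.abs(state) ** 2))  # 1.0

    # A quantum operation transforms all 1,024 amplitudes in one step;
    # measurement then yields just a single bit string at random.
    probs = np.abs(state) ** 2
    sample = np.random.choice(dim, p=probs)
    print(format(int(sample), f"0{n}b"))  # e.g., '0110010111'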

If you have ever lost the combination to a bike lock, you could do what I tried in the early 1990s: Try every combination of four numbers until the lock opens. With a four-digit bike lock, there are 10,000 possible combinations, because the solution will be some number between 0000 and 9999 (10^4 = 10,000). Ten thousand tries is a lot of effort to save a $5 bike lock, and that was frankly enough for me to give up. (The hope, of course, is that would-be thieves also give up.) A quantum computer could, in effect, probe every possible lock combination at once – a quantum search method known as Grover's algorithm would need only about 100 quantum queries, roughly the square root of 10,000, to find the right answer.
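
Here is what my classical approach looks like as a Python sketch (the secret combination is invented for illustration):

    secret = 7294  # hypothetical forgotten combination

    tries = 0
    for guess in range(10_000):  # 0000 through 9999
        tries += 1
        if guess == secret:
            break
    print(f"opened after {tries:,} tries")  # worst case: 10,000

    # Grover's algorithm would find the same answer in roughly
    # sqrt(10,000) = 100 quantum queries instead of 10,000 tries.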

Problems like the bike lock problem, but with much larger (exponentially larger) sets of possible solutions, are where quantum computers will shine in coming years. It is something of a secret in computer science that there are problems computers cannot practicably solve in any reasonable amount of time. When faced with these problems, developers usually build shortcuts, or "heuristics," that give a pretty good answer, if not the best answer. (The sketch below shows the tradeoff.)
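
The toy "knapsack" problem below illustrates that tradeoff (the items and numbers are invented): the exact answer requires checking every subset of items, which grows exponentially with the number of items, while a greedy heuristic runs quickly but can miss the best answer.

    from itertools import combinations

    # Items as (weight, value); goal: maximize value within the limit.
    items = [(6, 60), (5, 50), (5, 50)]
    limit = 10

    # Exact answer: check all 2**n subsets (exponential in n).
    best = max(
        sum(v for _, v in subset)
        for r in range(len(items) + 1)
        for subset in combinations(items, r)
        if sum(w for w, _ in subset) <= limit
    )
    print("exact best value:", best)  # 100 (the two 5-pound items)

    # Heuristic: greedily take the best value-per-weight items first.
    total_w = total_v = 0
    for w, v in sorted(items, key=lambda i: i[1] / i[0], reverse=True):
        if total_w + w <= limit:
            total_w, total_v = total_w + w, total_v + v
    print("greedy value:", total_v)  # 60 -- fast, but not the best answer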

There are no shortcuts, however, for certain important problems. One such problem is "prime factorization," where a computer is asked to figure out which prime numbers were multiplied together to create a very, very large number. The larger the number, the harder prime factorization becomes. This problem is the basis of much of modern cryptography, including the widely used RSA encryption scheme, precisely because even our most powerful classical computers are not up to the task. If prime factorization becomes easy, then nearly all common encryption schemes, including those that enable cryptocurrencies, are in danger. Quantum computers are well-suited to this type of problem: Shor's algorithm, published in 1994, can factor large numbers efficiently in theory, and researchers have already run it on real quantum hardware to factor very small numbers (such as 15 and 21).
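
A short Python sketch shows why factoring punishes classical machines. Trial division (the straightforward approach shown here, not the state of the art) must test divisors up to the smallest prime factor, so the work explodes as the primes grow:

    def factor(n: int) -> list[int]:
        # Factor n by trial division -- simple, but slow for big numbers.
        factors, d = [], 2
        while d * d <= n:
            while n % d == 0:
                factors.append(d)
                n //= d
            d += 1
        if n > 1:
            factors.append(n)  # whatever remains is prime
        return factors

    print(factor(15))               # [3, 5] -- instant
    print(factor(104729 * 104723))  # [104723, 104729] -- over 100,000
                                    # trial divisions for an 11-digit number

    # For the 600-digit numbers that protect real-world encryption, every
    # known classical method is hopeless; Shor's algorithm, in theory, is not.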

Problems that are practically impossible on classical computers are everywhere. Chemistry, biology, materials science, meteorology, artificial intelligence, economics and other fields all have problems of exponential magnitude that quantum computing could address.

The field is still in its infancy, but quantum computing will produce dramatic technological and scientific advancements as it matures, and those advancements will appear rapidly once the technology hits an inflection point. Attorneys, particularly those counseling clients on data privacy and information technology security issues, should monitor quantum computing as it develops. Future posts on IP/Decode will address the implications of quantum computing for these important legal issues.
