Special Section

A Quantum Leap

By R. Greg Szrama III

This fascinating technology may be in its infancy, but its promise can’t be ignored.

QUANTUM COMPUTING. The vocabulary might seem more at home in pulp science fiction than in scientific journals. Terms such as supremacy, superposition, and entanglement fill the literature. Digging into the quantum phenomena that power quantum computers is not for the faint of heart. Recent work by IBM, and by startups such as IonQ and QuEra, however, helps ensure we can employ quantum computers without first needing advanced degrees in theoretical physics to understand how and why they work.

In fact, the field has matured tremendously since 1981 when physicists Richard Feynman and Paul Benioff first began seriously talking about a new type of computer to model complex quantum mechanical phenomena. Simply developing the manufacturing technology to produce a quantum computer took decades. Finally, in 2019, IBM unveiled the Q System One, the world’s first commercially available quantum computer. Now, the top three cloud providers (Amazon, Google, and Microsoft) all offer on-demand quantum computation services.

Jay Gambetta of IBM describes the current state of the field as “Quantum Utility.” He writes, “Quantum is now a computational tool, and what makes me most excited is that we can start to advance science in fields beyond quantum computing, itself.” He envisions a future, fast approaching, where high-performance quantum-centric supercomputers are the norm.

Quantum computers hold real promise for solving certain problems exponentially faster than traditional computers—and, importantly, with far less energy usage. Understanding where to apply this technology will help position us to take advantage of it when and as it arrives.

Quantum Information Storage

Understanding the applications of quantum computing requires a basic understanding of the quantum effects these machines exploit. That starts with how they store data. In a classical computer, data is stored in bits that hold a discrete value of 0 or 1. Every time you “read” a bit of information (excepting error situations), you will get the exact same value. You can think of this as laying a penny on a table. If you see Abraham Lincoln’s smiling face, then you know that every time you look at that same penny, you’ll see the same heads side unless you expend effort to flip it over.

Quantum information, by contrast, is stored in a quantum bit, or qubit. Instead of a discrete 0 or 1, a qubit encodes the probability that, when measured, you will read either a 0 or a 1. Put differently, until measured, the qubit represents both 0 and 1. When you measure it, the laws of quantum mechanics force the qubit to resolve to a discrete value. Repeating the same operation on an identically prepared qubit may give different results from one execution to the next but, over time, the proportions of 0 and 1 results converge on the probabilities the qubit encodes.

If a classical bit is a penny on the table, you can think of a quantum bit as a penny tossed up in the air spinning end over end. While in the air, there is no measured value yet, but there is an even 50% chance that when you measure it, you will get one side or the other. Measuring the state of a qubit is like slapping that penny down on the table. You have now disturbed the system to get a measurement. If you slap the penny down on the table and see the Lincoln Memorial glinting up at you, then every time you observe that penny from then on, you will see the same result. If you repeat the algorithm (flipping the coin), then you expect that—over time—half your results will show heads and half your results will show tails. This mirrors the probabilistic nature of qubit information.
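
For readers who think in code, here is a minimal sketch (plain Python; the function name and the numbers are illustrative, not taken from any quantum toolkit) of repeatedly “measuring” a qubit that encodes the fair coin flip described above. No single measurement is predictable, but the running tally converges on the encoded probabilities.

```python
import random

def measure(p_one):
    """Simulate one measurement of a qubit with probability p_one of reading 1."""
    return 1 if random.random() < p_one else 0

p_one = 0.5  # the "fair coin" qubit: equal chance of reading 0 or 1

for shots in (10, 100, 10_000):
    results = [measure(p_one) for _ in range(shots)]
    print(f"{shots:>6} measurements -> fraction of 1s: {sum(results) / shots:.3f}")
# Any single run is unpredictable, but the fraction of 1s converges on 0.5.
```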

One critical limitation of quantum computers is the difficulty of maintaining the quantum mechanical state of qubits. This is referred to as quantum coherence. Losing coherence means losing the quantum information encoded in the qubits. In our example of flipping a penny, you need an even flip to get an even probability of either side. You might consider the system coherent as long as the penny is nicely spinning where you want it to spin. A gust of wind might blow the penny to the side, making it land on the ground and ruining your measurement. A passerby may slap it down on the table before you do, causing it to resolve before you were ready for it to stop. These exogenous events altered the state of your penny-flipping system and made it lose coherence.

Maintaining coherence requires extreme control of the environment, accounting for temperature, vibration, and even incidental ionizing radiation. Under research conditions, scientists can maintain coherence for minutes at a time. Under real-world scenarios, IonQ, a pioneer in the field, only guarantees an average coherence of 10 to 100 seconds for determining the value of a qubit, and 1 second for measuring the phase state of the qubit.

Maintaining coherence is also critical for preserving other quantum attributes of the system. One such attribute is quantum entanglement[1] of two or more qubits. Measuring one entangled qubit determines what you will get when you measure its partners. If you have two qubits entangled in the simplest way and measure one of them as a 0, then the other is guaranteed to measure 0 as well. This contrasts with a classical computer, where each bit always stores its own discretely measurable value, unaffected by any other measurement.

This is one of the stranger attributes of a quantum system. Einstein famously called it “spooky action at a distance.” If we stick with our penny example, imagine we have two pennies flipping through the air at the same time. If these pennies were in a quantum entangled state, then slapping one penny down on the table and seeing our friend Mr. Lincoln would immediately cause the other penny to stop spinning in the air and land heads-side-up as well. I found no easily accessible explanation for this, but Caltech and others have proved it works in practice for particles in a quantum mechanical state.
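
The two-penny scenario corresponds to a pair of qubits prepared in the simplest entangled state, often called a Bell state (the specific state is an illustrative choice here). A few lines of Python and NumPy can sample what measurements of that state look like.

```python
import numpy as np

# Amplitudes of a two-qubit Bell state over the outcomes 00, 01, 10, 11.
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(bell) ** 2            # [0.5, 0.0, 0.0, 0.5]

samples = np.random.choice(["00", "01", "10", "11"], size=10, p=probs)
print(samples)
# Every draw is "00" or "11" -- the two pennies always land the same way up;
# "01" and "10" never appear.
```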

Building a Quantum Computer

Storing quantum information is a major feat of science and engineering, but it accomplishes very little in a practical sense. Making use of these qubits requires building a computer that can execute algorithms, is scalable, and is fault-tolerant.

Researchers are investigating several possibilities for executing algorithms on a quantum computer, but the most common at present is the quantum circuit model. This model, in concept, operates similarly to a classical computer. In a classical computer, information is stored in binary format in the computer’s memory and logic gates such as AND, OR, and NOT manipulate the values of individual bits of data. In a quantum computer, quantum gates instead alter attributes of the quantum state in the qubit. These alterations influence the results when the qubit is finally measured and allow the quantum computer to exploit quantum phenomena like entanglement or interference when executing an algorithm.

Continuing our thought experiment of flipping a penny, we might decide we really like Abraham Lincoln and we want his face to show 95% of the time. We might contrive a system with high-speed cameras and lasers to nudge the penny so that heads almost always shows up when we measure it. In the same way, quantum gates can alter the probabilities of the qubits, changing the probability of measuring a 0 or a 1.
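
In code, that nudging is just a rotation applied to the qubit’s probability amplitudes. The sketch below (plain Python and NumPy; the choice of a Y-rotation gate is illustrative) biases a qubit so it measures 0 about 95% of the time, like our weighted penny.

```python
import numpy as np

state = np.array([1.0, 0.0])   # start in |0>: guaranteed to measure 0

# A single-qubit Y-rotation gate, with the angle chosen so the probability
# of measuring 0 becomes 95% -- the "nudged penny."
theta = 2 * np.arccos(np.sqrt(0.95))
ry = np.array([[np.cos(theta / 2), -np.sin(theta / 2)],
               [np.sin(theta / 2),  np.cos(theta / 2)]])

state = ry @ state                 # apply the gate
probs = np.abs(state) ** 2         # squared amplitudes = outcome probabilities
print(probs)                       # ~[0.95, 0.05]

samples = np.random.choice([0, 1], size=1_000, p=probs)
print((samples == 0).mean())       # roughly 0.95
```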

Once the quantum algorithm executes, the individual qubits are measured to determine its results.[2] An individual qubit’s measured value is produced probabilistically, in accordance with its probability state. Say that for a given input the algorithm yields a qubit with a 90% chance of measuring 0 and a 10% chance of measuring 1. If you execute the algorithm 10 times, you should expect to measure a value of 0 on roughly nine of those executions.

But measuring a qubit, remember, destroys all quantum information stored in it. It is an irreversible operation. Once a qubit is measured as a 0 or 1, you lose information about the probability state that led to that value. In our penny example, suppose we used our rig to alter the probability of getting heads. If we then measure a result of tails, we have no way of knowing from the result alone whether it came from a fair flip or an altered flip. We know as outside observers that we altered the state of the flip, but the result itself—seeing the memorial—tells us nothing about what we did prior to the measurement.

Achieving scalability is largely an engineering and materials-science question. Building individual qubits is relatively straightforward with modern fabrication technologies, but connecting the hundreds or thousands of qubits needed for practical use is something else entirely. This is compounded, in many designs, by the need for cryogenic temperatures. Even so, IBM’s Condor processor, unveiled in December 2023, contains 1,121 qubits, and Amazon Web Services offers general availability of QuEra’s Aquila quantum computer with up to 256 qubits.

The complexity of available quantum computers, in terms of qubit counts and quantum gates, is increasing rapidly. IBM’s quantum roadmap gives one benchmark.[3] IBM anticipates that by 2025, for example, readily available systems will operate on processors with 1,000+ qubits, and multiple such processors used in parallel will support highly complex supercomputing workloads. In other words, the number of pennies we can flip at once, and the number of ways we can influence them, is becoming less a manual-dexterity problem and more a manufacturing problem.

Fault tolerance is the final critical piece for ensuring accurate, predictable, and repeatable results from a quantum computer. Losing coherence during a quantum calculation introduces errors in the results. For this reason, computations are often executed multiple times to ensure an accurate result. The length of time its qubits maintain coherence is the primary limit on the duration and complexity of the algorithms a computer can execute. One way around this is to combine many physical qubits into one self-correcting logical qubit. This can require combining as many as 1,000 physical qubits into one computational unit. Google, in 2023, demonstrated that this is possible in practice and is working on scaling the prototype up to a practical, long-lived logical qubit.
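
Quantum error correction is far more subtle than classical error correction (quantum states cannot simply be copied), but the intuition for why redundancy helps is the same. As a loose classical analogy only, the sketch below stores one logical bit across many noisy physical bits and recovers it by majority vote; the error rates and copy counts are made up for illustration.

```python
import random

def noisy_copies(bit, n_copies, error_rate):
    """Store one logical bit redundantly across n_copies noisy physical bits."""
    return [bit if random.random() > error_rate else 1 - bit
            for _ in range(n_copies)]

def majority_vote(copies):
    return 1 if sum(copies) > len(copies) / 2 else 0

logical_bit, physical_error_rate, trials = 1, 0.05, 20_000
for n_copies in (1, 5, 51):
    failures = sum(
        majority_vote(noisy_copies(logical_bit, n_copies, physical_error_rate)) != logical_bit
        for _ in range(trials)
    )
    print(f"{n_copies:>3} physical copies -> logical error rate {failures / trials:.4f}")
# Redundancy drives the logical error rate down sharply, as long as each
# physical error rate stays below a threshold.
```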

How Do You Interact With a Quantum Computer?

Perhaps unsurprisingly, cloud providers are among the largest purchasers of quantum computers. Amazon, Google, and Microsoft (Azure) all provide connected platforms, programming languages, and other support for using quantum computing. Thankfully, these companies do the heavy lifting of installing the large devices, maintaining the cryogenic conditions, and providing the computing infrastructure necessary to use these new computers efficiently.

With the logistics solved, researchers and users are free to focus on algorithms. The real power of quantum computers comes from massaging the probability states of the individual qubits. Quantum gates are designed and implemented to physically manipulate individual qubits, thereby tweaking these probabilities. Algorithm design focuses on how to nudge the probabilities such that when you read the qubits, you are highly likely to get the right answer. Running the algorithm over several iterations or time slices and then processing the results allows for error correction and analysis.

Once the algorithm is designed, programming a quantum computer has a few steps. First, a classical computer configures the quantum state of the qubits. In a quantum circuit model, logic gates are programmed and then operations are queued up for execution. In other execution modes, adiabatic for example, the quantum state and entanglement of multiple qubits are prepared. The algorithm is then executed, results are measured, and the classical computer performs error-correction, analysis, and processing. Quantum algorithms are often executed multiple times to account for errors during processing and to ensure meaningful results.
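
Here is roughly what that workflow looks like in practice, sketched with the open-source Qiskit SDK. Qiskit is one of several toolkits the cloud platforms support and is used here only as an example; package names and backends vary by provider and version, and a local simulator stands in for real quantum hardware.

```python
# Minimal circuit-model workflow sketch (Qiskit; import paths vary by version).
from qiskit import QuantumCircuit
from qiskit_aer import AerSimulator

# 1. A classical computer builds the circuit: gates are queued for execution.
qc = QuantumCircuit(2, 2)
qc.h(0)            # put qubit 0 into an even superposition (the spinning penny)
qc.cx(0, 1)        # entangle qubit 1 with qubit 0
qc.measure([0, 1], [0, 1])

# 2. The circuit is executed many times ("shots") to build up statistics.
backend = AerSimulator()
counts = backend.run(qc, shots=1000).result().get_counts()

# 3. The classical computer analyzes the measured results.
print(counts)      # roughly half '00' and half '11', almost never '01' or '10'
```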

The current reality, however, is that the number of qubits and the variety of available quantum gates and operations are still very limited. Commonly available quantum computers offer tens of qubits, with some specialized processors, such as QuEra’s Aquila, scaling to 256 qubits for specific types of models. These computers still do not outperform classical computers on useful calculations. The main goal (for now) is not speeding up execution but proving the effectiveness of algorithms in practice and gaining expertise in algorithm design while researchers continue iterating on the hardware.

Why Discuss This Now?

Quantum computing is awesome (and mind-bending) in theory, but why talk about it now? Even though Google and others have successfully achieved quantum supremacy[4]—the point at which a quantum computer completes a calculation vastly more efficiently than a classical computer—the algorithm they executed has no real-world value. It was designed for the sole purpose of being easy for a quantum computer to solve and intractable for a classical one. We are still likely years away from the more ambitious quantum advantage, the point where we achieve quantum supremacy for a real-world problem.

Even so, preparing now means statisticians, scientists, and business professionals will be well positioned to adopt the technology when and as it matures. One early example of its utility is in running Quantum Monte Carlo simulations[5] for modeling chemistry problems in areas such as protein folding or developing photovoltaic materials. The quantum computer helps provide better approximations of the state of the molecules under consideration, giving improved accuracy and better trial states. This example still relies on a classical computer to perform the primary simulation and uses the quantum computer as a co-processor, but it provides an early glimpse of the direction the technology is heading.

The most immediately applicable areas for quantum computing fall into a few broad categories.[6] The first is simulation of physical and quantum processes, the exact use case Feynman and Benioff first envisioned. The example mentioned most often is nitrogen fixation, the process used in making ammonia, which consumes approximately 2% of the world’s energy supply. Modeling this is impractical with current computing technology, but a system of quantum computers with as few as 100 qubits could theoretically solve these problems.

Other areas of application include linear systems, such as those underlying artificial neural networks. Optimizing these processes is crucial to advances in computer imaging, searching through data, and providing recommendations. Cost optimization, prime-number factorization, and encryption are some other key areas of ongoing research.

What to Watch For

To make full use of quantum computing, a few key things need to happen. First, quantum computers need to grow to encompass large numbers of logical qubits with acceptably low error rates. Second, we need to see commercially viable quantum advantage demonstrated reliably on commercially available quantum computers. Third, we need to solve some key problems in the realm of data transfer between quantum computers and classical computers to enable processing on large-scale datasets.

Proponents of quantum computing keep predicting these advances will come any day. Some of the best resources for tracking real progress, however, are roadmaps published by researchers at IBM, Google, and other innovators. These provide benchmarked and peer-reviewed results on the path toward achieving quantum advantage.

Planning now for the rollout of widescale quantum computing means businesses heavily invested in statistical analysis, modeling, and other related fields can quickly see gains as new quantum computers become available.

Quantum Computations—Or, How I Learned to Stop Worrying and Embrace the Qubit

Wu-Tang Clan’s GZA aside, most people’s exposure to quantum physics stops with Scott Bakula and Quantum Leap. Among those aware of quantum computers, most seem to know only one or two things about them. They might know that quantum computers use qubits, and they may have heard about superposition, the trait where a qubit is “both” 0 and 1 until read. When asked what a quantum computer will “do,” though, few even in this small group can provide an answer. The one tidbit you are most likely to hear (especially from computer security experts) is that quantum computers will probably, someday, make classical computer encryption obsolete. They are not entirely wrong about that, but there is more to the story.

Still, the question of capabilities is a pertinent one. If quantum computers are still in their infancy—and the optimistic roadmap from IBM anticipates them not reaching maturity until the 2030s—why worry about it now? The answer lies in understanding the types of problems quantum computers are particularly suited to solve and understanding how they can augment classical computers now and in the very near future. Algorithms are still mostly in the proof-of-concept phase, but mastering the intricacies of them now means that when the technology reaches maturity, we can quickly take advantage of it.

So what, exactly, can a quantum computer do?

Simulating Natural Processes

The original idea for these machines was to create a computer that could better model natural processes. Even top supercomputers struggle when modeling complex reactions in chemistry, nuclear physics, and biotechnology with the precision researchers require. When considering electron interactions in chemical reactions, for example, each additional electron modeled with classical computers doubles the computational power needed. This puts a hard limit on the complexity of the molecules we can effectively model with any precision.
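
A quick back-of-the-envelope calculation shows how fast that doubling shuts the door. Tracking n fully correlated electrons exactly requires on the order of 2^n complex amplitudes; real chemistry codes use approximations to soften this, so treat the numbers below as illustrative scaling only.

```python
# Illustrative scaling only: an exact description of n fully correlated
# electrons needs on the order of 2**n complex amplitudes.
for n in (10, 50, 100):
    amplitudes = 2 ** n
    gigabytes = amplitudes * 16 / 1e9   # ~16 bytes per double-precision complex number
    print(f"{n:>3} electrons -> {amplitudes:.2e} amplitudes, ~{gigabytes:.2e} GB")
# 10 electrons is easy; 50 already outstrips the memory of today's largest
# supercomputers; 100 (the nitrogenase case below) is ~1.3e30 amplitudes.
```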

As mentioned above, one research area where scientists are limited in this fashion is in studying nitrogen fixation. This is required for producing ammonia for fertilizers and other industrial needs. The currently used Haber-Bosch process requires extreme pressure (approximately 200 atm) and temperature (400°C or higher) and uses approximately 2% of global energy output. Scientists know of a natural catalyst, nitrogenase, that can accomplish nitrogen fixation with far less energy required. Modeling this molecule, however, requires tracking 100 strongly correlated electrons. This is well beyond what current computers are capable of processing.

Quantum computers, on the other hand…

A paper published in 2017 shows that quantum computers are particularly well suited to modeling this reaction. The research paper[7] details how quantum computers can act as co-processors for classical computers in tracking electron distributions. The approach uses a classical computer to determine which molecular structures to inspect and then offloads the electron calculations to a quantum computer. The authors showed how a network of quantum computers, each having on the order of hundreds of qubits, can readily handle these problems. This problem may soon be solved.

Complex Statistical Modeling

The probabilistic nature of quantum computers also makes them a natural fit for performing statistical analyses. Amazon and Google both published proof-of-concept work using them as co-processors running Quantum Monte Carlo simulations. As with other current examples, a classical computer guides the analysis and offloads those tasks that the quantum computer is best suited to perform.

The statistical modeling capabilities of quantum computers are of course not limited to quantum physics problems. In 2023, Muhsin Tamturk published a research paper[8] in which he describes a method that uses readily available cloud-based quantum computers to evaluate insurance capital reserves and model risk exposure. He explores how his proposed algorithm can account for changes in risk behaviors resulting from climate change, war, pandemics, and other sources. As with other current examples, the work is still in its early stages, but it shows several promising avenues for increases in efficiency and accuracy.

Solving Linear Systems

The math involved in quantum computation leans heavily on linear algebra. Vector and matrix algebra underlie virtually everything these machines accomplish. This leads to one of the most promising areas for speed increases—solving systems of linear equations. These types of equations are critical to any number of disciplines, including engineering, finance, economics, and computer science. Critically, the neural networks that power artificial intelligence systems such as ChatGPT and DALL-E are themselves built largely from such linear operations.

Quantum computers promise massive increases in the computational efficiency of solving these systems. The HHL algorithm, for example, designed by Aram Harrow, Avinatan Hassidim, and Seth Lloyd, offers an exponential speedup over the best-known classical algorithms.[9] Zhikuan Zhao et al. subsequently demonstrated a concrete general-purpose version of the HHL algorithm and applied it to Bayesian deep learning.[10]
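
Setting aside important caveats (HHL produces a quantum state that encodes the solution rather than the full solution vector, and it assumes the data can be loaded efficiently), the advantage can be summarized roughly as follows for an N-variable system with sparsity s, condition number κ, and target precision ε:

```latex
\underbrace{O\!\left(N\,s\,\kappa\,\log(1/\epsilon)\right)}_{\text{classical (conjugate gradient)}}
\qquad \text{versus} \qquad
\underbrace{\tilde{O}\!\left(\log(N)\,s^{2}\,\kappa^{2}/\epsilon\right)}_{\text{HHL}}
```

The dependence on the problem size N drops from linear to logarithmic, which is the exponential improvement at stake.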

These algorithms will directly help provide faster and more accurate results from AI systems. A paper in Nature in 2019 described a method for implementing quantum generative adversarial networks (qGAN).[11] The authors describe an application of qGANs to facilitate financial derivative pricing and show how they can achieve a significant reduction in training time for the AI models under real-world learning conditions. Other work focuses on using qGANs for high-quality image generation,[12] seeing comparable performance to classical computers with as few as 10 qubits. These approaches rely on a classical computer describing the problem and a quantum computer acting as an accelerator for key computations.

Information Security

This is the big one. For all the impressive problem-solving enabled by quantum computers, the pending cryptographic apocalypse is what keeps security professionals awake at night. This sentiment traces back to 1994 when Peter Shor described quantum algorithms to solve the problems of period-finding, discrete logarithms, and, most importantly, integer factorization. The algorithm for integer factorization remains one of the most important proposals for practical real-world use of a quantum computer.

Unfortunately, the publication of Shor’s Algorithm (which almost always refers to his integer-factorization solution) sounded the death knell for public-key cryptography using traditional algorithms, including RSA, Diffie-Hellman, and Elliptic Curve Diffie-Hellman. The security of these algorithms rests on a shared trait: the assumption that certain mathematical problems (factoring large integers, computing discrete logarithms) cannot be solved by computers in reasonable timeframes. Shor’s algorithms undermine both assumptions.

The RSA algorithm, to pick an example, is based on the multiplication of two very large prime numbers. At present, a 2048-bit RSA key is considered “secure enough.” With that key length, each of the two primes is typically 1024 bits long, a number with roughly 308 decimal digits. Using a classical computer to find the prime factors of an RSA key would take (by best current estimates) an amount of time best expressed in very large multiples of the age of the universe.
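
A toy example makes the asymmetry vivid. Multiplying two primes is trivial; recovering them by trial division is easy for tiny numbers and hopeless at real key sizes. The primes below are deliberately tiny and purely illustrative.

```python
from math import isqrt

# Toy illustration of the RSA hardness assumption (real keys use 1024-bit
# primes; these tiny primes are purely for demonstration).
p, q = 61, 53          # two "secret" primes
n = p * q              # public modulus: 3233

def factor(n):
    """Brute-force trial division -- feasible only for tiny moduli."""
    for candidate in range(2, isqrt(n) + 1):
        if n % candidate == 0:
            return candidate, n // candidate
    return None

print(factor(n))            # (53, 61) -- instant here, astronomical at 2048 bits
print(len(str(2 ** 1023)))  # 308 digits: the scale of each real RSA prime factor
```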

Thankfully (for the sake of information security), implementing Shor’s Algorithm is still impractical. A perfectly stable system could feasibly factor a 2048-bit RSA key in days or even minutes, but that is still well beyond the time quantum computers can maintain qubit coherence. With imperfect (noisy) qubits, factorization in a matter of hours may require as many as 20 million qubits.[13] In the meantime, businesses and governments have invested heavily in “post-quantum” cryptography. The U.S. National Institute of Standards and Technology (NIST) conducted a years-long effort to identify new cryptographic algorithms that will withstand quantum computing attacks. NIST published the first three encryption standards on August 13, 2024.[14]

The standards identified by NIST are interesting but focus on allowing traditional computers to exchange information securely. Quantum cryptography, by contrast, uses attributes of quantum mechanics that require quantum hardware to generate or process. Quantum key distribution (QKD), for example, allows two parties to generate a secret key that only they know and that can be used to encrypt and decrypt messages. Because of the nature of quantum mechanics, any attempt by a third party to determine the key disturbs the system by making a measurement—and is instantly detectable.
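
The classic QKD protocol is BB84, used here only as a representative example. A purely classical toy simulation of its key-sifting and error-check steps shows how an eavesdropper betrays herself: measuring in the wrong basis scrambles roughly a quarter of the bits the two parties later compare.

```python
import random

def random_bits(n):
    return [random.randint(0, 1) for _ in range(n)]

def transmit(bits, send_bases, recv_bases, eavesdrop=False):
    """Toy BB84 channel: a bit survives only if measured in the basis it was
    prepared in; measuring in the wrong basis yields a random result."""
    received = []
    for bit, sb, rb in zip(bits, send_bases, recv_bases):
        if eavesdrop:
            eve_basis = random.randint(0, 1)
            # Eve measures, then re-sends the photon prepared in her basis.
            bit = bit if eve_basis == sb else random.randint(0, 1)
            sb = eve_basis
        received.append(bit if rb == sb else random.randint(0, 1))
    return received

n = 2000
alice_bits, alice_bases, bob_bases = random_bits(n), random_bits(n), random_bits(n)

for eve in (False, True):
    bob_bits = transmit(alice_bits, alice_bases, bob_bases, eavesdrop=eve)
    # Sifting: keep only positions where Alice and Bob happened to pick the
    # same basis, then compare the bits to estimate the error rate.
    sifted = [(a, b) for a, b, x, y in zip(alice_bits, bob_bits, alice_bases, bob_bases)
              if x == y]
    error_rate = sum(a != b for a, b in sifted) / len(sifted)
    print(f"eavesdropper={eve}: error rate in sifted key ~ {error_rate:.2f}")
# Without Eve the sifted keys agree; her measurements disturb the states and
# push the error rate toward 25%, revealing her presence.
```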

This is one of the few current areas of quantum computing seeing real-world use. The first instance of performing a bank transfer using QKD happened in April 2004.[15] Multiple companies, such as Toshiba,[16] now market integrated platforms and services for QKD. Admittedly these systems require custom hardware and often have limited range. They provide an early example, however, of the kinds of techniques quantum computing enables.

Defining a Qubit

The bits in a traditional computer are just abstractions of physical phenomena. Individual bits, at a basic level, are represented as voltage ranges during computation on the CPU or during transmission, or by different electrical charges when stored on physical media such as a solid-state drive. When sending a signal over a live wire, a voltage at or near 0 volts in a specific time slice, for example, might represent a value of 0, while a voltage around 5 volts might represent a value of 1. Knowing about the physical representation of a bit can help in understanding why, for example, shielded versus unshielded cabling can affect data transmission quality.

With a qubit, though, the physical representation is more exotic. Since the field is in its infancy, there is not yet wide agreement on the “best” version of a qubit. Research is also pointing to different configurations for different tasks, just as classical computers may represent bits differently when transmitted via radio or when stored on magnetic tape.

One approach to the physical representation of qubits is to capture individual atoms for physical manipulation. These come in two flavors—those using electrically neutral atoms and those using ions. Each has its own benefits in terms of coherence and utility, enabling different configurations.

The QuEra Aquila computer, available now on Amazon Web Services in a 256-qubit configuration, uses neutral rubidium-87 atoms. The Aquila computer operates at room temperature and uses laser pulses to interact with these single atoms, cooling them to microkelvin temperatures to achieve a quantum mechanical state. Lasers also carry out the different steps involved in executing a quantum algorithm, interacting with the atoms to alter their probability state.

Conversely, the IonQ Aria quantum computer uses trapped ytterbium atoms with one electron stripped off, yielding positively charged ions. Its system loads these ions into a linear chain and achieves complete connectivity among the suspended ions—that is, each ion can interact with every other ion in the chain. The Aria computer also uses laser-based techniques to cool the atoms to microkelvin temperatures and to perform operations on the individual ions.

Other approaches to quantum computing involve manipulation of photons or even cooling electronic circuits to cryogenic temperatures, allowing them to function in a superconducting state and therefore behave as artificially created atoms. Manipulating these qubits may involve laser or microwave pulses, manipulation of charges, or other mechanisms.

Regardless of the configuration, some attribute of the quantum system must be chosen to represent a “0” and some attribute must represent a “1.” In a neutral atom or trapped ion configuration, for example, manipulating the energy level of the atom can change its spin attribute. In this case, an “up” spin may resolve to a “0” and a “down” spin may resolve to “1.” Remember that in a quantum system, until measured, the outside observer does not know the spin of the ion. Measuring the spin disturbs the system and forces a probabilistic result.

Endnotes

  1. “Proving that Quantum Entanglement is Real”; Caltech; Sept. 20, 2022.
  2. “What is Measurement in Quantum Computing”; QuEra website, “Measurement” hub; 2023.
  3. “IBM Debuts Next-Generation Quantum Processor & IBM Quantum System Two, Extends Roadmap to Advance Era of Quantum Utility”; IBM Newsroom; Dec. 4, 2023.
  4. “Quantum supremacy explained”; Big Think; Aug. 30, 2023.
  5. “Quantum Monte Carlo on Quantum Computers”; AWS Quantum Technologies Blog; Nov. 30, 2022.
  6. “Feeding the World with Die Rolls: Potential Applications of Quantum Computing”; Dartmouth Undergraduate Journal of Science; 2017.
  7. “Elucidating reaction mechanisms on quantum computers”; Proceedings of the National Academy of Sciences; July 3, 2017.
  8. “Quantum Computing in Insurance Capital Modelling”; Mathematics special issue, Quantum Computing for Industrial Applications; Jan. 28, 2023.
  9. “Quantum algorithm for solving linear systems of equations”; Physical Review Letters; Sept. 30, 2009.
  10. “Bayesian Deep Learning on a Quantum Computer”; Quantum Machine Intelligence; May 17, 2019.
  11. “Quantum Generative Adversarial Networks for learning and loading random distributions”; Nature; Nov. 22, 2019.
  12. “Latent Style-based Quantum GAN for high-quality Image Generation”; Su Yeon Chang et al.; June 4, 2024.
  13. “How to factor 2048 bit RSA integers in 8 hours using 20 million noisy qubits”; Quantum; April 13, 2021.
  14. “NIST Releases First 3 Finalized Post-Quantum Encryption Standards”; NIST; Aug. 13, 2024.
  15. “World Premiere: Bank Transfer via Quantum Cryptography Based on Entangled Photons”; City of Vienna; April 21, 2004.
  16. “Quantum Key Distribution (QKD)—Delivering provably secure networking for the quantum computing age”; Toshiba website, “Quantum-Secure Networking” hub; 2023.