Classical Computing and Quantum Computing Can Work Together

Breathless ‘end of the classical computer’ articles, quantum-supremacy claims and ‘broken security’ hysterics notwithstanding, pitting classical against quantum computing is not the issue. What is at issue is determining how to exploit each architecture most effectively.

Quantum computing’s major strength doesn’t lie in simply analyzing Big Data. While the speed of quantum is extremely impressive, massaging massive amounts of data is really the realm of high-performance computing and supercomputers. The real source of quantum’s strength lies in its ability to provide new insights by drawing more information from smaller amounts of data, which can be analyzed intensively and extensively from multiple directions.

This is especially helpful in optimization situations where data is scarce, difficult and/or expensive to obtain. Think of analyzing molecular-level activity in metallurgical processes, such as what happens as a lithium battery discharges, or evaluating alternative strategies for asset trading. Let’s examine the state of quantum operations, how quantum differs from classical computing and recommended next steps.

Quantum Research Transitioning to Experimental Application

Today’s successful efforts at working with quantum machines (and simulators) lead to the conclusion that, for the foreseeable future, quantum computers used in tandem with classical computers represent the most promising way forward. Classical computing executes with its logic gates; quantum computing uses quantum theory to manipulate qubits, gates and logic.

Tests have been run comparing the problem-solving speed of quantum algorithms (on quantum devices) against classical algorithms (run on classical machines). Results show that quantum devices don’t consistently complete more quickly or offer better solutions, or at least not by enough to justify the additional effort required. That confirms the continuing value of classical computing, as well as the need for both approaches.

Today, the challenge lies in identifying the exact problems or parts of a problem that can best be addressed via a quantum computing device. Performing these tests and experiments is also helping to reveal ways to improve some algorithms to run even faster and more accurately on classical computers.

Theory and logical quantum computer simulations provide some insight into problem formulation. However, the gap between what can be done with a logical qubit and a real, live qubit is enormous. As an example, a logical qubit can hold its state forever and be examined at leisure. In real life, a qubit has an accessible, informational life of microseconds, meaning only samples of output can be taken, and those samples are error-prone. Algorithms are run repeatedly to correct for these errors. Research continues to identify ways to extend the life and stability of qubits.
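The value of repeated runs can be illustrated with a toy classical simulation. This is only a sketch: the single-bit answer and the 10% per-shot error rate are assumptions chosen for illustration, not a model of any real device.

```python
import numpy as np

# Illustrative sketch (not a real quantum workflow): each run ("shot") returns
# a noisy sample, and aggregating many shots recovers the true answer.
# The 10% per-shot error rate below is a hypothetical figure.
rng = np.random.default_rng(7)

true_bit = 1
error_rate = 0.10
correct = rng.random(2000) >= error_rate        # True where a shot reads out correctly
readings = np.where(correct, true_bit, 1 - true_bit)

# A majority vote across shots recovers the right result despite per-shot errors.
majority = int(readings.mean() > 0.5)
```

Real error-mitigation schemes are far more sophisticated, but the principle is the same: many error-prone samples, taken together, yield a trustworthy answer.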
Decades of classical computing solving all kinds of problems have given great insight into its operation. Classical computers use binary logic and mathematical concepts tied to the physical world. Physical models allowed logical processes to be replicated; the expected results could be predicted and checked. This made problem formulation, execution and answer checking relatively straightforward. The same level of knowledge doesn’t exist for quantum computing.
Quantum computing can be best at solving problems involving large data arrays or problems involving many complex options with limited data. But the quantum world operates at the limits of our measurable knowledge and our ability to observe results. “Touching” a qubit to measure its state causes it to change state immediately. So, identifying the specific details of quantum-friendly problems remains an ongoing challenge. These include determining the best way to articulate problems, and even deciding which problem, or pieces of a problem, to run on a quantum computer. There also remains much to be learned about composing algorithms, as well as verifying solutions.

What’s Different About Quantum Computing?

Quantum computing is superficially similar to (but fundamentally different from) classical computing. Both quantum and classical computers use algorithms to solve problems, though the actual algorithms differ because of unique execution techniques. Both are programmed with gates and transforms. But quantum computing manipulates objects at the subatomic level, where the governing laws of quantum physics are quite different, as are the conditions under which it works: IBM Q requires temperatures near absolute zero (-273 Celsius). It operates in ways not fully observable or, currently, even directly measurable.
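The gate parallel can be made concrete with a small classical simulation. This is a hedged sketch of the standard gate model, not any vendor’s API: a quantum gate is a unitary matrix applied to a state vector, where a classical gate is a boolean function on bits. The Hadamard gate shown here is a textbook example.

```python
import numpy as np

# Sketch: in the gate model, a quantum gate is a unitary matrix acting on a
# state vector. The Hadamard gate turns a definite |0> into an equal
# superposition of 0 and 1.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
zero = np.array([1.0, 0.0])          # the |0> basis state

state = H @ zero
probs = np.abs(state) ** 2           # Born rule: measurement probabilities

# Unitarity (H-dagger times H equals I) makes the gate reversible,
# unlike a classical AND gate, which destroys information.
assert np.allclose(H.conj().T @ H, np.eye(2))
```

Reversibility is one of the execution differences mentioned above: quantum gates never discard information until a measurement is made.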

Qubits are superficially like bits. But a bit holds only one of two states (0 or 1), while a qubit holds a state of 0, 1 or any combination in between (e.g., 20% probability of 0, 80% probability of 1). The amount of information a qubit holds accounts for its great potential. Qubits are also shorter-lived, sensitive (collapsing to bit-like states if touched by minimal external energy) and error-prone. Significant amounts of today’s research and experimentation are directed at improving error rates and prolonging the time a qubit retains its active state.
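The “20% – 0, 80% – 1” combination above can be written down directly. This is a classical simulation for illustration only; the specific 20/80 split is the hypothetical example from the text.

```python
import numpy as np

# A qubit state is a 2-component complex vector [a, b] with |a|^2 + |b|^2 = 1;
# |a|^2 and |b|^2 are the probabilities of measuring 0 and 1.
# Here: 20% chance of reading 0, 80% chance of reading 1.
state = np.array([np.sqrt(0.2), np.sqrt(0.8)], dtype=complex)

p0 = abs(state[0]) ** 2
p1 = abs(state[1]) ** 2
assert np.isclose(p0 + p1, 1.0)      # valid states are always normalized

# Measurement collapses the superposition: each readout is just a 0 or a 1,
# so the in-between state can only be inferred from many repeated samples.
rng = np.random.default_rng(0)
samples = rng.choice([0, 1], size=10_000, p=[p0, p1])
```

This is also why a single measurement tells you so little: the rich in-between state is only visible statistically, across many runs.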

These and other problems are being attacked with various strategies, ranging from increasing the number of iterations to reduce the error rate, to experimenting with qubit topologies to increase stability, to using different substances that operate at higher temperatures. Quantum particles also exhibit a very helpful phenomenon of interaction called entanglement, described by no less than Albert Einstein as “spooky action at a distance.” Let’s take a look at that characteristic.

What’s Entanglement?

Simply put, entanglement is when two particles (photons, qubits, etc.) interact and retain a relationship that involves neither a physical connection nor a controllable exchange of any sort. Think of two identical particles, A and B, separate and independent of each other, with no physical contact or connection. Particle (A) is entangled with particle (B). Once entangled, any change in the superposition state of one of the pair will correlate with a change in the superposition state of the other.

So, observing entangled particle (A) changes the state of its superposition. Near instantaneously, the state of the second particle (B) changes in a correlated but opposite way, like a mirror image of (A). This occurs without stimulation of (B) and without any connection or exchange of any kind: no wave, no impact, no contact. The correlation appears instantaneous even when large distances separate the two particles, although no usable information travels faster than light.
The change in the state of (B) is predictable, but opposite (complementary) to the change in the state of (A). Entanglement simply (or not so simply) means that the superpositions of two entangled particles change in an observable, complementary way with no physical contact or connection. Thus, the change in state is correlated, not causal. The results of stimulating (A) are detectable by comparing the states afterwards. The change results from random behavior in both particles, but only the overall outcome is observable.
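The “mirror image” correlation can be reproduced statistically with a short classical simulation. To be clear about the assumptions: this simulates measuring a standard Bell state, a textbook example of maximal entanglement; it only mimics the outcome statistics and involves no real entanglement.

```python
import numpy as np

# Classical simulation of measuring the Bell state (|01> + |10>)/sqrt(2),
# whose two particles always read out opposite ("mirror image") values.
rng = np.random.default_rng(42)

# Amplitudes over the joint basis states |00>, |01>, |10>, |11>.
bell = np.array([0.0, 1.0, 1.0, 0.0]) / np.sqrt(2)
probs = bell ** 2                      # Born rule: joint outcome probabilities

outcomes = rng.choice(4, size=1000, p=probs)
a = outcomes // 2                      # measured bit of particle (A)
b = outcomes % 2                       # measured bit of particle (B)

# Each particle alone looks like a fair coin flip, yet the pair is
# perfectly anti-correlated: the correlation is in the joint statistics.
assert np.all(a != b)
```

Note how the simulation matches the text: neither particle’s individual result is predictable, but comparing the two records afterwards reveals the perfect complementary correlation.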

Entanglement of particle superpositions is unique to quantum computing. Neither classical computing nor classical physics have anything like it. It’s the basis of a lot of the power and promise of quantum computing.
Decades-long vendor efforts have led to major contributions to and advances in quantum science. More recently, the focus has been on moving quantum computing from a science to a technology ready for experimental application. In addition to partnerships and alliances, smart vendors have been supportive of public involvement (researchers, students, enterprise staffs) as well as promoting and funding widespread quantum education. IBM, for example, has made a textbook, tutorials and quantum access (via the IBM Q Experience) available to researchers, educators, students and others. IBM, Google, Microsoft and others recognize that progress toward true quantum computing commercialization will be the result of a series of step-ups in knowledge and capability, not a flashy leapfrog.

Next Steps for Quantum

Classical computing is based on well-understood models of logic and mathematics; it is, in fact, based on how we think about and analyze the world around us. Experience and detailed models allow us to predict outcomes and measure results against expectations. We know how to articulate problems and structure algorithms with precision. None of this yet holds for quantum computing, where we are just learning how to do all of that in quantum terms.
It’s critical to begin engaging with this new evolution in computing technology, not necessarily to become expert in its theoretical aspects, but to understand its changed way of thinking about how things operate and to discover how it might be useful. Operating in a quantum environment requires a unique, almost philosophical view of problems. There’s no doubt that it has the potential to radically alter how problems are viewed, articulated and solved.
Quantum computing will have major market impact. Learning to think, communicate and frame questions, and then to comprehend answers, in quantum terms requires effort. But that effort will pay off. Even for those who never code solutions, understanding will provide insight into new opportunities that lead to competitive advantages.
To ignore quantum computing today is a major mistake. Delaying entry into the quantum computing space will only increase the cost of entry and of catching up, a disadvantage that grows as time passes.
Quantum computing’s ability to deeply analyze small (as well as large) amounts of data in reasonable time, yielding highly accurate, actionable insights, will benefit many areas. It will improve forecasting and allow ‘what-if’ analysis of incredible variety and complexity. Deep analysis will detect unique trends, relationships, correlations, even causations. The impact will be felt in shaping and developing strategies for everything from financial trading to inventory management. Research will improve energy use, conservation and discovery. Quantum computing will also shape the application of AI and machine learning in all sorts of areas, including metallurgy, financial analysis and trading, traffic control, and much more.