Feite Kraay, Author

When I was in high school (call it the early 1980s), taking a computer science class was one of my formative experiences. My rural school shared the computer with another school in a different town, and by “the computer” I mean a donated minicomputer from a well-known hardware vendor. Minicomputer is also somewhat euphemistic: this machine was roughly the size of a refrigerator and had to be physically trucked back and forth between the schools. It sported four teletype terminals, so four students at a time could enter code interactively; everyone else had to code by pencil marks on a sequence of cards fed through a reader. Learning to code in BASIC and, in the advanced class, FORTRAN was an exhilarating experience at the time.

That class’s final assignment was to write a game. Being ambitious, I chose to try to replicate the well-known children’s game, Battleship. I wrote out all the BASIC code by hand on paper, modeling the 10x10 grid and writing subroutines to place the ships randomly while ensuring no two ships were adjacent to each other. After building the data model and constraints, I wrote code to interactively receive guesses from the player and record hits and misses. I envisioned a simplified one-sided game in which the human player simply tries to sink the computer’s ships. Once I had painstakingly entered my program, imagine my disappointment when it just wouldn’t run. There were no typos or other errors, just constant stalling. My teacher and I reviewed everything several times over and concluded that the program and data for my simple game model probably just overwhelmed the machine’s 48 kilobytes of shared memory and 385 KHz CPU. (I still passed the assignment, but it took me until university to learn how to write efficient code.)
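For the curious, here is a minimal sketch of that placement logic in modern Python rather than BASIC. The fleet sizes are an assumption (the standard Battleship set); the 10x10 grid, random placement, and no-adjacency rule are as described above.

```python
import random

GRID = 10                    # the 10x10 board described above
SHIPS = [5, 4, 3, 3, 2]      # assumed fleet; the original post doesn't list ship sizes

def touching(r, c):
    """All cells adjacent to (r, c), including (r, c) itself and the diagonals."""
    return {(r + dr, c + dc) for dr in (-1, 0, 1) for dc in (-1, 0, 1)}

def place_fleet():
    """Randomly place every ship so that no two ships overlap or touch."""
    occupied = set()
    for length in SHIPS:
        while True:
            horizontal = random.choice([True, False])
            r = random.randrange(GRID if horizontal else GRID - length + 1)
            c = random.randrange(GRID - length + 1 if horizontal else GRID)
            cells = [(r, c + i) if horizontal else (r + i, c) for i in range(length)]
            # Reject any placement that overlaps or touches an existing ship
            if not any(touching(*cell) & occupied for cell in cells):
                occupied |= set(cells)
                break
    return occupied

print(sorted(place_fleet()))   # 17 occupied cells, no two ships adjacent
```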

Four decades later, my nearly obsolete smartphone boasts 128 gigabytes of memory and a 2.39 GHz, 6-core CPU. That increase in power works out to roughly a doubling every two years, a textbook illustration of Moore’s Law. Gordon Moore, co-founder of Intel, predicted in 1965 that the number of components on an integrated circuit would double annually. Although he later revised this to every two years, Moore’s Law has been generalized to measure (and predict) the growth of computing capacity using similar resources. It has held up remarkably well, only recently coming up against the physical limits of miniaturization in silicon manufacturing. Many expected Moore’s Law to run out by around 2025, a date that is now upon us. However, I believe it won’t just fade away.
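A quick back-of-the-envelope check of that doubling claim, using the rough numbers above:

```python
import math

old_memory = 48 * 1024          # 48 kilobytes of shared memory, circa early 1980s
new_memory = 128 * 1024**3      # 128 gigabytes in a recent smartphone
years = 40                      # roughly four decades apart

doublings = math.log2(new_memory / old_memory)
print(f"{doublings:.1f} doublings, one roughly every {years / doublings:.1f} years")
# -> about 21 doublings, i.e. one every ~1.9 years: Moore's Law almost to the letter
```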

To misquote T.S. Eliot, the world of computing governed by Moore’s Law will end not with a whimper, but with a bang. This bang will be delivered by the rapidly emerging field of Quantum Computing, about which it’s no exaggeration to say that it will be the biggest revolution in the field of Computer Science since, well, the beginning of Computer Science. Quantum computers are rapidly realizing their potential to solve complex, computation-intensive problems thousands of times faster than the computers we all use today.

I’ve discussed some of these developments specifically in the context of cyber security here on this blog. In this post, I want to dig a little deeper into the roots of quantum computing and identify some practical use-cases that, quite apart from the risks quantum poses, could lead to vast enhancements in our daily lives.

Cue the revolution
How quantum computers will achieve this—how they will literally break Moore’s Law—is rooted in the science of quantum physics, and the mysterious behaviour of subatomic particles such as electrons and photons. Quantum physics, a field going back some 100 years, seeks to explain phenomena such as the energy states of atoms or the way light waves diffract. It posits that electrons and photons can actually exist in multiple states at once and be described by matrices or wave functions rather than precise numerical definitions of their location or speed. This is known as “quantum superposition.” I won’t try to further condense many libraries’ worth of scientific research into a single paragraph, but suffice it to say that quantum physics has proven to be the most successful and reliable theory to describe how the universe works at both macro and micro levels.

In the early 1980s, a number of physicists and mathematicians came up with the idea of using superposition to encode information in a radically new way.

A “classical” computer is built from transistors that are switched either on or off by an electrical current, representing numerical values of one or zero: the basic definition of a binary digit, or “bit.” Binary coding has been the foundation of all computer programming up to now, and each added bit doubles the number of values a register can represent.
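That doubling is easy to see in a couple of lines:

```python
# With n classical bits you can represent 2**n distinct values,
# but a register holds exactly one of those values at any given moment.
for n in range(1, 9):
    print(f"{n} bit(s): {2**n} possible values")   # 2, 4, 8, ..., 256
```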

By contrast, a quantum computer can be built on subatomic particles (those photons and electrons), which through superposition can hold a blend of one and zero, in any proportion, all at once. Such an encoded subatomic particle is known as a quantum bit or “qubit,” and it can hold much more information than a classical bit. Each qubit added to a system doubles the size of the state space it can work with, so its capacity grows exponentially.
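To see where that extra capacity comes from, it helps to look at the arithmetic: the state of n qubits is described by 2^n complex amplitudes, so each added qubit doubles the amount of information needed just to write the state down. Here is a minimal sketch using NumPy. It is a classical simulation of that arithmetic, not a real quantum computer, and the gate used is the standard textbook Hadamard rather than anything tied to a particular vendor’s toolkit.

```python
import numpy as np

zero = np.array([1, 0], dtype=complex)                        # the |0> state of one qubit
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard gate: creates a superposition

n = 3
state = zero
gates = H
for _ in range(n - 1):
    state = np.kron(state, zero)    # |000>: a vector of 2**n amplitudes
    gates = np.kron(gates, H)       # a Hadamard gate on every qubit

state = gates @ state                  # equal superposition of all 2**n basis states
print(state.size)                      # 8 amplitudes for just 3 qubits
print(np.round(np.abs(state)**2, 3))   # each of the 8 outcomes is equally likely (1/8)
```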

If this seems strange or simply hard to grasp, a common analogy is the simple problem of traversing a maze. A classical computer would navigate a maze the same way you would: take one path at a time, backtrack when you hit a dead end, select another path, and repeat the process until you finally find the exit. Larger mazes with more paths become exponentially more complicated to navigate. But a quantum computer navigating the maze could use superposition to explore all of the maze’s paths at once. How many paths the maze has matters far less; the time it takes a quantum computer to solve the maze is related to the length of the shortest (or only) path to the exit.

This is what will break through the limitations of Moore’s Law and allow quantum computers to work orders of magnitude faster than classical computers.

Game changer
As I’ve suggested, the potential benefits of quantum computing are immense. Many of the problems facing us today are still too large and complex for classical computing, but quantum computing promises to solve them. Several applications come to mind:

Optimization: Common mathematical problems like finding the most efficient way to traverse a network, or finding the lowest-cost solution given hundreds or thousands of constraints, can currently only be solved by approximation. Quantum computing could do a much better job of this (the short sketch after this list shows why brute force quickly breaks down) and could be applied in scenarios such as:

  • Optimizing fleet routing or other delivery networks to minimize “empty miles” and maximize fuel efficiency
  • Analyzing investment portfolios to maximize returns while minimizing risk or optimizing derivative pricing
  • Maximizing geographical service coverage with minimal infrastructure (e.g., placement of cell towers, EV charging stations, etc.).
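To give a sense of scale, here is a toy version of the routing problem with made-up distances between four stops. The numbers and the brute-force approach are purely illustrative; the point is how quickly exhaustive search becomes hopeless as the network grows, which is why classical solvers settle for approximations.

```python
from itertools import permutations

# Hypothetical pairwise distances between four delivery stops (symmetric, illustrative only)
dist = [
    [0, 12, 19, 8],
    [12, 0, 15, 22],
    [19, 15, 0, 9],
    [8, 22, 9, 0],
]

def route_cost(route):
    """Total distance of visiting the stops in order and returning to the start."""
    return sum(dist[a][b] for a, b in zip(route, route[1:] + route[:1]))

stops = tuple(range(len(dist)))
best = min(permutations(stops), key=route_cost)
print(best, route_cost(best))

# Exhaustive search examines (n-1)!/2 distinct round trips; at 20 stops that is
# already about 6 * 10**16 routes, far beyond what brute force can handle.
```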

Simulation: Performing large-scale tests in manufacturing or engineering is very time-consuming and expensive. Quantum computing could be deployed to quickly run multiple rounds of virtual test scenarios before doing final physical testing, reducing costs and saving time:

  • Aerodynamic simulation for automobile bodies, aircraft, etc.
  • Fluid dynamics
  • Semiconductor manufacturing
  • Pharmaceuticals (testing drug interactions prior to clinical trials)
  • Monte Carlo simulations: in financial as well as scientific settings, averaging the results of many trials involving random variables (a toy example follows below).
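As a reminder of why Monte Carlo workloads eat so much computing time, here is the classic textbook toy example: estimating pi from random samples. Real financial or engineering models average far more elaborate random trials, but the slow convergence is the same.

```python
import random

trials = 1_000_000
inside = sum(
    1 for _ in range(trials)
    if random.random() ** 2 + random.random() ** 2 <= 1.0   # point lands inside the quarter circle
)
print(4 * inside / trials)   # ~3.14, but the error shrinks only like 1/sqrt(trials):
                             # each extra digit of accuracy costs about 100x more trials
```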

Modeling: Applying quantum technology to new product design, such as:

  • New fuel cells and batteries for EVs, minimizing the use of rare elements
  • Defect identification in manufacturing processes
  • Bio-engineering, discovering new protein structures.

Data analytics and artificial intelligence: Cognitive computing combined with quantum systems is a very promising field with a wide variety of applications, including:

  • Large-scale image processing and interpretation (consider the spectacular deep-space images now provided by the James Webb Space Telescope)
  • Meteorology
  • Processing vast quantities of data from IoT sensors
  • Retail product recommendations and targeted offers
  • Credit risk scoring and financial fraud detection.

In short, quantum computing will completely transform the technology industry as we know it today. Immense benefits can be realized in scientific research, finance, medicine and even the fight against climate change once quantum technology is fully deployed. But how long will this take? And when can we expect to enter this promised land?

No one knows the answers to those questions today, although most experts expect a five- to ten-year horizon. In my next post, I’ll look at the current state of quantum technology, the work that still needs to be done before its benefits can be fully realized, and what we can do to get started in the meantime.
