With hundreds of start-ups in the emerging field of quantum computing, alongside giants like IBM, Microsoft, Amazon and Google, the prospect of generating revenue from what’s essentially a research and development endeavor is daunting. Publicly traded quantum start-ups such as IonQ and D-Wave Systems, which don’t have non-quantum lines of business to subsidize their investments, probably feel this pressure more acutely.
Monetizing quantum technologies is difficult because current systems offer limited practical capability. Significant advances have been made in extending coherence times, decreasing error rates and increasing qubit counts; these developments shape product road maps for future systems, but the promise of tomorrow has little impact on what’s available now.
Selling quantum computing on a large scale is possible, even though quantum computers aren’t yet advanced enough to be used in production. Doing so requires a significantly different approach from hardware and software purveyors, as well as industry-wide coordination — and restraint — in communicating the value of quantum technologies.
Demystifying the Rhetoric
The general public’s understanding of quantum computers’ utility and purpose remains abysmal. Prospective buyers are only moderately better off, despite the marketing attempts of manufacturers. Press releases touting the achievement of “quantum supremacy” are as much a rite of passage as they are unhelpful to the cause; leaving aside the unfortunate implications of the word supremacy, the underlying claim has been so frequently repeated and debunked that the premise itself is thought-terminating.
Explaining the value of quantum computers to prospective customers requires first demystifying the science behind the technology — phrases such as Einstein’s remark calling quantum entanglement “spooky action at a distance” are neither relevant nor helpful in conveying the value of an error-corrected quantum computer. Likewise, inflating the practical ability of near-term quantum computers with subjective milestones undermines the impact that higher qubit counts, robust error correction, higher qubit fidelity and longer coherence times will ultimately deliver.
Quantum hardware manufacturers should publish as many common benchmarks as possible. Although individual metrics provide interesting data points, synthetic benchmarking allows for progress to be measured and tracked over time. Quantum and classical synthetic benchmarks alike may not show the full value or ability of a given system, but this shouldn’t be used to dismiss existing standards. Introducing novel, company-specific benchmarks while ignoring the standards that competitors use would make comparison all but impossible.
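To make that concrete, the sketch below shows roughly what one widely used synthetic benchmark looks like in practice: estimating the heavy-output probability at the core of IBM’s Quantum Volume protocol on a simulator. It assumes the open-source Qiskit and qiskit-aer packages (not named above), and it is a toy illustration; the full protocol prescribes many random circuits and statistical confidence bounds, which are omitted here.

```python
# Toy sketch of a Quantum Volume-style heavy-output test.
# Assumes qiskit and qiskit-aer are installed; a compliant benchmark runs many
# random circuits and applies confidence intervals, which are omitted here.
from qiskit import transpile
from qiskit.circuit.library import QuantumVolume
from qiskit.quantum_info import Statevector
from qiskit_aer import AerSimulator

n_qubits = 4
shots = 2000

# One random square (width == depth) model circuit.
model = QuantumVolume(n_qubits, depth=n_qubits, seed=1234)

# Ideal output distribution, used to define the "heavy" outputs:
# those with probability above the median.
ideal = Statevector(model).probabilities_dict()
median_p = sorted(ideal.values())[len(ideal) // 2]
heavy_outputs = {bits for bits, p in ideal.items() if p > median_p}

# Run the same circuit (with measurements) on a simulated backend.
measured = model.copy()
measured.measure_all()
backend = AerSimulator()
counts = backend.run(transpile(measured, backend), shots=shots).result().get_counts()

heavy_prob = sum(c for bits, c in counts.items() if bits in heavy_outputs) / shots
print(f"Heavy-output probability: {heavy_prob:.3f} (passing requires > 2/3)")
```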
Contextualization
Comparing successive quantum computers from one manufacturer, and between models from competing companies, is important for characterizing progress. But this alone doesn’t convey to prospective buyers what quantum computers can do, or how they differ from classical computers. Oft-repeated examples — the difference between bits and qubits, the utility of entanglement and so on — explain how quantum computing differs from classical computing, but these explanations tend to be more conceptual than concrete.
Historically, advances in classical computing have made new areas of mathematics cost-effective to compute. Each new computational modality expands the uses to which a computer can be practically applied, leading to new products and capabilities. That progression illustrates the difference — and therefore the value — that quantum computers can provide.
Classical processors, like the CPUs in consumer and enterprise systems today, are effectively the descendants of adding machines. Since the introduction of the Intel 8086 microprocessor in 1978, various improvements have been added, including wider buses, faster clock speeds and floating-point arithmetic. But, independent of architecture, any given application running on a CPU is, by volume, mostly the same six instructions: add, subtract, load, store, compare and branch. Traditional CPUs are intentionally general purpose; they can perform almost any calculation accurately, but not necessarily quickly.
But CPUs aren’t particularly efficient at graphics processing, which demands far more parallelism and relies extensively on geometric calculations that CPUs aren’t tailored for. Demand for 3D graphics in business and entertainment led to the mass-market commercialization of GPUs in the mid-1990s. “Nice to have” features have been added over time, such as video encoding and decoding, texture mapping and ray tracing. Nvidia’s CUDA software made its GPUs popular for general-purpose workloads, opening the same underlying hardware to new markets; by comparison, AMD’s software stack is less versatile, and adoption of AMD GPUs beyond graphics processing is less prevalent.
Artificial intelligence (AI) and machine learning workloads became the primary beneficiaries of non-graphical computing on GPUs, though the fit has never been perfect. Although these workloads can exploit the parallelism of GPUs, they rely heavily on matrix and tensor arithmetic and depend more strongly on data locality than graphics processing does. Conversely, GPU features such as texture-mapping hardware aren’t useful for AI or machine learning. In the mid-2010s, purpose-built accelerators — such as Google’s Tensor Processing Unit and Graphcore’s Intelligence Processing Unit — emerged to address this mismatch.
Quantum computing is the next step in this progression of computational ability, making it the fourth pillar of computing. Quantum processors will make practical whole classes of calculations that are intractable on classical computers. The prospect of using Shor’s algorithm to factor the large numbers that underpin modern encryption is often touted in security circles, but the impact of quantum computing extends well beyond that. Applications such as solving linear systems of equations, mathematical optimization and boson sampling are thought to have implications for economics, engineering and pharmacology, as well as for the development of AI and machine learning generally.
Quantum Computers, Quantum Algorithms
The capability of any computer platform is determined by the quality and quantity of its software. For quantum hardware to be useful, quantum algorithms must be developed in parallel. This requires an investment of time and money, alongside a general idea of the problems that could be addressed with a quantum computer. Crucially, what it doesn’t require is expertise in quantum science; many manufacturer-specific and vendor-neutral tools are available to ease the adoption process.
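As a rough illustration of how approachable these tools have become, the sketch below uses the open-source Qiskit SDK (one framework among several, chosen here only as an example) to prepare and sample an entangled two-qubit state in a handful of lines, with no manufacturer-specific detail and no quantum-mechanics formalism.

```python
# Minimal sketch: preparing and sampling a Bell (entangled) state with Qiskit.
# Assumes qiskit and qiskit-aer are installed; comparable SDKs work similarly.
from qiskit import QuantumCircuit, transpile
from qiskit_aer import AerSimulator

circuit = QuantumCircuit(2, 2)
circuit.h(0)        # put qubit 0 into superposition
circuit.cx(0, 1)    # entangle qubit 1 with qubit 0
circuit.measure([0, 1], [0, 1])

backend = AerSimulator()
counts = backend.run(transpile(circuit, backend), shots=1000).result().get_counts()
print(counts)  # expect roughly half '00' and half '11'
```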
Putting these tools to productive use requires overcoming institutional obstacles such as internal opposition, culture clashes and budget tightening. Taking a wait-and-see approach by definition cedes any potential first-mover advantage. More broadly, there’s no guarantee that a problem relevant to a given industry will be solved unless companies in that industry put in the effort to solve it. Or, to indulge in a truism — you miss 100% of the shots you don’t take.
Although there’s no panacea for institutional inertia, reassuring businesses that quantum technologies are worth developing today is a vital first step.
Fortunately, developing quantum software doesn’t require hand-stitching circuits for a specific computer. Although it’s still early days, initiatives such as the QIR Alliance are developing cross-platform solutions that aim to fully use the capabilities of quantum processors from different hardware manufacturers.
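QIR itself is an LLVM-based intermediate representation rather than something developers write by hand, but the portability idea can be sketched at the SDK level. The example below, again assuming Qiskit purely for illustration, writes one hardware-agnostic circuit and retargets it to two illustrative native gate sets; the gate sets are stand-ins, not any particular vendor’s hardware.

```python
# Sketch: one abstract circuit, retargeted to two different native gate sets.
# The gate sets are illustrative stand-ins, not any specific vendor's hardware.
from qiskit import QuantumCircuit, transpile

circuit = QuantumCircuit(2)
circuit.h(0)
circuit.cx(0, 1)

# "Machine A": a CZ-based two-qubit gate set.
native_a = transpile(circuit, basis_gates=["rz", "sx", "x", "cz"])
# "Machine B": an ECR-based two-qubit gate set.
native_b = transpile(circuit, basis_gates=["rz", "sx", "x", "ecr"])

print(native_a.count_ops())
print(native_b.count_ops())
```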
Likewise, organizations looking to explore quantum computing can partner with external firms to gain quantum competency or obtain guidance in developing quantum skills internally.