
IBM Rides Quantum Volume to Quantum Advantage

March 19, 2019

Recently IBM announced achieving its highest Quantum Volume to date. Of course, nobody else knows what Quantum Volume is. Quantum Volume is both a metric and a measurement procedure developed, no surprise here, by IBM to determine how powerful a quantum computer is. Read the May 4 announcement here.

Quantum Volume is not just about the number of qubits, although that is one part of it. It also accounts for gate and measurement errors, device crosstalk, device connectivity, and circuit compiler efficiency. According to IBM, the company has doubled the power of its quantum computers annually since 2017.

The upgraded processor will be available for use by developers, researchers, and programmers to explore quantum computing using a real quantum processor at no cost via the IBM Cloud. This offer has been out in various forms since May 2016 as IBM’s Q Experience.

Also announced was a new prototype of a commercial processor, which will be the core for the first IBM Q early-access commercial systems.  Dates have only been hinted at.

IBM’s recently unveiled IBM Q System One quantum computer, with a fourth-generation 20-qubit processor, has achieved a Quantum Volume of 16, roughly double that of the current IBM Q 20-qubit device, which has a Quantum Volume of 8.

The Q volume math goes something like this: a variety of factors determine Quantum Volume, including the number of qubits, connectivity, and coherence time, plus accounting for gate and measurement errors, device cross talk, and circuit software compiler efficiency.
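IBM hasn’t published a one-line formula in this announcement, but the idea behind the metric can be sketched in a few lines. In the 2019 definition, log2 of Quantum Volume is the largest n for which the machine can reliably run a “square” circuit of n qubits and n layers. The `achievable_depth` helper below is hypothetical; a real benchmark would measure it empirically on hardware.

```python
def quantum_volume(num_qubits, achievable_depth):
    # Sketch of the 2019-era Quantum Volume definition:
    # log2(QV) = max over m of min(m, d(m)), where d(m) is the
    # deepest circuit that runs reliably on m qubits.
    # achievable_depth(m) is a hypothetical stand-in for that measurement.
    best = 0
    for m in range(1, num_qubits + 1):
        best = max(best, min(m, achievable_depth(m)))
    return 2 ** best

# A 20-qubit device whose noise limits reliable depth to 4 layers
# yields QV 16 -- matching the Q System One numbers in this post.
qv = quantum_volume(20, lambda m: 4)
```

Note how this explains why qubit count alone is not the headline: 20 qubits with a reliable depth of only 4 gives the same QV as a perfect 4-qubit machine.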

In addition to producing the highest Quantum Volume to date, IBM Q System One’s performance reflects some of the lowest error rates IBM has ever measured, with an average 2-qubit gate error less than 2 percent, and its best gate achieving less than a 1 percent error rate. To build a fully-functional, large-scale, universal, fault-tolerant quantum computer, long coherence times and low error rates are required. Otherwise how could you ever be sure of the results?
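The back-of-the-envelope arithmetic behind that last question is worth seeing. If each gate independently succeeds with probability (1 − error), the chance a whole circuit runs cleanly decays exponentially with gate count. This is a deliberately crude first-order model, not IBM’s actual error analysis:

```python
def circuit_success_estimate(gate_error, num_gates):
    # Crude model: each gate succeeds independently with
    # probability (1 - gate_error); the circuit succeeds only
    # if every gate does.
    return (1 - gate_error) ** num_gates

# At a 2% two-qubit gate error, a modest 50-gate circuit
# already succeeds only about a third of the time.
p = circuit_success_estimate(0.02, 50)
```

Halving the error rate to 1% roughly lifts that 50-gate success probability from about 36% to about 60%, which is why shaving a single percentage point off gate error matters so much.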

Quantum Volume is a fundamental performance metric that measures progress in the pursuit of Quantum Advantage, the Quantum Holy Grail—the point at which quantum applications deliver a significant, practical benefit beyond what classical computers alone are capable of. To achieve Quantum Advantage in the next decade, IBM believes that the industry will need to continue to double Quantum Volume every year.

Sounds like Moore’s Law all over again. IBM doesn’t deny the comparison. It notes that in 1965 Gordon Moore postulated that the number of components per integrated function would grow exponentially for classical computers. Jump to the new quantum era, and IBM says its Q system progress since 2017 shows a similar early growth pattern, supporting the premise that Quantum Volume will need to double every year and presenting a clear roadmap toward achieving Quantum Advantage.
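If you take IBM’s doubling claim at face value, the projection is simple compound growth. This is just arithmetic on IBM’s stated premise (QV 16 in 2019, doubling annually), not a measured roadmap:

```python
def projected_qv(year, base_year=2019, base_qv=16):
    # Extrapolates IBM's claimed annual doubling of Quantum Volume,
    # starting from the QV-16 Q System One result in 2019.
    return base_qv * 2 ** (year - base_year)

# By this premise the mid-2020s would see Quantum Volume in the
# thousands -- the kind of scale IBM ties to Quantum Advantage.
qv_2025 = projected_qv(2025)
```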


Potential use cases, such as precisely simulating battery-cell chemistry for electric vehicles, speeding quadratic derivative models, and many others are already being investigated by IBM Q Network partners. To achieve Quantum Advantage in the 2020s, IBM believes the industry will need to continue doubling Quantum Volume every year.

In time AI should play a role expediting quantum computing.  For that, researchers will need to develop more effective AI that can identify patterns in data otherwise invisible to classical computers.

Until then how should most data centers proceed? IBM researchers suggest 3 initial steps:

  1. Develop quantum algorithms that demonstrate how quantum computers can improve AI classification accuracy.
  2. Improve feature mapping to a scale beyond the reach of the most powerful classical computers.
  3. Classify data through the use of short-depth circuits, allowing AI applications in the NISQ (noisy intermediate scale quantum) regime and a path forward to achieve quantum advantage for machine learning.
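To make step 2 concrete: “feature mapping” here means encoding classical data points as rotation angles in a quantum circuit. The sketch below computes the angles of a ZZ-style feature map of the kind IBM’s quantum machine-learning research describes, in plain Python. The exact angle formula is an assumption for illustration, and no quantum hardware or library is involved:

```python
import math

def zz_feature_map_angles(x):
    # Hypothetical sketch of a ZZ-style feature map: each feature
    # becomes a single-qubit rotation angle, and each pair of
    # features becomes an entangling-layer angle. The pairwise
    # formula (pi - x_i)(pi - x_j) is assumed for illustration.
    singles = list(x)
    pairs = {(i, j): (math.pi - x[i]) * (math.pi - x[j])
             for i in range(len(x))
             for j in range(i + 1, len(x))}
    return singles, pairs

# Two features -> two single-qubit angles and one pairwise angle.
singles, pairs = zz_feature_map_angles([0.1, 0.7])
```

The point of step 2 is that as the number of features grows, simulating the entangled state these angles parameterize becomes classically intractable, which is where a quantum processor could earn its keep.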

Sounds simple, right? Let DancingDinosaur know how you are progressing.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at technologywriter.com.

