Posts Tagged ‘Quantum Volume’

Pushing Quantum Onto the Cloud

September 4, 2020

Did you ever imagine the cloud would become your quantum computing platform, a place where you could run complex quantum algorithms on multi-qubit machines available at a click? But that is exactly what is happening.

IBM started it a few years back by making its small-qubit machines available in the cloud, and it now offers even larger ones. Today Xanadu is offering 8-qubit or 12-qubit chips, with a 24-qubit chip coming in the next month or so, according to the Toronto-based company.

Xanadu quantum processor

As DancingDinosaur has previously reported, there are even more: Google reports a quantum computing lab with five machines, and Honeywell has six quantum machines. D-Wave is another, along with more startups, including IonQ, Quantum Circuits, and Rigetti Computing.

In September, Xanadu introduced its quantum cloud platform, which allows developers to access its gate-based photonic quantum processors, with 8-qubit or 12-qubit chips, over the cloud.

Photonics-based quantum machines have certain advantages over other platforms, according to the company. Xanadu’s quantum processors operate at room temperature, not low Kelvin temperatures. They can easily integrate into an existing fiber optic-based telecommunication infrastructure, enabling quantum computers to be networked. It also offers scalability and fault tolerance, owing to error-resistant physical qubits and flexibility in designing error correction codes. Xanadu’s type of qubit is based on squeezed states – a special type of light generated by its own chip-integrated silicon photonic devices, it claims.
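The squeezed states Xanadu mentions have a simple textbook signature: squeezing pushes the noise in one quadrature of the light field below the vacuum limit, at the cost of amplifying the conjugate quadrature. A toy calculation of that trade-off (illustrative only, not Xanadu's actual device physics; the squeezing parameter r is a made-up example value):

```python
import math

def quadrature_variances(r):
    """Variances of the two quadratures of a squeezed vacuum state.

    Vacuum noise is 1/2 in natural units (hbar = 1). Squeezing by
    parameter r pushes one quadrature below that limit while the
    conjugate quadrature grows, preserving the uncertainty product.
    """
    vacuum = 0.5
    squeezed = vacuum * math.exp(-2 * r)       # reduced-noise quadrature
    anti_squeezed = vacuum * math.exp(2 * r)   # amplified quadrature
    return squeezed, anti_squeezed

sq, anti = quadrature_variances(r=1.0)
print(f"squeezed: {sq:.4f}, anti-squeezed: {anti:.4f}")
# the product stays pinned at the Heisenberg minimum of (1/2)^2:
print(f"uncertainty product: {sq * anti:.4f}")
```

The point is that the information-carrying quadrature can be made quieter than the vacuum itself, which is what makes squeezed light useful as a qubit resource.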

DancingDinosaur recommends you check out Xanadu’s documentation and details. It does not have sufficient familiarity with photonics, especially as related to quantum computing, to judge any of the above statements. The company also notes it offers a cross-platform Python library for simulating and executing programs on quantum photonic hardware. Its open source tools are available on GitHub.

Late in August, IBM unveiled a new milestone on its quantum computing roadmap, achieving the company's highest Quantum Volume to date. By following the link, you will see that Quantum Volume is a metric conceived by IBM to measure and compare quantum computing power. DancingDinosaur is not aware of any other quantum computing vendors using it, which doesn't mean anything, of course. Quantum computing is so new and so different, and with so many players joining in with different approaches, it will be years before we see which metrics prove most useful.

To come up with its Quantum Volume rating, IBM combined a series of new software and hardware techniques to improve overall performance, upgrading one of its newest 27-qubit systems to achieve the high Quantum Volume rating. The company has made a total of 28 quantum computers available over the last four years through the IBM Quantum Experience, which companies join to gain access to its quantum machines and tools, including its software development toolset.

Do not confuse Quantum Volume with Quantum Advantage, the point where certain information processing tasks can be performed more efficiently or cost effectively on a quantum computer versus a conventional one. Quantum Advantage will require improved quantum circuits, the building blocks of quantum applications. Quantum Volume, notes IBM, measures the length and complexity of circuits – the higher the Quantum Volume, the higher the potential for exploring solutions to real world problems across industry, government, and research.
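The relationship between circuit size and the headline number is simple once the benchmark has been run: if n is the width (and equal depth) of the largest "square" random circuit the machine can execute while still passing IBM's heavy-output test, then Quantum Volume is 2^n. A minimal sketch (the heavy-output statistics themselves are elided):

```python
def quantum_volume(largest_passing_width):
    """QV = 2^n, where n is the width (= depth) of the largest square
    random circuit the device runs while passing the heavy-output test."""
    return 2 ** largest_passing_width

# IBM's milestone of Quantum Volume 64 corresponds to its 27-qubit
# machine reliably running circuits 6 qubits wide and 6 layers deep:
print(quantum_volume(6))  # 64
```

Note how demanding the metric is: a 27-qubit machine earns credit for only a 6-qubit square circuit, because every extra layer of depth multiplies the accumulated error.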

To achieve its Quantum Volume milestone, the company focused on a new set of techniques and improvements that used knowledge of the hardware to optimally run the Quantum Volume circuits. These hardware-aware methods are extensible and will improve any quantum circuit run on any IBM Quantum system, resulting in improvements to the experiments and applications which users can explore. These techniques will be available in upcoming releases and improvements to the IBM Cloud software services and the cross-platform open source software development kit (SDK) Qiskit. The IBM Quantum team has shared details on the technical improvements made across the full stack to reach Quantum Volume 64 in a preprint released on arXiv today.

What is most exciting is that the latest quantum happenings are things you can access over the cloud without having to cool your data center to near-zero Kelvin temperatures. If you try any of these, DancingDinosaur would love to hear how it goes.

Alan Radding, a veteran information technology analyst, writer, and ghost-writer, is DancingDinosaur. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at http://technologywriter.com/.

IBM Rides Quantum Volume to Quantum Advantage

March 19, 2019

Recently IBM announced achieving its highest Quantum Volume to date. Of course, nobody else knows what Quantum Volume is. Quantum Volume is both a measurement and a procedure developed, no surprise here, by IBM to determine how powerful a quantum computer is. Read the May 4 announcement here.

Quantum Volume is not just about the number of qubits, although that is one part of it. It also accounts for gate and measurement errors, device crosstalk, device connectivity, and circuit compiler efficiency. According to IBM, the company has doubled the power of its quantum computers annually since 2017.

The upgraded processor will be available for use by developers, researchers, and programmers to explore quantum computing using a real quantum processor at no cost via the IBM Cloud. This offer has been out in various forms since May 2016 as IBM’s Q Experience.

Also announced was a new prototype of a commercial processor, which will be the core for the first IBM Q early-access commercial systems.  Dates have only been hinted at.

IBM's recently unveiled IBM Q System One quantum computer, with a fourth-generation 20-qubit processor, has achieved a Quantum Volume of 16, roughly double that of the current IBM Q 20-qubit devices, which have a Quantum Volume of 8.

The Quantum Volume math goes something like this: a variety of factors determine Quantum Volume, including the number of qubits, connectivity, and coherence time, plus gate and measurement errors, device crosstalk, and circuit compiler efficiency.
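One crude way to see how those factors interact (a toy model, not IBM's actual benchmark, and the error-rate figures are illustrative): the depth of circuit a device can run before errors swamp the result scales roughly as 1 over the qubit count times the effective error rate, and log2 of the Quantum Volume is capped by whichever is smaller, the width or that achievable depth.

```python
import math

def estimated_log2_qv(n_qubits, error_rate):
    """Toy estimate: achievable depth is roughly 1 / (n * error_rate),
    and log2(QV) is the largest square circuit the device supports,
    i.e. min(width, achievable depth). Illustrative only."""
    depth = 1.0 / (n_qubits * error_rate)
    return min(n_qubits, math.floor(depth))

# With 20 qubits and an effective error rate around 1 percent, the
# depth limit, not the qubit count, is what caps the Quantum Volume:
n = estimated_log2_qv(20, 0.01)
print(2 ** n)  # 32
```

This is why IBM obsesses over gate error rates: under a model like this, halving the error rate buys more Quantum Volume than adding qubits does.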

In addition to producing the highest Quantum Volume to date, IBM Q System One’s performance reflects some of the lowest error rates IBM has ever measured, with an average 2-qubit gate error less than 2 percent, and its best gate achieving less than a 1 percent error rate. To build a fully-functional, large-scale, universal, fault-tolerant quantum computer, long coherence times and low error rates are required. Otherwise how could you ever be sure of the results?

Quantum Volume is a fundamental performance metric that measures progress in the pursuit of Quantum Advantage, the Quantum Holy Grail—the point at which quantum applications deliver a significant, practical benefit beyond what classical computers alone are capable of. To achieve Quantum Advantage in the next decade, IBM believes that the industry will need to continue to double Quantum Volume every year.

Sounds like Moore’s Law all over again. IBM doesn’t deny the comparison. It writes: in 1965, Gordon Moore postulated that the number of components per integrated function would grow exponentially for classical computers. Jump to the new quantum era and IBM notes its Q system progress since 2017 presents a similar early growth pattern, supporting the premise that Quantum Volume will need to double every year and presenting a clear roadmap toward achieving Quantum Advantage.
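If Quantum Volume really does double annually, the Moore's-Law-style arithmetic is easy to sketch (purely illustrative, not an IBM projection):

```python
def projected_qv(start_qv, years_out):
    """Project Quantum Volume forward, assuming it doubles every year."""
    return start_qv * 2 ** years_out

# starting from IBM Q System One's Quantum Volume of 16 in 2019:
for year in range(2019, 2025):
    print(year, projected_qv(16, year - 2019))
```

As with Moore's Law, the exponential only holds as long as the engineering keeps up; the formula describes a roadmap, not a guarantee.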

Potential use cases, such as precisely simulating battery-cell chemistry for electric vehicles, speeding quadratic derivative models, and many others are already being investigated by IBM Q Network partners. To achieve Quantum Advantage in the 2020s, IBM believes the industry will need to continue doubling Quantum Volume every year.

In time, AI should play a role in expediting quantum computing. For that, researchers will need to develop more effective AI that can identify patterns in data otherwise invisible to classical computers.

Until then how should most data centers proceed? IBM researchers suggest 3 initial steps:

  1. Develop quantum algorithms that demonstrate how quantum computers can improve AI classification accuracy.
  2. Improve feature mapping to a scale beyond the reach of the most powerful classical computers.
  3. Classify data through the use of short-depth circuits, allowing AI applications in the NISQ (noisy intermediate-scale quantum) regime and a path forward to achieve Quantum Advantage for machine learning.
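The "feature mapping" in step 2 is the classical machine-learning idea of lifting data into a higher-dimensional space where classes become separable; a quantum feature map does the same thing in a Hilbert space too large for classical hardware to represent. A purely classical toy version, for intuition only (a quantum version would replace phi with a parameterized short-depth circuit and estimate the kernel by measurement):

```python
def phi(x):
    """Toy polynomial feature map: lift 1-D data into 2-D so a linear
    boundary can separate classes that are not linearly separable."""
    return (x, x * x)

def kernel(x, y):
    """Inner product in feature space: the quantity a quantum feature
    map would estimate with a short-depth circuit."""
    a, b = phi(x), phi(y)
    return a[0] * b[0] + a[1] * b[1]

# -1 and +1 are indistinguishable to a linear model in 1-D, but the
# feature map sends both to second coordinate 1 while 0 maps to 0,
# so a horizontal line now separates {-1, +1} from {0}:
print(phi(-1.0), phi(1.0), phi(0.0))
print(kernel(2.0, 3.0))  # 2*3 + 4*9 = 42
```

The quantum hope, per IBM's step 2, is that circuit-based feature maps reach embeddings no classical phi can compute efficiently.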

Sounds simple, right? Let DancingDinosaur know how you are progressing.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at technologywriter.com.
