Posts Tagged ‘qubits’

IBM Introduces 53 Qubit Quantum Machine

September 23, 2019

IBM made two major system announcements within just a couple of weeks: On Sept. 18 IBM announced a 53 qubit quantum machine. The week before, IBM introduced its latest mainframe, the z15. Already buzz is circulating about a z16 in two years, roughly the normal release cycle for the next generation of an IBM mainframe.

Quantum computer up close
IBM’s largest quantum machine at 53 qubits

Along with the 53 qubit machine, IBM announced the opening of a Quantum Computation Center in New York state. The new center, according to IBM, expands its fleet of quantum computing systems for commercial and research activity beyond the confines of experimental lab environments. IBM's cloud offerings range from 5-qubit to 20-qubit systems and now 53 qubits. These are actual quantum machines hosted by IBM in the cloud, not just simulations.

The IBM Quantum Computation Center will support the growing needs of a community of over 150,000 registered users and nearly 80 commercial clients, academic institutions, and research laboratories working to advance quantum computing and explore practical applications. To date, notes IBM, this global community has run more than 14 million experiments on IBM's quantum computers through the cloud since 2016 and published more than 200 scientific papers. To meet growing demand for access to real quantum hardware, ten quantum computing systems are now online through IBM's Quantum Computation Center. The fleet is composed of five 20-qubit systems, one 14-qubit system, and four 5-qubit systems. Five of the systems now have a Quantum Volume of 16 – IBM's measure of the power of a quantum computer – marking a new sustained performance milestone.

IBM’s quantum systems are optimized for the reliability and reproducibility of programmable multi-qubit operations. Due to these factors, the systems enable state-of-the-art quantum computational research with 95 percent availability, according to the company.

Within one month, IBM’s commercially available quantum fleet will grow to 14 systems, including the new 53-qubit quantum computer, the single largest universal quantum system made available for external access in the industry to date. The new system offers a larger lattice and gives users the ability to run even more complex entanglement and connectivity experiments. Industry observers note that serious work requires a minimum of 200 qubits, probably just a couple more product intros away. 

Advances in quantum computing could open the door to future scientific discoveries such as new medicines and materials, vast improvements in the optimization of supply chains, and new ways for computers to model financial data to make better investments. Examples of IBM's work with clients and partners include:

  • J.P. Morgan Chase and IBM posted on arXiv, Option Pricing using Quantum Computers, a methodology to price financial options and portfolios of such options on a gate-based quantum computer. The result is an algorithm that provides a quadratic speedup: where classical computers need millions of samples, this methodology requires only a few thousand samples to achieve the same result. That would let financial analysts perform option pricing and risk analysis in near real time. The implementation is available as open source in Qiskit Finance. A back-of-the-envelope illustration of that sampling gap follows this list.
  • Mitsubishi Chemical, Keio University and IBM simulated the initial steps of the reaction mechanism between lithium and oxygen in lithium-air batteries. Also available on arXiv,  this represents a first step in modeling the entire lithium-oxygen reaction on a quantum computer. Better understanding of this interaction could lead to more efficient batteries for mobile devices or automotive vehicles.
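To make the quadratic speedup in the J.P. Morgan work concrete, here is a back-of-the-envelope sketch in plain Python. The target accuracy is an assumption chosen purely for illustration; the point is that classical Monte Carlo error shrinks with the square root of the number of samples, while quantum amplitude estimation error shrinks roughly linearly with the number of samples, so reaching the same accuracy takes millions of samples classically but only thousands on the quantum side.

```python
import math

# Illustrative target: estimate an option price to within 0.1% error.
target_error = 1e-3  # assumed accuracy target, not a figure from the paper

# Classical Monte Carlo: error ~ 1/sqrt(N)  =>  N ~ 1/error^2
classical_samples = math.ceil(1 / target_error**2)

# Quantum amplitude estimation: error ~ 1/M  =>  M ~ 1/error
quantum_samples = math.ceil(1 / target_error)

print(f"classical samples: {classical_samples:,}")   # ~1,000,000
print(f"quantum samples:   {quantum_samples:,}")     # ~1,000
```

That millions-versus-thousands gap is the quadratic speedup described in the paper; the actual algorithm is what ships in Qiskit Finance.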

In the meantime IBM continues to simulate quantum algorithms on conventional supercomputers. According to one two-year-old report, at roughly 50 qubits existing methods for calculating quantum amplitudes require either too much computation to be practical, or more memory than is available on any existing supercomputer, or both. You can bet that IBM or somebody else will push beyond 53 qubits pretty quickly. Google already claims a 72-qubit device, but it hasn't let outsiders run programs on it. IBM has been making quantum machines available via the cloud since 2016. Other companies putting quantum computers in the cloud include Rigetti Computing and Canada's D-Wave.
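The memory wall that report alludes to is easy to reproduce with simple arithmetic: a full state-vector simulation has to store 2^n complex amplitudes, so every added qubit doubles the memory. A quick sketch, assuming 16 bytes per complex amplitude:

```python
def statevector_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    """Memory needed to hold the full 2**n state vector."""
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (30, 40, 50, 53):
    gib = statevector_bytes(n) / 2**30
    print(f"{n} qubits: {gib:,.0f} GiB")
# 30 qubits:          16 GiB  (a laptop can manage)
# 40 qubits:      16,384 GiB  (a large cluster)
# 50 qubits:  16,777,216 GiB  (~16 PiB, beyond any single machine)
# 53 qubits: 134,217,728 GiB  (~128 PiB)
```

By 50-plus qubits the state vector alone dwarfs the memory of any existing supercomputer, which is why IBM and Google are racing to put real hardware, not simulators, in the cloud.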

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at http://technologywriter.com/ 

IBM Advances Commercial Quantum Computing

August 7, 2019

The reason IBM and others are so eager for quantum computing is simple: money. Financial institutions model and trade instruments worth enormous sums – nearly $70 trillion last year, according to the World Bank – and quantum analytics promises to run those calculations faster and more accurately.

“These are enormous amounts of money,” says mathematician Cornelis Oosterlee of Centrum Wiskunde & Informatica, a national research institute in the Netherlands, in a piece in Wired magazine. “Some single trades involve numbers that are scary to imagine”—part of a company’s pension fund, say, or a university endowment, he continues.

Of course, this isn’t exactly new. Large organizations with access to huge amounts of resources devote inordinate quantities of those resources to predicting how much their assets will be worth in the future. If they could do this modeling faster, more accurately, or more efficiently, even just shaving off a few seconds here and there, well, you can do the arithmetic.

Today these calculations are expensive to run, requiring either an in-house supercomputer or two or a big chunk of cloud computing processors and time. But if or when quantum computing can deliver on some of its theoretical promise to drive these analyses faster, more accurately, more efficiently, and more cheaply, that's something IBM could build into the next generation of its systems.

And it is not just IBM. From Google on down to startups, developers are working on machines that could one day beat conventional computers at various tasks, such as classifying data through machine learning or inventing new drugs—and running complex financial calculations. In a step toward delivering on that promise, researchers affiliated with IBM and J.P. Morgan recently figured out how to run a simplified risk calculation on an actual quantum computer.

Using one of IBM’s machines, located in Yorktown Heights, New York, the researchers demonstrated they could simulate the future value of a financial product called an option. Currently, many banks use what’s called  the Monte Carlo method to simulate prices of all sorts of financial instruments. In essence, the Monte Carlo method models the future as a series of forks in the road. A company might go under; it might not. President Trump might start a trade war; he might not. Analysts estimate the likelihood of such scenarios, then generate millions of alternate futures at random. To predict the value of a financial asset, they produce a weighted average of these millions of possible outcomes.
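As a concrete, purely classical illustration of the Monte Carlo method described above, here is a minimal Python sketch that prices a European call option by averaging randomly generated future scenarios. The inputs are illustrative assumptions, not figures from the IBM/J.P. Morgan experiment:

```python
import math
import random

def monte_carlo_call_price(spot, strike, rate, vol, maturity, n_paths=100_000):
    """Price a European call by averaging simulated end-of-period payoffs."""
    total_payoff = 0.0
    for _ in range(n_paths):
        # One random "fork in the road": a lognormal end price for the asset.
        z = random.gauss(0.0, 1.0)
        end_price = spot * math.exp((rate - 0.5 * vol**2) * maturity
                                    + vol * math.sqrt(maturity) * z)
        total_payoff += max(end_price - strike, 0.0)
    # Average of the simulated outcomes, discounted back to today.
    return math.exp(-rate * maturity) * total_payoff / n_paths

# Illustrative inputs: $100 stock, $105 strike, 1% rate, 20% volatility, 1 year.
print(round(monte_carlo_call_price(100, 105, 0.01, 0.20, 1.0), 2))
```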

Quantum computers are particularly well suited to this sort of probabilistic calculation, says Stefan Woerner, who led the IBM team. Classical (or conventional) computers—the kind most of us use—are designed to manipulate bits. Bits are binary, having a value of either 0 or 1. Quantum computers, on the other hand, manipulate qubits, which represent an in-between state. A qubit is like a coin flipping in the air—neither heads nor tails, neither 0 nor 1 but some probability of being one or the other. And because a qubit has unpredictability built in, it promises to  be a natural tool for simulating uncertain outcomes.

Woerner and his colleagues ran their Monte Carlo calculations using three of the 20 qubits available on their quantum machine. The experiment was too simplistic to be useful commercially, but it’s a promising proof of concept; once bigger and smoother-running quantum computers are available, the researchers hope to execute the algorithm faster than conventional machines.

But this theoretical advantage is just that: theoretical. Existing machines remain too error-ridden to compute consistently. In addition, financial institutions already have ample computing power available, onsite or in the cloud, and they will have even more as graphics processing units (GPUs), which can execute many calculations in parallel, come online. A quantum computer might well be faster than an individual chip, but it's unclear whether it could beat a fleet of high performance GPUs in a supercomputer.

Still, it’s noteworthy that the IBM team was able to implement the algorithm on actual hardware, says mathematician Ashley Montanaro of the University of Bristol in the UK, who was not involved with the work. Academics first developed the mathematical proofs behind this quantum computing algorithm in 2000, but it remained a theoretical exercise for years. Woerner’s group took a 19-year-old recipe and figured out how to make it quantum-ready on actual quantum hardware.

Now they’re looking to improve their algorithm by using more qubits. The most powerful quantum computers today have fewer than 200 qubits; practitioners suggest it may take thousands to consistently beat conventional methods.

But demonstrations like Woerner’s, even with their limited scope, are useful in that they apply quantum computers to problems organizations actually want to solve. And that is what it will take if IBM expects to build quantum computing into a viable commercial business.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at technologywriter.com. 

Are Quantum Computers Even Feasible?

November 29, 2018

IBM has toned down its enthusiasm for quantum computing. Even last spring it already was backing off a bit at Think 2018. Now the company believes that quantum computing will augment classical computing, potentially opening doors that it once thought would remain locked indefinitely.

First IBM Q computation center

With its Bristlecone announcement Google trumped IBM with 72 qubits. Debating a few dozen qubits more or less may prove irrelevant. A number of quantum physics researchers have recently been publishing papers that suggest useful quantum computing may be decades away.

Mikhail Dyakonov makes the case in a piece titled The Case Against Quantum Computing, which appeared last month on IEEE Spectrum (spectrum.ieee.org). Dyakonov does research in theoretical physics at the Charles Coulomb Laboratory at the University of Montpellier, in France.

As Dyakonov explains: In quantum computing, the classical two-state circuit element (the transistor) is replaced by a quantum element called a quantum bit, or qubit. Like the conventional bit, it also has two basic states. But you already know this because DancingDinosaur covered it here and several times since.

But this is what you might not know: With the quantum bit, those two states aren’t the only ones possible. That’s because the spin state of an electron is described as a quantum-mechanical wave function. And that function involves two complex numbers, α and β (called quantum amplitudes), which, being complex numbers, have real parts and imaginary parts. Those complex numbers, α and β, each have a certain magnitude, and, according to the rules of quantum mechanics, their squared magnitudes must add up to 1.

Dyakonov continues: In contrast to a classical bit a qubit can be in any of a continuum of possible states, as defined by the values of the quantum amplitudes α and β. This property is often described by the statement that a qubit can exist simultaneously in both of its ↑ and ↓ states. Yes, quantum mechanics often defies intuition.
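Dyakonov's α and β are easy to see in a small sketch: a qubit state is just a pair of complex numbers whose squared magnitudes sum to 1, and a measurement collapses that continuum to a plain 0 or 1 with the corresponding probabilities. This is a plain-Python illustration of the bookkeeping, not a simulation of real hardware:

```python
import random

# One qubit state |psi> = alpha|0> + beta|1>, defined by two complex amplitudes.
alpha = complex(0.6, 0.0)
beta = complex(0.0, 0.8)

# Rule of quantum mechanics: the squared magnitudes must add up to 1.
assert abs(abs(alpha) ** 2 + abs(beta) ** 2 - 1.0) < 1e-12

# Measuring collapses the continuum of states to an ordinary bit.
p_zero = abs(alpha) ** 2
outcome = 0 if random.random() < p_zero else 1
print(f"P(0) = {p_zero:.2f}, P(1) = {1 - p_zero:.2f}, measured: {outcome}")
```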

So while IBM, Google, and other classical computer providers quibble about 50 qubits or 72 or even 500 qubits, to Dyakonov this misses the point. The real number of qubits needed will be astronomical, as he explains: experts estimate that the number of qubits needed for a useful quantum computer, one that could compete with your laptop in solving certain kinds of interesting problems, is between 1,000 and 100,000. So the number of continuous parameters describing the state of such a useful quantum computer at any given moment must be at least 2^1,000, which is to say about 10^300. That’s a very big number indeed; much greater than the number of subatomic particles in the observable universe.
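Dyakonov's arithmetic is easy to check: with 1,000 qubits the state is described by 2^1,000 amplitudes, and a two-line calculation confirms that is a number with roughly 300 digits.

```python
import math

# 1,000 qubits => 2**1000 quantum amplitudes describing the state.
print(1000 * math.log10(2))   # ~301.03, so 2**1000 is on the order of 10**300
print(len(str(2 ** 1000)))    # 302 decimal digits
```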

Just in case you missed the math, he repeats: A useful quantum computer [will] need to process a set of continuous parameters that is larger than the number of subatomic particles in the observable universe.

Before you run out to invest in a quantum computer with the most qubits you can buy you would be better served joining IBM’s Q Experience and experimenting with it on IBM’s nickel. Let them wrestle with the issues Dyakonov brings up.

Then, Dyakonov concludes: I believe that such experimental research is beneficial and may lead to a better understanding of complicated quantum systems.  I’m skeptical that these efforts will ever result in a practical quantum computer. Such a computer would have to be able to manipulate—on a microscopic level and with enormous precision—a physical system characterized by an unimaginably huge set of parameters, each of which can take on a continuous range of values. Could we ever learn to control the more than 10^300 continuously variable parameters defining the quantum state of such a system? My answer is simple. No, never.

I hope my high school science teacher who enthusiastically introduced me to quantum physics has long since retired or, more likely, passed on. Meanwhile, DancingDinosaur expects to revisit quantum regularly in the coming months or even years.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at technologywriter.com.

IBM Continues Quantum Push

June 8, 2018

IBM continued building out its Q Network ecosystem in May with the announcement of North Carolina State University as the first university-based IBM Q Hub in North America. As a hub, NC State will focus on accelerating industry collaborations, learning, skills development, and the implementation of quantum computing.

Scientists inside an open dilution fridge

NC State will work directly with IBM to advance quantum computing and industry collaborations, as part of the IBM Q Network’s growing quantum computing ecosystem. The school is the latest Q Network member. The network consists of individuals and organizations, including scientists, engineers, and business leaders, along with forward thinking companies, academic institutions, and national research labs enabled by IBM Q. Its mission: advancing quantum computing and launching the first commercial applications.

This past November IBM announced a 50-qubit system. Shortly after, Google announced Bristlecone, a 72-qubit device that tops IBM's count for now. However, qubit count may not be the most important metric to focus on.

Stability, rather than the number of qubits, may be the more important metric. The big challenge today is the instability of qubits. To keep qubit machines stable enough, the systems must hold their processors at extremely cold temperatures, near absolute zero, and protect them from external shocks. This is not something you want to build into a laptop or even a desktop. Instability leads to inaccuracy, which defeats the whole purpose. Even accidental sounds can cause the computer to make mistakes. For minimally acceptable error rates, quantum systems need an error rate of less than 0.5 percent for every two qubits. To drive down the error rate of any qubit processor, engineers must figure out how software, control electronics, and the processor itself can work together without introducing errors.
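To see why that 0.5 percent two-qubit error rate matters, here is a rough illustration of how quickly errors compound as circuits grow. The gate counts are arbitrary assumptions; the per-gate error rate is the threshold cited above, and errors are assumed to be independent:

```python
# Probability a circuit finishes with no error if each two-qubit gate
# fails independently with probability p.
p = 0.005  # 0.5 percent per two-qubit operation (the threshold cited above)

for gates in (10, 100, 500, 1000):
    success = (1 - p) ** gates
    print(f"{gates:4d} gates -> {success:.1%} chance of an error-free run")
# 10 gates -> ~95%, 100 gates -> ~61%, 500 gates -> ~8%, 1000 gates -> ~0.7%
```

Even at that threshold, a circuit of a few hundred two-qubit operations succeeds only a small fraction of the time, which is why stability matters more than raw qubit counts.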

50 qubits is currently considered the minimum number for serious business work. IBM’s November announcement, however, was quick to point out that this “does not mean quantum computing is ready for common use.” The system IBM developed remains extremely finicky and challenging to use, as are those being built by others. In its 50-qubit system, the quantum state is preserved for 90 microseconds—a record for the industry but still an extremely short period of time.

Nonetheless, 50 qubits have emerged as the minimum number for a (relatively) stable system to perform practical quantum computing. According to IBM, a 50-qubit machine can do things that are extremely difficult to even simulate with the fastest conventional system.

Today, IBM offers the public the IBM Q Experience, which provides access to 5- and 16-qubit systems, and the open quantum software development kit, QISKit, perhaps the first quantum SDK. To date, more than 80,000 users of the IBM Q Experience have run more than 4 million experiments and generated more than 65 third-party research articles.
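For readers curious what a QISKit program looks like, here is a minimal sketch that entangles two qubits into a Bell state and inspects the resulting probabilities. It is written against a recent version of Qiskit; the API has changed considerably since the 2018-era SDK described here.

```python
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

# Build a two-qubit Bell state: Hadamard on qubit 0, then CNOT 0 -> 1.
qc = QuantumCircuit(2)
qc.h(0)
qc.cx(0, 1)

# Inspect the resulting state and its measurement probabilities.
state = Statevector.from_instruction(qc)
print(state.probabilities_dict())   # {'00': 0.5, '11': 0.5}
```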

Still, don’t expect to pop a couple of quantum systems into your data center. For the immediate future, the way to access and run qubit systems is through the cloud. IBM has put qubit systems in the cloud, where they are available to participants in its Q Network and Q Experience.

IBM has also put some of its conventional systems, like the Z, in the cloud. This raises some interesting possibilities. If IBM has both quantum and conventional systems in the cloud, can the results of one be accessed or somehow shared with the other? Hmm, DancingDinosaur posed that question to IBM managers earlier this week at a meeting in North Carolina (NC State, are you listening?).

The IBMers acknowledged the possibility although in what form and what timeframe wasn’t even at the point of being discussed. Quantum is a topic DancingDinosaur expects to revisit regularly in the coming months or even years. Stay tuned.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog. See more of his work at technologywriter.com and here.

IBM Moves Quantum Computing Toward Commercial Systems

September 20, 2017

IBM seems determined to advance quantum computing. Just this week IBM announced that its researchers developed a new approach to simulate molecules on a quantum computer, one that may one day help revolutionize chemistry and materials science. In this case, the researchers implemented a novel algorithm that is efficient with respect to the number of quantum operations required for the simulation. The work involved a 7-qubit processor.

7-qubit processor

In the diagram above IBM scientists successfully used six qubits on a purpose-built seven-qubit quantum device to address the molecular structure problem for beryllium hydride (BeH2) – the largest molecule simulated on a quantum computer to date.

Back in May IBM announced an even bigger quantum device: the first prototype commercial processor with 17 qubits, which leverages significant materials, device, and architecture improvements to make it the most powerful quantum processor IBM has created to date. This week’s announcement certainly didn’t surpass it in size. IBM engineered the 17-qubit system to be at least twice as powerful as what is available today to the public on the IBM Cloud, and it will be the basis for the first IBM Q early-access commercial systems.

It has become apparent to the scientists and researchers who try to work with complex mathematical problems and simulations that the most powerful conventional commercial computers are not up to the task. Even the z14 with its 10-core CPU and hundreds of additional processors dedicated to I/O cannot do the job.

As IBM puts it: Even today’s most powerful supercomputers cannot exactly simulate the interacting behavior of all the electrons contained in a simple chemical compound such as caffeine. The ability of quantum computers to analyze molecules and chemical reactions could help accelerate research and lead to the creation of novel materials, development of more personalized drugs, or discovery of more efficient and sustainable energy sources.

The interplay of atoms and molecules is responsible for all matter that surrounds us in the world. Now “we have the potential to use quantum computers to boost our knowledge of natural phenomena in the world,” said Dario Gil, vice president of AI research and IBM Q, IBM Research. “Over the next few years, we anticipate IBM Q systems’ capabilities to surpass what today’s conventional computers can do, and start becoming a tool for experts in areas such as chemistry, biology, healthcare and materials science.”

So commercial quantum systems are coming. Are you ready to bring a quantum system into your data center? Actually, you can try one today for free here or through GitHub, which offers a Python software development kit for writing quantum computing experiments, programs, and applications. Although DancingDinosaur will gladly stumble through conventional coding, quantum computing probably exceeds his frustration level, even with a Python development kit.

However, if your organization is involved in these industries—materials science, chemistry, and the like or is wrestling with a problem you cannot do on a conventional computer—it probably is worth a try, especially for free. You can try an easy demo card game that compares quantum computing with conventional computing.

But as reassuring as IBM makes quantum computing sound, don’t kid yourself; it is very complicated. Deploying even a small qubit machine is not going to be like buying your first PC. Quantum bits, reportedly, are very fragile or transitory. Labs keep them very cold just to stabilize the system and keep them from switching states before they should. Just think how you’d feel about your PC if the bit states of 0 and 1 suddenly and inexplicably changed.

That’s not the only possible headache. You have only limited time to work on qubits, given their current volatility when not supercooled. Also, work is still progressing on advancing the quantum frameworks and mapping out ecosystem enablement.

Even IBM researchers admit that some problems may not run better on quantum computers. Until a workload passes a certain threshold, such as quantum volume, it might not perform better on a quantum computer at all. The IBM quantum team suggests it will take until 2021 to consistently solve a problem that has commercial relevance using quantum computing.

Until then, and even after, IBM is talking about a hybrid approach in which parts of a problem are solved with a quantum computer and the rest with a conventional system. So don’t plan on replacing your Z with a few dozen or even hundreds of qubits anytime soon.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Please follow DancingDinosaur on Twitter, @mainframeblog. See more of his IT writing at technologywriter.com and here.

 

IBM Systems Sets 2016 Priorities

December 14, 2015

Despite its corporate struggles, IBM Systems, the organization that replaced IBM Systems and Technology Group (IBM STG), had a pretty good year in 2015. It started the year by launching the z13, which was optimized for the cloud and mobile economy. No surprise there; IBM made no secret that cloud, mobile, and analytics were its big priorities. Over the year it also added cognitive computing and software defined storage to its priorities.

But that list might leave out its biggest achievement of 2015. This week IBM announced that its scientists received a major multi-year research grant to advance the building blocks for a universal quantum computer. The award was made by the U.S. Intelligence Advanced Research Projects Activity (IARPA) program. This may not come to commercial fruition in our working lives, but it has the potential to change computing more radically than anything we have yet envisioned. And it certainly will put a different spin on worries about Moore’s Law.

Three Types of Quantum Computing

Right now, according to IBM, the workhorse of the quantum computer is the quantum bit (qubit). Many scientists are tackling the challenge of building qubits, but quantum information is extremely fragile and requires special techniques to preserve the quantum state. This fragility of qubits played a key part in one of the preposterous but exciting plots on the TV show Scorpion. The major hurdles include creating qubits of high quality and packaging them together in a scalable form so they can perform complex calculations in a controllable way – limiting the errors that can result from heat and electromagnetic radiation.

IBM scientists made a great stride in that direction earlier this year by demonstrating critical breakthroughs in detecting quantum errors by combining superconducting qubits in lattices on computer chips, using a quantum circuit design that IBM says is the only physical architecture that can scale to larger dimensions.

To return to a more mundane subject, revenue: during 2015 DancingDinosaur reported the positive contributions the z System made to IBM’s revenue, one of the company’s few positive revenue performers. It turns out DancingDinosaur missed one contributor, since it doesn’t track constant currency. If you look at constant currency, which smooths out fluctuations in currency valuations, IBM Power Systems has been on an upswing for the last three quarters: up 1% in Q1, up 5% in Q2, and up 2% in Q3. DancingDinosaur expects both z and Power to contribute to IBM revenue in upcoming quarters.

Looking ahead to 2016, IBM identified the following priorities:

  • Develop an API ecosystem that monetizes big data and cognitive workloads, built on the cloud as part of becoming a better service provider.
  • Win the architectural battle with OpenPOWER and POWER8 – designed for data and the cognitive era. (Unspoken, beat x86.)
  • Extend z Systems for new mobile, cloud and in-line analytics workloads.
  • Capture new developers, markets and buyers with open innovation on IBM LinuxONE, the most advanced and trusted enterprise Linux system.
  • Shift the IBM storage portfolio to a Flash and the software defined model that disrupts the industry by enabling new workloads, very high speed, and data virtualization for improved data economics.
  • Engage clients through a digital-first Go-to-Market model.

These are all well and good. About the only thing missing is any mention of the IBM Open Mainframe Project, announced in August as a partnership with the Linux Foundation. DancingDinosaur is still hoping it will generate the kind of innovative products for the z that the OpenPOWER initiative has started to produce; DancingDinosaur covered that announcement here. Hope they haven’t given up already. Just have to remind myself to be patient; it took about a year to start getting tangible results from the OpenPOWER consortium.

DancingDinosaur is Alan Radding, a veteran information technology analyst and writer. Please follow DancingDinosaur on Twitter, @mainframeblog. See more of his IT writing at technologywriter.com and here.

Expect this to be the final DancingDinosaur for 2015.  Be back the week of Jan. 4

