Posts Tagged ‘Quantum computing’

IBM Moves Quantum Computing Toward Commercial Systems

September 20, 2017

IBM seems determined to advance quantum computing. Just this week IBM announced that its researchers have developed a new approach to simulating molecules on a quantum computer, one that may one day help revolutionize chemistry and materials science. In this case the researchers implemented a novel algorithm that is efficient in the number of quantum operations required for the simulation, running it on a 7-qubit processor.

7-qubit processor

In the diagram above, IBM scientists successfully used six qubits on a purpose-built seven-qubit quantum device to address the molecular structure problem for beryllium hydride (BeH2) – the largest molecule simulated on a quantum computer to date.

Back in May, IBM announced an even bigger quantum device: a prototype of its first commercial processor, with 17 qubits, which leverages significant materials, device, and architecture improvements to make it the most powerful quantum processor IBM has created to date. This week's announcement certainly didn't surpass it in size. IBM engineered the 17-qubit system to be at least twice as powerful as what is available to the public today on the IBM Cloud, and it will be the basis for the first IBM Q early-access commercial systems.

It has become apparent to the scientists and researchers who try to work with complex mathematical problems and simulations that the most powerful conventional commercial computers are not up to the task. Even the z14 with its 10-core CPU and hundreds of additional processors dedicated to I/O cannot do the job.

As IBM puts it: Even today’s most powerful supercomputers cannot exactly simulate the interacting behavior of all the electrons contained in a simple chemical compound such as caffeine. The ability of quantum computers to analyze molecules and chemical reactions could help accelerate research and lead to the creation of novel materials, development of more personalized drugs, or discovery of more efficient and sustainable energy sources.

The interplay of atoms and molecules is responsible for all matter that surrounds us in the world. Now “we have the potential to use quantum computers to boost our knowledge of natural phenomena in the world,” said Dario Gil, vice president of AI research and IBM Q, IBM Research. “Over the next few years, we anticipate IBM Q systems’ capabilities to surpass what today’s conventional computers can do, and start becoming a tool for experts in areas such as chemistry, biology, healthcare and materials science.”

So commercial quantum systems are coming. Are you ready to bring a quantum system into your data center? Actually, you can try one today for free here or through GitHub, which offers a Python software development kit for writing quantum computing experiments, programs, and applications. Although DancingDinosaur will gladly stumble through conventional coding, quantum computing probably exceeds his frustration level, even with a Python development kit.
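For readers wondering what a quantum computing "experiment" actually computes, here is a minimal sketch in plain NumPy (my own illustration, not code from IBM's SDK) that simulates the simplest case: a single qubit put into superposition by a Hadamard gate.

```python
import numpy as np

# A qubit's state is a 2-element complex vector; |0> is [1, 0].
ket0 = np.array([1, 0], dtype=complex)

# The Hadamard gate puts a basis state into an equal superposition.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ ket0

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(state) ** 2
print(probs)  # [0.5 0.5] -- a fair coin flip until you measure
```

A real SDK targets actual quantum hardware, of course; the point here is only that the math behind a small circuit is approachable even for a conventional coder.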

However, if your organization works in one of these industries (materials science, chemistry, and the like) or is wrestling with a problem you cannot solve on a conventional computer, it is probably worth a try, especially for free. You can try an easy demo card game that compares quantum computing with conventional computing.

But as reassuring as IBM makes quantum computing sound, don't kid yourself; it is very complicated. Deploying even a small qubit machine is not going to be like buying your first PC. Quantum bits, reportedly, are fragile and transitory. Labs keep them very cold just to stabilize the system and keep them from switching states before they should. Just think how you'd feel about your PC if the bit states of 0 and 1 suddenly and inexplicably changed.

That’s not the only possible headache. You have only limited time to work with qubits, given their volatility when not supercooled. Also, work is still progressing on advancing the quantum frameworks and mapping out ecosystem enablement.

Even IBM researchers admit that some problems may not run better on quantum computers. Until your workload passes certain thresholds, such as qubit volume, it might not perform better on a quantum computer at all. The IBM quantum team suggests it will take until 2021 to consistently solve a commercially relevant problem using quantum computing.

Until then, and even after, IBM is talking about a hybrid approach in which parts of a problem are solved with a quantum computer and the rest with a conventional system. So don’t plan on replacing your Z with a few dozen or even hundreds of qubits anytime soon.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Please follow DancingDinosaur on Twitter, @mainframeblog. See more of his IT writing at technologywriter.com and here.

 

IBM Introduces First Universal Commercial Quantum Computers

March 9, 2017

A few years ago DancingDinosaur first encountered the possibility of quantum computing. It was presented as a real but distant possibility. This is not something I need to consider, I thought at the time. By the time it is available commercially, I will be long retired and probably six feet under. Well, I was wrong.

This week IBM unveiled its IBM Q quantum systems. IBM Q joins Watson and blockchain in delivering the most advanced set of services on the IBM Cloud platform. Organizations are using it now, and DancingDinosaur is still alive and working.

IBM Quantum Computing scientists Hanhee Paik (left) and Sarah Sheldon (right) examine the hardware inside an open dilution fridge at the IBM Q Lab

As IBM explains: While technologies that currently run on classical (or conventional) computers, such as Watson, can help find patterns and insights buried in vast amounts of existing data, quantum computers will deliver solutions to multi-faceted problems where patterns cannot be seen because the data doesn’t exist and the possibilities that you need to explore are too enormous to ever be processed by conventional computers.

Just don’t retire your z or Power system in favor of an IBM Q yet. As IBM explained at a recent briefing on quantum computing, the IBM Q universal quantum computers will be able to handle any type of problem that conventional computers do today. However, many of today’s workloads, like online transaction processing, data storage, and web serving, will continue to run more efficiently on conventional systems. The most powerful quantum systems of the next decade will be hybrids of quantum computers with conventional computers, the latter controlling logic and operations on large amounts of data.

The most immediate use cases will involve molecular dynamics, drug design, and materials. The new quantum machine, for example, will allow the healthcare industry to design more effective drugs faster and at less cost and the chemical industry to develop new and improved materials.

Another familiar use case revolves around optimization in finance and manufacturing. The problem here comes down to computers struggling with optimization involving an exponential number of possibilities. Quantum systems, noted IBM, hold the promise of more accurately finding the most profitable investment portfolio in the financial industry, the most efficient use of resources in manufacturing, and optimal routes for logistics in the transportation and retail industries.

To refresh the basics of quantum computing: the challenges invariably entail exponential scale, and two basic ideas underpin it all: 1) the uncertainty principle, which states that attempting to observe a quantum state generally disturbs it while yielding only partial information about the state; and 2) entanglement, in which two systems exist in a joint state that cannot be explained by supposing each has some state of its own. No more just zero or one.
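To make the entanglement idea concrete, here is a small NumPy sketch (again my own illustration, not IBM's code) that builds the textbook two-qubit Bell state: a Hadamard gate on one qubit followed by a CNOT. Measuring it yields 00 or 11, each with probability one half, and neither qubit can be described on its own.

```python
import numpy as np

# Two-qubit basis state |00> as a 4-element state vector.
state = np.array([1, 0, 0, 0], dtype=complex)

H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)
I = np.eye(2, dtype=complex)

# CNOT flips the second qubit exactly when the first qubit is 1.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Hadamard on qubit 1, then CNOT: the Bell state (|00> + |11>) / sqrt(2).
bell = CNOT @ np.kron(H, I) @ state

probs = np.abs(bell) ** 2
print(probs)  # ~[0.5, 0, 0, 0.5]: only 00 and 11 occur, perfectly correlated
```

The outcomes of the two qubits are perfectly correlated even though each one, observed alone, looks like a coin flip — the behavior that no "each system has its own state" story can reproduce.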

The basic unit of quantum computing is the qubit. Today IBM is making available a 5-qubit system, which is pretty small in the overall scheme of things, but large enough to experiment with and to test some hypotheses. Things start getting interesting at 20 qubits. An inflection point, IBM researchers noted, occurs around 50 qubits. At 50-100 qubits people can begin to do some serious work.
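One way to see why roughly 50 qubits marks an inflection point: a conventional computer simulating n qubits must track 2^n complex amplitudes, so the memory needed doubles with every qubit added. A quick back-of-envelope calculation (my own arithmetic, not an IBM figure):

```python
# Simulating n qubits classically means tracking 2**n complex amplitudes.
# At 16 bytes per complex number, memory doubles with every added qubit:
for n in (5, 20, 50):
    amplitudes = 2 ** n
    gib = amplitudes * 16 / 2 ** 30  # gibibytes needed for the state vector
    print(f"{n} qubits: 2^{n} amplitudes, ~{gib:.6g} GiB")
# 50 qubits alone would need about 16 million GiB (16 PiB) just to hold
# the state, which is why classical simulation runs out of road there.
```

Past that point, a quantum device is no longer something a supercomputer can straightforwardly emulate.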

This past week IBM announced three quantum computing advances: the release of a new API for the IBM Quantum Experience that enables developers and programmers to begin building interfaces between IBM’s existing 5 qubit cloud-based quantum computer and conventional computers, without needing a deep background in quantum physics. You can try the 5 qubit quantum system via IBM’s Quantum Experience on Bluemix here.

IBM also released an upgraded simulator on the IBM Quantum Experience that can model circuits with up to 20 qubits. In the first half of 2017, IBM plans to release a full SDK on the IBM Quantum Experience for users to build simple quantum applications and software programs. For now only the publicly available 5-qubit quantum system with a web-based graphical user interface is offered; it will soon be upgraded with more qubits.

The IBM Research Frontiers Institute allows participants to explore applications for quantum computing in a consortium dedicated to making IBM’s most ambitious research available to its members.

Finally, the IBM Q Early Access Systems program allows the purchase of access to a dedicated quantum system hosted and managed by IBM. The initial system offers 15+ qubits, with a roadmap promising a fast path to 50+ qubits.

“IBM has invested over decades to growing the field of quantum computing and we are committed to expanding access to quantum systems and their powerful capabilities for the science and business communities,” said Arvind Krishna, senior vice president of Hybrid Cloud and director for IBM Research. “We believe that quantum computing promises to be the next major technology that has the potential to drive a new era of innovation across industries.”

Are you ready for quantum computing? Try it today on IBM’s Quantum Experience through Bluemix. Let me know how it works for you.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Please follow DancingDinosaur on Twitter, @mainframeblog. See more of his IT writing at technologywriter.com and here.

IBM Systems Sets 2016 Priorities

December 14, 2015

Despite its corporate struggles, IBM Systems, the organization that replaced IBM Systems and Technology Group (IBM STG), had a pretty good year in 2015. It started the year by launching the z13, which was optimized for the cloud and mobile economy. No surprise there; IBM made no secret that cloud, mobile, and analytics were its big priorities. Over the year it also added cognitive computing and software-defined storage to the list.

But it might have left out its biggest achievement of 2015. This week IBM announced that its scientists received a major multi-year research grant to advance the building blocks for a universal quantum computer. The award was made by the U.S. Intelligence Advanced Research Projects Activity (IARPA) program. This may not come to commercial fruition in our working lives, but it has the potential to radically change computing as we have ever envisioned it. And it certainly will put a different spin on worries about Moore’s Law.

Three Types of Quantum Computing

Right now, according to IBM, the workhorse of the quantum computer is the quantum bit (qubit). Many scientists are tackling the challenge of building qubits, but quantum information is extremely fragile and requires special techniques to preserve the quantum state. This fragility of qubits played a key part in one of the preposterous but exciting plots on the TV show Scorpion. The major hurdles include creating qubits of high quality and packaging them together in a scalable form so they can perform complex calculations in a controllable way – limiting the errors that can result from heat and electromagnetic radiation.

IBM scientists made a great stride in that direction earlier this year by demonstrating critical breakthroughs in detecting quantum errors by combining superconducting qubits in lattices on computer chips – a quantum circuit design that is the only physical architecture able to scale to larger dimensions.

To return to a more mundane subject, revenue, during 2015 DancingDinosaur reported the positive contributions the z System made to IBM’s revenue, one of the company’s few positive revenue performers. Turned out DancingDinosaur missed one contributor since it doesn’t track constant currency. If you look at constant currency, which smooths out fluctuations in currency valuations, IBM Power Systems have been on an upswing for the last 3 quarters: up 1% in Q1, up 5% in Q2, up 2% in Q3.   DancingDinosaur expects both z and Power to contribute to IBM revenue in upcoming quarters.

Looking ahead to 2016, IBM identified the following priorities:

  • Develop an API ecosystem that monetizes big data and cognitive workloads, built on the cloud as part of becoming a better service provider.
  • Win the architectural battle with OpenPOWER and POWER8 – designed for data and the cognitive era. (Unspoken: beat x86.)
  • Extend z Systems for new mobile, cloud, and in-line analytics workloads.
  • Capture new developers, markets, and buyers with open innovation on IBM LinuxONE, the most advanced and trusted enterprise Linux system.
  • Shift the IBM storage portfolio to Flash and a software-defined model that disrupts the industry by enabling new workloads, very high speed, and data virtualization for improved data economics.
  • Engage clients through a digital-first go-to-market model.

These are all well and good. About the only thing missing is any mention of the IBM Open Mainframe Project, announced in August as a partnership with the Linux Foundation. DancingDinosaur covered that announcement here and is still hoping it will generate the kind of innovative products for the z that the OpenPOWER initiative has started to produce. Hope they haven’t given up already; just have to remember to be patient, since it took about a year to start getting tangible results from the OpenPOWER consortium.

DancingDinosaur is Alan Radding, a veteran information technology analyst and writer. Please follow DancingDinosaur on Twitter, @mainframeblog. See more of his IT writing at technologywriter.com and here.

Expect this to be the final DancingDinosaur for 2015. Be back the week of Jan. 4.

IBM Edge Rocks 6000 Strong for Digital Transformation

May 15, 2015

Unless you’ve been doing the Rip Van Winkle thing, you have to have noticed that a profound digital transformation is underway, fueled in this case from the bottom. “This is being driven by people embracing technology,” noted Tom Rosamilia, Senior Vice President, IBM Systems. And it will only get greater with quantum computing, a peek into which was provided at Edge2015 by Arvind Krishna, senior vice president and director, IBM Research.


(Quantum computing, courtesy of IBM)

Need proof? Just look around. New cars are now hot spots, and it’s not just luxury cars. Retailers are adding GPS inside their stores and using it to follow and understand the movement of shoppers in real time. Eighty-two percent of millennials do their banking from their mobile phones. As Rosamilia noted, it amounts to “an unprecedented digital disruption” in the way people go about their lives. Dealing with this digital transformation and the challenges and opportunities it presents was what IBM Edge 2015 was about. With luck you can check out much from Edge2015 at the media center here.

The first day began with a flurry of product announcements, starting with a combined package of new servers and storage software and solutions aimed at accelerating the development of hybrid cloud computing. Hybrid cloud was big at Edge2015. To further stimulate it, IBM introduced new flexible software licensing of its middleware to help companies speed their adoption of hybrid cloud environments.

Joining in the announcement was Rocket Software, which sponsored the entertainment, including the outstanding Grace Potter concert. As for Rocket’s actual business, the company announced Rocket Data Access Service on Bluemix for z Systems, intended to provide companies a simplified connection to data on the IBM z Systems mainframe for development of mobile applications through Bluemix. Starting in June, companies can access a free trial of the service, which works with a range of database storage systems, including VSAM, ADABASE, IMS, CICS, and DB2, and enables access through common mobile application interfaces, including MongoDB, JDBC, and the REST protocol.  Now z shops have no excuse not to connect their systems with mobile and social business.

New systems also grabbed the spotlight. IBM introduced the IBM Power System E850, a four-socket server with flexible capacity and up to 70% guaranteed utilization. The E850 targets cloud service providers and medium or large enterprises looking to securely and efficiently deploy multi-tenancy workloads while speeding access to data through larger in-memory databases with up to 4TB of installed memory.

The IBM Power System E880, designed to scale to 192 cores, is suitable for IBM DB2 with BLU Acceleration, enhancing the efficiency of cloud deployments; and the PurePOWER System, a converged infrastructure for cloud. It is intended to help deliver insights via the cloud, and is managed with OpenStack.

The company will also be shipping IBM Spectrum Control Storage Insights, a new software-defined storage offering that provides data management as a hybrid cloud service to optimize on-premises storage infrastructures. Storage Insights is designed to simplify storage management by improving storage visibility while applying analytics to ease capacity planning, enhance performance monitoring, and improve storage utilization. It does this by reclaiming under-utilized storage. Thank you, analytics.

Finally for storage, the company announced IBM XIV GEN 3, designed for cloud with real-time compression that enables scaling as demand for data storage capacity expands. You can get more details on all the announcements at Edge 2015 here.

Already announced is IBM Edge 2016, again at the Venetian in Las Vegas in October 2016. That gives IBM 18 months to pack it with even more advances. Doubt there will be a new z by then; a new business class version of the z13 is more likely.

DancingDinosaur will take up specific topics from Edge2015 in the coming week. These will include social business on z, real-time analytics on z, and Jon Toigo sorting through the hype on SDS.

DancingDinosaur is Alan Radding, a veteran IT analyst and writer. Please follow DancingDinosaur on Twitter, @mainframeblog. See more of his IT writing on Technologywriter.com and here.

