Posts Tagged ‘Quantum computing’

IBM Leverages Strategic Imperatives to Win in Cloud

March 16, 2018

Some people may have been ready to count out IBM in the cloud. The company, however, is clawing its way back into contention faster than many imagined. In a recent Forbes Magazine piece, IBM credits 16,000 AI engagements, 400 blockchain engagements, and a couple of quantum computing pilots as driving its return as a serious cloud player.

IBM uses blockchain to win the cloud

According to Forbes, IBM has jumped up to third in cloud revenue with $17 billion, ranking behind Microsoft with $18.6 billion and Amazon with $17.5 billion. Among other big players, Google comes in seventh with $3 billion.

In the esoteric world of quantum computing, IBM is touting live projects underway with JPMorgan Chase, Daimler, and others. Bob Evans, a respected technology writer and now the principal of Evans Strategic Communications, notes that the latest numbers “underscore not only IBM’s aggressive moves into enterprise IT’s highest-potential markets,” but also the legitimacy of the company’s claims that it has joined the top ranks of the competitive cloud-computing marketplace alongside Microsoft and Amazon.

As reported in the Forbes piece, CEO Ginni Rometty, speaking at a quarterly analyst briefing, made the point that while IBM has a considerable presence in the public-cloud IaaS market because many of its clients require or desire it, the company intends to greatly differentiate itself from the big IaaS providers via higher-value technologies such as AI, blockchain, cybersecurity, and analytics. These are the areas that Evans sees as driving IBM into the cloud’s top tier.

Rometty continued: “I think you know that for us the cloud has never been about having Infrastructure-as-a-Service-only as a public cloud, or a low-volume commodity cloud. Frankly, Infrastructure-as-a-Service is almost just a dial tone. For us, it’s always been about a cloud that is going to be enterprise-strong and of which IaaS is only a component.”

In the Forbes piece she then laid out four strategic differentiators for the IBM Cloud, which in 2017 accounted for 22% of IBM’s revenue:

  1. The IBM Cloud is built for “data and applications anywhere,” Rometty said. “When we say you can do data and apps anywhere, it means you have a public cloud, you have private clouds, you have on-prem environments, and then you have the ability to connect not just those but also to other clouds. That is what we have done—all of those components.”
  2. The IBM Cloud is “infused with AI,” she continued, alluding to how most of the 16,000 AI engagements also involve the cloud. She cited four of the most popular ways in which customers are using AI: customer service, enhancing white-collar work, risk and compliance, and HR.
  3. For securing the cloud, IBM opened more than 50 cybersecurity centers around the world to ensure “the IBM Cloud is secure to the core,” Rometty noted.
  4. “And perhaps this is the most important differentiator—you have to be able to extend your cloud into everything that’s going to come down the road, and that could well be more cyber analytics, but it is definitely blockchain, and it is definitely quantum because that’s where a lot of new value is going to reside.”

You have to give Rometty credit: She bet big that IBM’s strategic imperatives, especially blockchain and, riskiest of all, quantum computing, would eventually pay off. The company had long realized it couldn’t compete in high-volume, low-margin businesses. She made her bet on what IBM does best, advanced research, and stuck with it. During those 22 consecutive quarters of declining revenue she stayed the course and didn’t publicly question the decision.

As Forbes observed, in quantum IBM is leveraging its first-mover status and has moved far beyond theoretical proposals. “We are the only company with a 50-qubit system that is actually working—we’re not publishing pictures or photos of what it might look like, or writings that say if there is quantum, we can do it—rather, we are scaling rapidly and we are the only one working with clients in development working on our quantum,” Rometty said.

IBM’s initial forays into commercial quantum computing are just getting started: JPMorgan Chase is working on risk and portfolio optimization using IBM quantum computing; Daimler is using IBM’s quantum technology to explore new approaches to logistics and self-driving car routes; and JSR is doing computational chemistry to create entirely new materials. For none of these is payback right around the corner. As DancingDinosaur wrote just last week, progress with quantum has been astounding, but much remains to be done to get a functioning ecosystem in place to support the commercialization of quantum computing for business on a large scale.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog. See more of his work here.

The Rush to Quantum Computing

March 9, 2018

Are you excited about quantum computing? Are you taking steps to get ready for it? Do you have an idea of what you would like to do with quantum computing or a plan for how to do it? Except for the most science-driven organizations or those with incomprehensibly complex challenges to solve, DancingDinosaur can’t imagine this is the most pressing IT issue you are facing today.

Yet leading IT vendors are making astounding gains, moving quantum computing forward further and faster than the industry was projecting even a few months ago. This past November IBM announced a 50-qubit system. Earlier this month Google announced Bristlecone, a 72-qubit system that trumps IBM, for now. However, qubit count may not be the most important metric to focus on.

Never heard of quantum supremacy? You are going to hear a lot about it in the coming weeks, months, and even years as the vendors battle for the quantum supremacy title. Here is how Wikipedia defines it: quantum supremacy is the potential ability of quantum computing devices to solve problems that classical computers cannot. In computational complexity-theoretic terms, this generally means providing a super-polynomial speedup over the best known or possible classical algorithm. If this doesn’t send you racing to dig out your old college math book, you were a better student than DancingDinosaur. In short, supremacy means beating the current best conventional algorithms, and not just beating them; you have to do it using less energy, in less time, or in some other way that demonstrates your approach’s advantage.
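To make “super-polynomial speedup” concrete, here is a toy Python sketch. It illustrates growth rates only; the n^3 and 2^n cost curves are arbitrary stand-ins, not measurements of any real algorithm.

```python
# Toy illustration of "super-polynomial speedup": compare a polynomial cost
# curve (a stand-in for a quantum algorithm) with an exponential one (a
# stand-in for the best classical algorithm). The widening gap is the point.
for n in (10, 20, 30, 40, 50):
    polynomial = n ** 3
    exponential = 2 ** n
    print(f"n={n:2d}  n^3={polynomial:>9,}  2^n={exponential:>20,}")
```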

The issue revolves around the instability of qubits; the hardware needs to be engineered to keep them stable. Industry sources note that quantum computers need to keep their processors extremely cold (a fraction of a degree above absolute zero) and protect them from external shocks. Even accidental sounds can cause the computer to make mistakes. To operate in even remotely real-world settings, quantum processors also need an error rate of less than 0.5 percent for every two qubits. Google’s best came in at 0.6 percent using its much smaller 9-qubit hardware. Its latest blog post didn’t state Bristlecone’s error rate, but Google promised to improve on its previous results. To drop the error rate for any qubit processor, engineers must figure out how the software, control electronics, and the processor itself can work alongside one another without causing errors.
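A back-of-the-envelope Python sketch shows why fractions of a percent matter. It assumes independent, uncorrected errors, which real machines complicate, but the geometric decay is the point.

```python
# If each two-qubit operation succeeds with probability (1 - p), a circuit of
# g such operations succeeds with roughly (1 - p) ** g. This assumes
# independent, uncorrected errors, so treat it as a rough intuition only.
def clean_run_probability(p: float, gates: int) -> float:
    return (1.0 - p) ** gates

for p in (0.006, 0.005, 0.001):        # 0.6%, 0.5%, 0.1% per operation
    print(f"per-gate error {p:.1%}: "
          f"100 gates -> {clean_run_probability(p, 100):.1%}, "
          f"1000 gates -> {clean_run_probability(p, 1000):.1%}")
```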

Fifty qubits currently is considered the minimum number for serious business work. IBM’s November announcement, however, was quick to point out that this “does not mean quantum computing is ready for common use.” The system IBM developed remains extremely finicky and challenging to use, as are those being built by others. In its 50-qubit system, the quantum state is preserved for 90 microseconds, a record for the industry but still an extremely short period of time.

Nonetheless, 50 qubits has emerged as the minimum for a (relatively) stable system that can perform practical quantum computing. According to IBM, a 50-qubit machine can do things that are extremely difficult to simulate without quantum technology.
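A quick calculation shows why. A brute-force classical (state-vector) simulation of n qubits must track 2^n complex amplitudes; assuming 16 bytes per amplitude, memory alone becomes a wall right around 50 qubits.

```python
# Why ~50 qubits marks the edge of classical simulation: a full state-vector
# simulation of n qubits tracks 2**n complex amplitudes, at roughly 16 bytes
# each (two 64-bit floats).
for n in (5, 20, 30, 40, 50):
    amplitudes = 2 ** n
    gib = amplitudes * 16 / 2 ** 30    # gibibytes of RAM required
    print(f"{n:2d} qubits: {amplitudes:>22,} amplitudes = {gib:>16,.4f} GiB")
```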

The problem touches on one of the defining attributes of quantum systems. As IBM explains, where normal computers store information as either a 1 or a 0, quantum computers exploit two phenomena—entanglement and superposition—to process information differently. Conventional computers store numbers as sequences of 0s and 1s in memory and process them using only the simplest mathematical operations, addition and subtraction.

Quantum computers can digest 0s and 1s too but have a broader array of tricks. That’s where entanglement and superposition come in. For example, contradictory things can exist concurrently. Quantum geeks often cite the riddle dubbed Schrödinger’s cat, in which the cat can be alive and dead at the same time, because quantum systems can handle multiple, contradictory states. That can be very helpful if you are trying to solve huge data- and compute-intensive problems like a Monte Carlo simulation. After working at quantum computing for decades, IBM finally has something to offer businesses: a 50-qubit system for those facing complex challenges that can benefit from quantum’s superposition capabilities.
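For readers who prefer arithmetic to cats, here is a minimal sketch in plain Python/NumPy, no quantum hardware required, of what superposition and entanglement look like as math.

```python
import numpy as np

# Minimal NumPy sketch of superposition and entanglement -- pure math, no
# quantum hardware or SDK required.
zero = np.array([1, 0], dtype=complex)                        # |0>
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard gate
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

cat = H @ zero                        # superposition: (|0> + |1>) / sqrt(2)
print("'alive and dead at once':", cat)

bell = CNOT @ np.kron(cat, zero)      # entangle a second qubit with the first
print("entangled pair (Bell state):", bell)
# All amplitude sits on |00> and |11>: measuring one qubit instantly tells
# you the other, a correlation ordinary bits cannot express.
```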

Still, don’t bet on using quantum computing to solve serious business challenges very soon. An entire ecosystem of programmers, vendors, programming models, methodologies, useful tools, and a host of other things has to fall into place first. IBM, Google, and others are making stunningly rapid progress. Maybe DancingDinosaur will actually be alive to see quantum computing become just another tool in a business’s problem-solving toolkit.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog. See more of his work here.

IBM Moves Quantum Computing Toward Commercial Systems

September 20, 2017

IBM seems determined to advance quantum computing. Just this week IBM announced that its researchers had developed a new approach to simulating molecules on a quantum computer, one that may one day help revolutionize chemistry and materials science. The researchers implemented a novel algorithm that is efficient in the number of quantum operations required for the simulation, running it on a 7-qubit processor.

7-qubit processor

In the diagram above, IBM scientists successfully used six qubits on a purpose-built seven-qubit quantum device to address the molecular structure problem for beryllium hydride (BeH2), the largest molecule simulated on a quantum computer to date.
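The approach is variational: a quantum processor prepares a parameterized trial state while a classical optimizer tunes the parameters to minimize the molecule’s energy. Below is a deliberately tiny Python sketch of that loop; the one-qubit Hamiltonian is invented for illustration and stands in for BeH2, which required six qubits and a far richer trial state.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Toy sketch of the variational idea: prepare a parameterized trial state,
# evaluate the energy <psi|H|psi>, and let a classical optimizer tune the
# parameter. The Hamiltonian below is a made-up one-qubit stand-in.
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.array([[1.0, 0.0], [0.0, -1.0]])
H = Z + 0.5 * X                        # stand-in "molecular" Hamiltonian

def energy(theta: float) -> float:
    psi = np.array([np.cos(theta / 2), np.sin(theta / 2)])   # Ry(theta)|0>
    return float(psi @ H @ psi)

best = minimize_scalar(energy, bounds=(0.0, 2 * np.pi), method="bounded")
print(f"variational estimate: {best.fun:.4f}")
print(f"exact ground energy:  {np.linalg.eigvalsh(H)[0]:.4f}")
```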

Back in May IBM had announced an even bigger quantum device: a prototype of the first commercial processor with 17 qubits, leveraging significant materials, device, and architecture improvements to make it the most powerful quantum processor IBM had created to date. This week’s announcement certainly didn’t surpass it in size. IBM engineered the 17-qubit system to be at least twice as powerful as what is available today to the public on the IBM Cloud, and it will be the basis for the first IBM Q early-access commercial systems.

It has become apparent to the scientists and researchers who work with complex mathematical problems and simulations that even the most powerful conventional commercial computers are not up to the task. Even the z14, with its 10-core CPU and hundreds of additional processors dedicated to I/O, cannot do the job.

As IBM puts it: Even today’s most powerful supercomputers cannot exactly simulate the interacting behavior of all the electrons contained in a simple chemical compound such as caffeine. The ability of quantum computers to analyze molecules and chemical reactions could help accelerate research and lead to the creation of novel materials, development of more personalized drugs, or discovery of more efficient and sustainable energy sources.

The interplay of atoms and molecules is responsible for all matter that surrounds us in the world. Now “we have the potential to use quantum computers to boost our knowledge of natural phenomena in the world,” said Dario Gil, vice president of AI research and IBM Q, IBM Research. “Over the next few years, we anticipate IBM Q systems’ capabilities to surpass what today’s conventional computers can do, and start becoming a tool for experts in areas such as chemistry, biology, healthcare and materials science.”

So commercial quantum systems are coming. Are you ready to bring a quantum system into your data center? Actually, you can try one today for free here or through GitHub, which offers a Python software development kit for writing quantum computing experiments, programs, and applications. Although DancingDinosaur will gladly stumble through conventional coding, quantum computing probably exceeds his frustration level, even with a Python development kit.
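To give a flavor of what that looks like, here is a quantum “hello world” in the Qiskit Python SDK. A caveat: this sketch uses the present-day Qiskit API, which has evolved considerably since the kit described in this post, so treat it as illustrative rather than period-accurate.

```python
# Sketch of a quantum "hello world" in Qiskit. Uses today's Qiskit API, which
# differs from the 2017-era kit this post describes -- illustrative only.
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

qc = QuantumCircuit(2)
qc.h(0)          # put qubit 0 into superposition
qc.cx(0, 1)      # entangle qubit 1 with qubit 0

state = Statevector.from_instruction(qc)
print(state.probabilities_dict())    # expect {'00': ~0.5, '11': ~0.5}
```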

However, if your organization is involved in these industries—materials science, chemistry, and the like—or is wrestling with a problem you cannot solve on a conventional computer, it probably is worth a try, especially for free. You can even try an easy demo card game that compares quantum computing with conventional computing.

But as reassuring as IBM makes quantum computing sound, don’t kid yourself; it is very complicated. Deploying even a small qubit machine is not going to be like buying your first PC. Quantum bits, reportedly, are very fragile and transitory. Labs keep them very cold just to stabilize the system and keep them from switching their states before they should. Just think how you’d feel about your PC if the bit states of 0 and 1 suddenly and inexplicably changed.

That’s not the only possible headache. You have only limited time to work on qubits, given their volatility when not supercooled. Also, work still is progressing on advancing the quantum frameworks and mapping out ecosystem enablement.

Even IBM researchers admit that some problems may never run better on quantum computers, and until you pass a certain threshold, such as sufficient quantum volume, your workload might not perform better on one. The IBM quantum team suggests it will take until 2021 to consistently solve a problem that has commercial relevance using quantum computing.

Until then, and even after, IBM is talking about a hybrid approach in which parts of a problem are solved with a quantum computer and the rest with a conventional system. So don’t plan on replacing your Z with a few dozen or even hundreds of qubits anytime soon.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Please follow DancingDinosaur on Twitter, @mainframeblog. See more of his IT writing here.


IBM Introduces First Universal Commercial Quantum Computers

March 9, 2017

A few years ago DancingDinosaur first encountered the possibility of quantum computing. It was presented as a real but distant possibility. This is not something I need to consider, I thought at the time. By the time it is available commercially, I will be long retired and probably six feet under. Well, I was wrong.

This week IBM unveiled its IBM Q quantum systems. IBM Q joins Watson and blockchain in delivering the most advanced set of services on the IBM Cloud platform. Organizations are using it now, and DancingDinosaur is still living and working.

IBM Quantum Computing scientists Hanhee Paik (left) and Sarah Sheldon (right) examine the hardware inside an open dilution fridge at the IBM Q Lab

As IBM explains: While technologies that currently run on classical (or conventional) computers, such as Watson, can help find patterns and insights buried in vast amounts of existing data, quantum computers will deliver solutions to multi-faceted problems where patterns cannot be seen because the data doesn’t exist and the possibilities that you need to explore are too enormous to ever be processed by conventional computers.

Just don’t retire your z or Power system in favor of an IBM Q yet. As IBM explained at a recent briefing on quantum computing, the IBM Q universal quantum computers will be able to take on any type of problem that conventional computers handle today. However, many of today’s workloads, like online transaction processing, data storage, and web serving, will continue to run more efficiently on conventional systems. The most powerful quantum systems of the next decade will be a hybrid of quantum computers with conventional computers to control logic and operations on large amounts of data.

The most immediate use cases will involve molecular dynamics, drug design, and materials. The new quantum machine, for example, will allow the healthcare industry to design more effective drugs faster and at less cost and the chemical industry to develop new and improved materials.

Another familiar use case revolves around optimization in finance and manufacturing. The problem here comes down to computers struggling with optimization involving an exponential number of possibilities. Quantum systems, noted IBM, hold the promise of more accurately finding the most profitable investment portfolio in the financial industry, the most efficient use of resources in manufacturing, and optimal routes for logistics in the transportation and retail industries.
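To see the explosion, consider brute-force portfolio selection in Python. The scoring rule below is a made-up stand-in; the count of combinations that must be scored is the point.

```python
import math
from itertools import combinations

# Sketch of why optimization explodes: picking the best k assets out of n by
# brute force means scoring C(n, k) portfolios. The scoring rule is a made-up
# stand-in for illustration; the combination count is what matters.
def best_portfolio(expected_returns: dict, k: int):
    return max(combinations(expected_returns, k),
               key=lambda names: sum(expected_returns[n] for n in names))

returns = {f"asset{i}": ((i * 37) % 19) / 19 for i in range(20)}  # fake data
print("best 5 of 20:", best_portfolio(returns, 5))
print("portfolios to score at n=20,  k=5: ", math.comb(20, 5))
print("portfolios to score at n=200, k=50:", math.comb(200, 50))
```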

To refresh the basics of quantum computing: the challenges invariably entail exponential scale. You start with two basic ideas: 1) the uncertainty principle, which states that attempting to observe a state generally disturbs it while yielding only partial information about that state; and 2) entanglement, whereby two systems can exist in a joint state that causes them to behave in ways that cannot be explained by supposing each has a state of its own. No more 0 or 1 only.
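Here is a tiny NumPy sketch of the first idea, a measurement disturbing the state it observes.

```python
import numpy as np

# Sketch of idea 1: measurement disturbs the state. An equal superposition
# yields a random 0 or 1; afterwards the superposition is gone, so measuring
# again only repeats the first answer.
rng = np.random.default_rng(seed=7)
amps = np.array([1, 1], dtype=complex) / np.sqrt(2)   # (|0> + |1>)/sqrt(2)

outcome = rng.choice([0, 1], p=np.abs(amps) ** 2)     # Born rule
collapsed = np.zeros(2, dtype=complex)
collapsed[outcome] = 1.0                              # state after measuring
again = rng.choice([0, 1], p=np.abs(collapsed) ** 2)
print(f"first measurement: {outcome}; measured again: {again}")
```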

The basic unit of quantum computing is the qubit. Today IBM is making available a 5-qubit system, which is pretty small in the overall scheme of things. It is large enough, however, to experiment with and to test some hypotheses; things start getting interesting at 20 qubits. An inflection point, IBM researchers noted, occurs around 50 qubits. At 50-100 qubits people can begin to do some serious work.

This past week IBM announced three quantum computing advances. The first is the release of a new API for the IBM Quantum Experience that enables developers and programmers to begin building interfaces between IBM’s existing 5-qubit cloud-based quantum computer and conventional computers, without needing a deep background in quantum physics. You can try the 5-qubit quantum system via IBM’s Quantum Experience on Bluemix here.

IBM also released an upgraded simulator on the IBM Quantum Experience that can model circuits with up to 20 qubits. In the first half of 2017, IBM plans to release a full SDK on the IBM Quantum Experience for users to build simple quantum applications and software programs. For now only the publicly available 5-qubit quantum system with a web-based graphical user interface is offered; it is soon to be upgraded to more qubits.

Separately, the IBM Research Frontiers Institute allows participants to explore applications for quantum computing in a consortium dedicated to making IBM’s most ambitious research available to its members.

Finally, the IBM Q Early Access Systems program allows the purchase of access to a dedicated quantum system hosted and managed by IBM. The initial system has 15+ qubits, with a roadmap promising a fast path to 50+ qubits.

“IBM has invested over decades to growing the field of quantum computing and we are committed to expanding access to quantum systems and their powerful capabilities for the science and business communities,” said Arvind Krishna, senior vice president of Hybrid Cloud and director for IBM Research. “We believe that quantum computing promises to be the next major technology that has the potential to drive a new era of innovation across industries.”

Are you ready for quantum computing? Try it today on IBM’s Quantum Experience through Bluemix. Let me know how it works for you.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Please follow DancingDinosaur on Twitter, @mainframeblog. See more of his IT writing here.

IBM Systems Sets 2016 Priorities

December 14, 2015

Despite its corporate struggles, IBM Systems, the organization that replaced IBM Systems and Technology Group (STG), had a pretty good year in 2015. It started the year by launching the z13, which was optimized for the cloud and mobile economy. No surprise there; IBM made no secret that cloud, mobile, and analytics were its big priorities. Over the year it also added cognitive computing and software-defined storage to its priorities.

But it might have left out its biggest achievement of 2015. This week IBM announced receiving a major multi-year research grant for its scientists to advance the building blocks for a universal quantum computer. The award was made by the U.S. Intelligence Advanced Research Projects Activity (IARPA) program. This may not come to commercial fruition in our working lives, but it has the potential to radically change computing as we know it. And it certainly will put a different spin on worries about Moore’s Law.

Three Types of Quantum Computing

Right now, according to IBM, the workhorse of the quantum computer is the quantum bit (qubit). Many scientists are tackling the challenge of building qubits, but quantum information is extremely fragile and requires special techniques to preserve the quantum state. This fragility of qubits played a key part in one of the preposterous but exciting plots on the TV show Scorpion. The major hurdles include creating qubits of high quality and packaging them together in a scalable form so they can perform complex calculations in a controllable way – limiting the errors that can result from heat and electromagnetic radiation.

IBM scientists made a great stride in that direction earlier this year by demonstrating critical breakthroughs in detecting quantum errors, combining superconducting qubits in lattices on computer chips, a quantum circuit design IBM says is the only physical architecture that can scale to larger dimensions.

To return to a more mundane subject, revenue: during 2015 DancingDinosaur reported the positive contributions the z System made to IBM’s revenue, one of the company’s few positive revenue performers. It turns out DancingDinosaur missed one contributor, since it doesn’t track constant currency. If you look at constant currency, which smooths out fluctuations in currency valuations, IBM Power Systems has been on an upswing for the last three quarters: up 1% in Q1, up 5% in Q2, and up 2% in Q3. DancingDinosaur expects both z and Power to contribute to IBM revenue in upcoming quarters.

Looking ahead to 2016, IBM identified the following priorities:

  • Develop an API ecosystem that monetizes big data and cognitive workloads, built on the cloud as part of becoming a better service provider.
  • Win the architectural battle with OpenPOWER and POWER8 – designed for data and the cognitive era. (Unspoken: beat x86.)
  • Extend z Systems for new mobile, cloud and in-line analytics workloads.
  • Capture new developers, markets and buyers with open innovation on IBM LinuxONE, the most advanced and trusted enterprise Linux system.
  • Shift the IBM storage portfolio to flash and a software-defined model that disrupts the industry by enabling new workloads, very high speed, and data virtualization for improved data economics.
  • Engage clients through a digital-first go-to-market model.

These are all well and good. About the only thing missing is any mention of the IBM Open Mainframe Project, which was announced in August as a partnership with the Linux Foundation; DancingDinosaur covered that announcement here. Still hoping it will generate the kind of innovative products for the z that the OpenPOWER initiative has started to produce. Hope they haven’t given up already. Just have to remind myself to be patient; it took about a year to start getting tangible results from the OpenPOWER consortium.

DancingDinosaur is Alan Radding, a veteran information technology analyst and writer. Please follow DancingDinosaur on Twitter, @mainframeblog. See more of his IT writing here.

Expect this to be the final DancingDinosaur for 2015. Be back the week of Jan. 4.

IBM Edge Rocks 6000 Strong for Digital Transformation

May 15, 2015

Unless you’ve been doing the Rip Van Winkle thing, you have to have noticed that a profound digital transformation is underway, fueled, in this case, from the bottom. “This is being driven by people embracing technology,” noted Tom Rosamilia, Senior Vice President, IBM Systems. And it will only get greater with quantum computing, a peek at which was provided at Edge2015 by Arvind Krishna, senior vice president and director, IBM Research.


(Quantum computing, courtesy of IBM, click to enlarge)

Need proof? Just look around. New cars are now hot spots, and it’s not just luxury cars. Retailers are adding GPS inside their stores and are using it to follow and understand the movement of shoppers in real time. Eighty-two percent of millennials do their banking from their mobile phones. As Rosamilia noted, it amounts to “an unprecedented digital disruption” in the way people go about their lives. Dealing with this digital transformation and the challenges and opportunities it presents was what IBM Edge 2015 was about. With luck you can check out much from Edge2015 at the media center here.

The first day began with a flurry of product announcements, starting with a combined package of new servers plus storage software and solutions aimed at accelerating the development of hybrid cloud computing. Hybrid cloud computing was big at Edge2015. To further stimulate hybrid computing, IBM introduced new flexible software licensing of its middleware to help companies speed their adoption of hybrid cloud environments.

Joining in the announcement was Rocket Software, which sponsored the entertainment, including the outstanding Grace Potter concert. As for Rocket’s actual business, the company announced Rocket Data Access Service on Bluemix for z Systems, intended to provide companies a simplified connection to data on the IBM z Systems mainframe for development of mobile applications through Bluemix. Starting in June, companies can access a free trial of the service, which works with a range of mainframe data sources, including VSAM, Adabas, IMS, CICS, and DB2, and enables access through common mobile application interfaces, including MongoDB, JDBC, and the REST protocol. Now z shops have no excuse not to connect their systems with mobile and social business.
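DancingDinosaur hasn’t tried the service, but access through the REST interface would presumably look something like the Python sketch below. Every specific in it, the endpoint, the path, and the response shape, is an invented placeholder, not Rocket’s actual API.

```python
import requests

# Hypothetical sketch only: the endpoint, path, and JSON shape below are
# invented placeholders for illustration. Consult Rocket's Bluemix
# documentation for the actual REST API.
BASE = "https://rocket-data-access.example.com/api/v1"        # placeholder

resp = requests.get(
    f"{BASE}/datasets/VSAM.CUSTOMER.MASTER/records",          # placeholder
    headers={"Authorization": "Bearer <your-token>"},         # placeholder
    params={"limit": 10},
)
resp.raise_for_status()
for record in resp.json().get("records", []):
    print(record)
```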

Servers also grabbed the spotlight. IBM introduced new systems, including the IBM Power System E850, a four-socket server with flexible capacity and up to 70% guaranteed utilization. The E850 targets cloud service providers and medium or large enterprises looking to securely and efficiently deploy multi-tenancy workloads while speeding access to data through larger in-memory databases with up to 4TB of installed memory.

The IBM Power System E880, designed to scale to 192 cores, is suitable for IBM DB2 with BLU Acceleration, enhancing the efficiency of cloud deployments. Rounding out the lineup is the PurePOWER System, a converged infrastructure for cloud, intended to help deliver insights via the cloud and managed with OpenStack.

The company also will be shipping IBM Spectrum Control Storage Insights, a new software-defined storage offering that provides data management as a hybrid cloud service to optimize on-premises storage infrastructures. Storage Insights is designed to simplify storage management by improving storage visibility while applying analytics to ease capacity planning, enhance performance monitoring, and improve storage utilization by reclaiming under-utilized storage. Thank you, analytics.

Finally for storage, the company announced IBM XIV Gen3, designed for cloud with real-time compression that enables scaling as demand for data storage capacity expands. You can get more details on all the Edge2015 announcements here.

Already announced is IBM Edge 2016, again at the Venetian in Las Vegas in October 2016. That gives IBM 18 months to pack it with even more advances. Doubt there will be a new z by then; a new business class version of the z13 is more likely.

DancingDinosaur will take up specific topics from Edge2015 in the coming weeks. These will include social business on z, real-time analytics on z, and Jon Toigo sorting through the hype on SDS.

DancingDinosaur is Alan Radding, a veteran IT analyst and writer. Please follow DancingDinosaur on Twitter, @mainframeblog. See more of his IT writing here.
