Posts Tagged ‘HPE’

IBM Q Network Promises to Commercialize Quantum

December 14, 2017

The dash to quantum computing is well underway, and IBM intends to be one of the leaders. When it gets there it will find plenty of company: HPE, Dell/EMC, Microsoft, and more are staking out quantum claims. In response, IBM is speeding the build-out of its quantum ecosystem, the IBM Q Network, which it announced today.

IBM’s 50 qubit system prototype

IBM already introduced its third generation of quantum computers in November, a prototype 50-qubit system. The company promises online access to the IBM Q systems by the end of 2017, with a series of planned upgrades during 2018. IBM is focused on making advanced, scalable universal quantum computing systems available to clients to explore practical applications.

Further speeding the process, IBM is building a quantum computing ecosystem of big companies and research institutions. The result, dubbed the IBM Q Network, will consist of a worldwide network of individuals and organizations, including scientists, engineers, business leaders, and forward-thinking companies, academic institutions, and national research labs enabled by IBM Q. Its mission: advancing quantum computing and launching the first commercial applications.

Two goals in particular stand out. The first: engage industry leaders to combine quantum computing expertise with industry-oriented, problem-specific expertise to accelerate the development of early commercial uses. The second: expand and train the ecosystem of users, developers, and application specialists that will be essential to the adoption and scaling of quantum computing.

The key to getting this rolling is the groundwork IBM laid with the IBM Q Experience, which it introduced in May 2016 as a free 5-qubit system and upgraded to 16 qubits in May 2017. The IBM effort to make a commercial universal quantum computer available for business and science applications has ramped up with each successive rev, culminating today in a prototype 50-qubit system delivered via the IBM Cloud platform.

IBM opened public access to its quantum processors over a year ago to serve as an enablement tool for scientific research, a resource for university classrooms, and a catalyst for enthusiasm. Since then, participants have run more than 1.7 million quantum experiments on the IBM Cloud.

To date IBM has been pretty easygoing about access to its quantum computers, but now that it has a 20-qubit system and a 50-qubit system coming, the company has become a little more restrictive about who can use them. Participation in the IBM Q Network is the only way to access these advanced systems. It involves a commitment of money and intellectual property and an agreement to share and cooperate, although IBM implied at an early briefing that it could be flexible about what was shared and what could remain an organization's proprietary IP.

Another reason to participate in the Q Experience is QISKit, an open-source quantum computing SDK anyone can access (a minimal sketch appears below). Most DancingDinosaur readers, if they want to participate in IBM's Q Network, will do so as either partners or members. Another option, a Hub, is really targeted at bigger, more ambitious early adopters. Hubs, as IBM puts it, provide access to IBM Q systems, technical support, educational and training resources, community workshops and events, and opportunities for joint work.
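To give a feel for what QISKit code looks like, here is a minimal, hypothetical sketch of a two-qubit Bell-state circuit, the kind of toy experiment people run on the Q Experience. The circuit-building calls are part of the Qiskit Python SDK; submitting the circuit to a simulator or an IBM Q hardware backend is omitted because that part of the API varies by release.

    # Minimal QISKit sketch (illustrative only): a two-qubit Bell-state circuit.
    # Execution against a simulator or IBM Q backend is left out; that API
    # differs across Qiskit versions.
    from qiskit import QuantumCircuit

    qc = QuantumCircuit(2, 2)       # two qubits, two classical bits
    qc.h(0)                         # put qubit 0 into superposition
    qc.cx(0, 1)                     # entangle qubit 0 with qubit 1
    qc.measure([0, 1], [0, 1])      # measure both qubits into classical bits

    print(qc.draw())                # text drawing of the circuit

Nothing here is specific to IBM's bigger systems; the same handful of calls is how beginners have been poking at the 5- and 16-qubit Q Experience machines.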

The Q Network has already attracted significant interest from organizations at every level and across a variety of industry segments. These include automotive, financial, electronics, chemical, and materials players from across the globe. Initial participants include JPMorgan Chase, Daimler AG, Samsung, JSR Corporation, Barclays, Hitachi Metals, Honda, Nagase, Keio University, Oak Ridge National Lab, Oxford University, and University of Melbourne.

As noted at the top, other major players are staking out their quantum claims, but none seem as far along or as comprehensive as IBM:

  • Dell/EMC is aiming to solve complex, life-impacting analytic problems like autonomous vehicles, smart cities, and precision medicine.
  • HPE appears to be focusing its initial quantum efforts on encryption.
  • Microsoft, not surprisingly, expects to release a new programming language and computing simulator designed for quantum computing.

As you would expect, IBM also is rolling out IBM Q Consulting to help organizations envision new business value through the application of quantum computing technology and provide customized roadmaps to help enterprises become quantum-ready.

Will quantum computing actually happen? Your guess is as good as anyone’s. I first heard about quantum physics in high school 40-odd years ago. It was baffling but intriguing then. Today it appears more real but still nothing is assured. If you’re willing to burn some time and resources to try it, go right ahead. Please tell DancingDinosaur what you find.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Please follow DancingDinosaur on Twitter, @mainframeblog. See more of his IT writing at technologywriter.com and here.

IBM’s POWER9 Races to AI

December 7, 2017

IBM is betting the future of its Power Systems on artificial intelligence (AI). The company publicly introduced its newly designed POWER9 processor this past Tuesday. The new machine, according to IBM, is capable of shortening the training times of deep learning frameworks by nearly 4x, allowing enterprises to build more accurate AI applications, faster.

IBM engineer tests the POWER9

Designed for the post-CPU era, the core POWER9 building block is the IBM Power Systems AC922. The AC922, notes IBM, is the first to embed PCI-Express 4.0, next-generation NVIDIA NVLink, and OpenCAPI (three interface accelerators), which together can move data 9.5x faster than PCIe 3.0-based x86 systems. The AC922 is designed to drive demonstrable performance improvements across popular AI frameworks such as Chainer, TensorFlow, and Caffe, as well as accelerated databases such as Kinetica.

More than a CPU under the AC922 cover

Depending on your sense of market timing, POWER9 may be coming at the best or worst time for IBM. Notes industry observer Timothy Prickett Morgan of The Next Platform: “The server market is booming as 2017 comes to a close, and IBM is looking to try to catch the tailwind and lift its Power Systems business.”

As Morgan puts it, citing IDC 3Q17 server revenue figures, HPE and Dell are jockeying for the lead in the server space. For the moment, HPE (including its H3C partnership in China) has the revenue lead with $3.32 billion, compared to Dell’s $3.07 billion, while Dell was the shipment leader, with 503,000 machines sold in Q3 2017 versus HPE’s 501,400.

IBM does not rank in the top five shippers, but thanks in part to the Z and big POWER8 boxes, it still holds the number-three spot in server revenue, with $1.09 billion in sales for the third quarter, according to IDC. The Z accounted for $673 million of that, up 63.8 percent year-on-year, due mainly to the new machine. If you do the math, Morgan continued, the Power Systems line accounted for $420.7 million in the period, down 7.2 percent from Q3 2016. This is not surprising given that customers held back knowing POWER9 systems were coming.

To get Power Systems back to where it used to be, Morgan continued, IBM must increase revenues by a factor of three or so. The good news is that, thanks to the popularity of hybrid CPU-GPU systems, which cost around $65,000 per node from IBM, this isn’t impossible: it should take fewer machines to rack up the revenue, even if it comes from a relatively modest number of footprints rather than a huge number of POWER9 processors. More than 90 percent of the compute in these systems comes from GPU accelerators, but thanks to bookkeeping magic, it all accrues to Power Systems when these machines are sold. Plus, IBM reportedly will be installing more than 10,000 such nodes for the US Department of Energy’s Summit and Sierra supercomputers over the coming two quarters, which should provide a nice bump. And once IBM gets commercial POWER9 systems into the field, sales should pick up again, Morgan expects.

IBM clearly is hoping POWER9 will cut into Intel x86 sales, but that may not happen as anticipated. Intel is bringing out its own advanced Xeon x86 processor, Skylake, rumored to be quite expensive. Don’t expect POWER9 systems to be cheap either. And the field is getting more crowded. Morgan noted that various ARM chips, especially ThunderX2 from Cavium and Centriq 2400 from Qualcomm, can boost non-x86 numbers and divert sales from IBM’s POWER9 systems. AMD’s Epyc x86 processors also have a good chance of stealing some market share from Intel’s Skylake. So POWER9 will have to fight for every sale IBM wants and take nothing for granted.

No doubt POWER9 presents a good case and has a strong backer in Google, but even that might not be enough. Still, POWER9 sits at the heart of what are expected to be the most powerful data-intensive supercomputers in the world, Summit and Sierra, which are expected to knock off the world’s current fastest supercomputers, from China.

“Google is excited about IBM’s progress in the development of the latest POWER technology,” said Bart Sano, VP of Google Platforms, adding that “the POWER9 OpenCAPI bus and large memory capabilities allow further opportunities for innovation in Google data centers.”

This really is about deep learning, one of today’s hottest buzzwords. Deep learning has emerged as a fast-growing machine learning method that extracts information by crunching through millions of data points to detect and rank the most important aspects of the data. IBM designed the POWER9 chip to manage free-flowing data, streaming sensors, and algorithms for data-intensive AI and deep learning workloads on Linux. Are your people ready to take advantage of POWER9?
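To make the deep-learning-on-Linux point a little more concrete, here is a small, hypothetical sketch using TensorFlow’s Keras API, one of the frameworks IBM cites for the AC922. The model and data are stand-ins invented for illustration, and the calls reflect current TensorFlow releases rather than anything IBM ships; on a GPU-equipped system the framework places the heavy matrix math on the attached accelerators, which is where a box like the AC922 earns its keep.

    # Illustrative TensorFlow/Keras training loop (hypothetical model and data).
    # On a GPU-equipped Linux system the matrix math below runs on the accelerators.
    import numpy as np
    import tensorflow as tf

    print("GPUs visible:", tf.config.list_physical_devices("GPU"))

    # Stand-in dataset: 10,000 samples with 32 features, binary labels.
    x = np.random.rand(10_000, 32).astype("float32")
    y = (x.sum(axis=1) > 16).astype("float32")

    model = tf.keras.Sequential([
        tf.keras.Input(shape=(32,)),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    model.fit(x, y, epochs=5, batch_size=256)

The code itself is identical whether it runs on x86 or POWER9; the 4x training claim comes down to how fast the hardware underneath can feed and drive the GPUs.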

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Please follow DancingDinosaur on Twitter, @mainframeblog. See more of his IT writing at technologywriter.com and here.

IBM Introduces Cloud Private to Hybrid Clouds

November 10, 2017

When you have enough technologies lying around your basement, sometimes you can cobble a few pieces together, mix them with some sexy new stuff, and, bingo, you have something that meets a serious need for a number of disparate customers. That’s essentially what IBM did with Cloud Private, which it announced Nov. 1.

IBM staff test Cloud Private automation software

IBM intends Cloud Private to enable companies to create on-premises cloud capabilities similar to public clouds in order to accelerate app dev. Don’t think of it as just old stuff; the new platform is built on the open-source Kubernetes-based container architecture and supports both Docker containers and Cloud Foundry. This facilitates integration and portability of workloads, enabling them to move to almost any cloud environment, including, especially, the public IBM Cloud.

IBM also announced container-optimized versions of core enterprise software, including IBM WebSphere Liberty, Db2, and MQ, which are widely used to run and manage the world’s most business-critical applications and data. This makes it easier to share data and evolve applications as needed across the IBM Cloud, private and public clouds, and other cloud environments with a consistent developer, administrator, and user experience.
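As a rough illustration of what running one of those container-optimized images on a Kubernetes-based platform like Cloud Private involves, here is a hypothetical sketch using the Kubernetes Python client to declare a small WebSphere Liberty deployment. The image tag, namespace, and replica count are made up for the example, and a real Cloud Private rollout would typically go through its catalog and Helm charts rather than raw API calls.

    # Hypothetical sketch: declare a containerized WebSphere Liberty deployment
    # on a Kubernetes-based private cloud using the official Python client.
    # Image tag, namespace, and replica count are illustrative only.
    from kubernetes import client, config

    config.load_kube_config()          # assumes kubectl is already configured
    apps = client.AppsV1Api()

    labels = {"app": "liberty-demo"}
    deployment = client.V1Deployment(
        metadata=client.V1ObjectMeta(name="liberty-demo"),
        spec=client.V1DeploymentSpec(
            replicas=2,
            selector=client.V1LabelSelector(match_labels=labels),
            template=client.V1PodTemplateSpec(
                metadata=client.V1ObjectMeta(labels=labels),
                spec=client.V1PodSpec(containers=[
                    client.V1Container(
                        name="liberty",
                        image="websphere-liberty:latest",   # illustrative image tag
                        ports=[client.V1ContainerPort(container_port=9080)],
                    ),
                ]),
            ),
        ),
    )

    apps.create_namespaced_deployment(namespace="default", body=deployment)
    print("Deployment submitted")

The point is that once a workload like Liberty is packaged as a container, the same declaration works on an on-premises cluster or a public cloud, which is exactly the portability story IBM is selling.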

Cloud Private amounts to a new software platform that relies on open-source container technology to unlock billions of dollars in core data and applications built around legacy software like WebSphere and Db2. The purpose is to extend cloud-native tools across public and private clouds. For Z data centers that have tons of valuable, reliable working systems years away from being retired, if ever, Cloud Private may be just what they need.

Almost all enterprise systems vendors are pursuing the same hybrid cloud enablement: HPE, Microsoft, Cisco (which is partnering with Google on this), and more. This is a clear indication that the cloud, and especially the hybrid cloud, is crossing the proverbial chasm. In years past IT managers and C-level executives didn’t want anything to do with the cloud; the IT folks saw it as a threat to their on-premises data centers and the C-suite was scared witless about security.

Those issues haven’t gone away, although the advent of hybrid clouds has mitigated some of the fears among both groups. Similarly, the natural evolution of the cloud and advances in hybrid cloud computing make this more practical.

The private cloud, too, is growing. According to IBM, while public cloud adoption continues to grow at a rapid pace, organizations, especially in the regulated industries of finance and health care, continue to leverage private clouds as part of their journey to public cloud environments to quickly launch and update applications. This also is what is driving hybrid clouds. IBM projects that companies will spend more than $50 billion globally, starting in 2017, to create and evolve private clouds, with growth rates of 15 to 20 percent a year through 2020.

The problem facing IBM and the other enterprise systems vendors scrambling for hybrid clouds is how to transition legacy systems into cloud-native systems. The hybrid cloud in effect acts as facilitating middleware. “Innovation and adoption of public cloud services has been constrained by the challenge of transitioning complex enterprise systems and applications into a true cloud-native environment,” said Arvind Krishna, Senior Vice President for IBM Hybrid Cloud and Director of IBM Research. IBM’s response is Cloud Private, which brings rapid application development and modernization to existing IT infrastructure while combining it with the services of a public cloud platform.

Hertz adopted this approach. “Private cloud is a must for many enterprises such as ours working to reduce or eliminate their dependence on internal data centers,” said Tyler Best, Hertz Chief Information Officer. A strategy consisting of public, private, and hybrid clouds is essential for large enterprises to effectively make the transition from legacy systems to cloud.

IBM is serious about cloud as a strategic initiative. Although IBM is not as large as Microsoft Azure or Amazon Web Services (AWS) in the public cloud, a recent report by Synergy Research found that it is a major provider of private cloud services, making the company the third-largest overall cloud provider.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Please follow DancingDinosaur on Twitter, @mainframeblog. See more of his IT writing at technologywriter.com and here.
