IBM Introduces New Flash Storage Family

February 14, 2020

IBM describes this mainly as a simplification move. The company is eliminating two current storage lines, Storwize and FlashSystem A9000, and replacing them with a series of flash storage systems that will scale from entry level to enterprise.

Well, uh, not quite enterprise as Dancing Dinosaur readers might think of it. No changes are planned for the DS8000 storage systems, which are focused on the mainframe market. “All our existing product lines, not including our mainframe storage, will be replaced by the new FlashSystem family,” said Eric Herzog, IBM’s chief marketing officer and vice president of worldwide storage channel, in a published report earlier this week.

The move retires two incompatible storage lines from the IBM product lineup and replaces them with a line that provides compatible storage software and services from entry level to the highest enterprise tier, mainframe excluded, Herzog explained. The new FlashSystem family promises more functions, more features, and lower prices, he continued.

Central to the new flash storage family is NVMe, which comes in multiple flavors. NVM Express (NVMe), or Non-Volatile Memory Host Controller Interface Specification (NVMHCIS), is an open logical device interface specification for accessing non-volatile storage media attached via a PCI Express (PCIe) bus.
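To see why NVMe matters, consider command queuing. The legacy AHCI interface used by SATA drives allows a single queue of 32 commands, while the NVMe spec allows up to roughly 64K queues of 64K commands each. A quick, illustrative back-of-envelope comparison (the spec limits cited in the comments are the commonly quoted maximums, not what any one product ships):

```python
# Rough, illustrative comparison of command-queuing capacity:
# legacy AHCI (SATA) vs. NVMe, per commonly cited spec limits.
ahci_queues, ahci_depth = 1, 32           # AHCI: one queue, 32 commands deep
nvme_queues, nvme_depth = 65_535, 65_536  # NVMe: up to ~64K queues x 64K commands

ahci_outstanding = ahci_queues * ahci_depth
nvme_outstanding = nvme_queues * nvme_depth

print(f"AHCI outstanding commands: {ahci_outstanding}")
print(f"NVMe outstanding commands: {nvme_outstanding:,}")
print(f"Ratio: ~{nvme_outstanding // ahci_outstanding:,}x")
```

That massive parallelism, plus riding directly on the PCIe bus, is why NVMe arrays can feed many-core hosts without the protocol becoming the bottleneck.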

At the top of the new family line is the NVMe and multicloud ultra-high-throughput storage system, a validated system implemented by IBM. IBM promises unmatched NVMe performance, storage-class memory (SCM), and IBM FlashCore technology. In addition, it brings the features of IBM Spectrum Virtualize to support the most demanding workloads.


IBM multi-cloud flash storage family system


Next up are the IBM FlashSystem 9200 and IBM FlashSystem 9200R, IBM-tested and validated rack solutions designed for the most demanding environments. They combine the extreme performance of end-to-end NVMe, IBM FlashCore technology, and the ultra-low latency of storage-class memory (SCM). They also bring IBM Spectrum Virtualize and AI predictive storage management with proactive support from Storage Insights. FlashSystem 9200R is delivered assembled, with installation and configuration completed by IBM to ensure a working multicloud solution.


In the middle of the family are the IBM FlashSystem 7200 and FlashSystem 7200H. As IBM puts it, these offer end-to-end NVMe, the innovation of IBM FlashCore technology, the ultra-low latency of storage-class memory (SCM), the flexibility of IBM Spectrum Virtualize, and the AI predictive storage management and proactive support of Storage Insights, all in a powerful 2U all-flash or hybrid flash array. The IBM FlashSystem 7200 brings mid-range storage while allowing the organization to add whatever multicloud technology best supports the business.

At the bottom of the line is the NVMe entry enterprise all-flash storage solution, which brings end-to-end NVMe capabilities and flash performance to the affordable FlashSystem 5100. As IBM describes it, the FlashSystem 5010 and IBM FlashSystem 5030 (formerly the IBM Storwize V5010E and Storwize V5030E; they are still there, just renamed) are all-flash or hybrid flash solutions intended to provide enterprise-grade functionality without compromising affordability or performance. Built with the flexibility of IBM Spectrum Virtualize and the AI-powered predictive storage management and proactive support of Storage Insights, the IBM FlashSystem 5000 helps make modern technologies such as artificial intelligence accessible to enterprises of all sizes.

IBM likes the words affordable and affordability in discussing this new storage family. But, as is typical with IBM, nowhere will you see a price or a reference to cost/TB or cost/IOPS or cost of anything although these are crucial metrics for evaluating any flash storage system. DancingDinosaur expects this after 20 years of writing about the z. Also, as I wrote at the outset, the z is not even included in this new flash storage family so we don’t even have to chuckle if they describe z storage as affordable.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at http://technologywriter.com/

Meet IBM’s New CEO

February 6, 2020

Have to admire Ginni Rometty. She survived 19 consecutive losing quarters (one quarter shy of five years), which DancingDinosaur and the rest of the world covered with monotonous regularity, and she was not bounced out until this January. Memo to readers: keep that in mind if you start feeling performance heat from top management. Can’t imagine another company that would tolerate it, but what do I know.

Arvind Krishna becomes Chief Executive Officer and a member of the IBM Board of Directors effective April 6, 2020. Krishna is currently IBM Senior Vice President for Cloud and Cognitive Software and was a principal architect of the company’s acquisition of Red Hat. The cloud/Red Hat strategy has only just started to show signs of payback.

As IBM writes: Under Rometty’s leadership, IBM acquired 65 companies; built out key capabilities in hybrid cloud, security, industry, data, and AI, both organically and inorganically; and successfully completed one of the largest technology acquisitions in history (Red Hat). She reinvented more than 50 percent of IBM’s portfolio, built a $21 billion hybrid cloud business, and established IBM’s leadership in AI, quantum computing, and blockchain, while divesting nearly $9 billion in annual revenue to focus the portfolio on IBM’s high-value, integrated offerings. Part of that was the approximately $34 billion Red Hat acquisition, IBM’s, and possibly the IT industry’s, biggest to date. Rometty isn’t going away all that soon; she continues in an executive Board position.

It is way too early to get IBM 1Q2020 results, which will cover the last quarter of Rometty’s reign. The fourth quarter of 2019, at least, was positive, especially after all those quarters of revenue loss. The company reported $21.8 billion in revenue, up 0.1 percent. Red Hat revenue was up 24 percent. Cloud and cognitive systems were up 9 percent, while systems, which includes the z, was up 16 percent.

Total cloud revenue, new CEO Arvind Krishna’s baby, was up 21 percent. Even with z revenue up more than cloud and cognitive systems, it is unlikely IBM will easily find a buyer for the z soon. If IBM dumps it, it will probably have to pay somebody to take it, despite the z’s faithful, profitable blue-chip customer base.

Although the losing streak has come to an end, Krishna still faces some serious challenges. For example, although DancingDinosaur has been enthusiastically cheerleading quantum computing as the future, there is no proven business model there. Except for some limited adoption by a few early adopters, there is no widespread groundswell of demand for quantum computing, and the technology has not yet proven itself useful. Nor is there a ready pool of skilled quantum talent. If you wanted to try quantum computing, would you even know what to try or where to find skilled people?

Even in cloud computing, where IBM finally is starting to show some progress, the company has yet to penetrate the top tier of players. Those players, Amazon, Google, and Microsoft Azure, are not likely to concede market share.

So here is DancingDinosaur’s advice to Krishna: be prepared to scrap for every point of cloud share and be prepared to spin a compelling case around quantum computing. Finally, don’t give up the z until the accountants and lawyers force you to, which they will undoubtedly insist on. Until then, slash z prices and make the machine an irresistible bargain.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at http://technologywriter.com/ 

Montana Sidelines the Mainframe

January 21, 2020

Over the past 20+ years DancingDinosaur has written this story numerous times. It never ends exactly the way they think it will. Here is the one I encountered this past week.

IBM z15

But that doesn’t stop the PR writers from finding a cute way to write the story. This time the writers turned to references to the moon landings and trilby hats (huh?). Looks like a fedora to me, but what do I know; I only wear baseball caps. But they always have to come up with something that makes the mainframe sound completely outdated. In this case they wrote: “Mainframe computers, a technology that harkens back to an era of moon landings and men in trilby hats, are still widely used throughout government, but not in Montana for much longer.”

At least they didn’t write that the mainframe was dead and gone or forgotten. Usually, I follow up on stories like this months later and call whichever IT person is still there. I congratulate him or her and ask how it went. That’s when I usually start hearing ums and uhs. It turns out the mainframe is still there, handling those last few jobs they just can’t replace yet.

Depending on how playful I’m feeling that day, I ask him or her what happened to the justification presented at the start of the project. Or I might ask what happened to the previous IT person. 

Sometimes, I might even refer them to a recent DancingDinosaur piece that explains Linux on the mainframe or Java, or describes mainframes running the latest Docker container technology or microservices. I’m not doing this for spite; I’m just trying to build up my readership. DancingDinosaur hates losing any reader, even late in their game. So I always follow up with a link to DancingDinosaur.

In an interview published by StateScoop, Chief Information Officer Tim Bottenfield described how, over the last several years, the last remaining agencies using the state’s mainframe have migrated their data away from it and are now developing modern applications that can be moved to the state’s private, highly virtualized cloud environment. By spring 2021, Montana expects to be mainframe-free. Will make a note to call Bottenfield in spring 2021 and see how they are doing. Does anyone want to bet whether the mainframe actually is completely out of service and gone by then?

As you all know, mainframes can be expensive to maintain, particularly if it’s just to keep a handful of applications running, and those usually turn out to be mission-critical applications. Of the three major applications Montana still runs on its mainframe, two are used by the Montana Department of Public Health and Human Services, which is in the process of recoding those programs to work on modern platforms, as if the z15 isn’t modern.

They haven’t told us whether these applications handle payments or deliver critical services to citizens. Either way it will not be pretty if such applications go down. The third is the state’s vehicle titling and registration system, which is being rebuilt to run out of the state’s data center. Again, we don’t know much about the criticality of these systems. But think how you might feel if you can’t get accurate or timely information from one of these systems. I can bet you wouldn’t be a happy camper; neither would I.

Systems like these are difficult to get right the first time, if at all. This is especially true if you will be using the latest hybrid cloud and services technologies. Yes, skilled mainframe people are hard to find and retain but so are any technically skilled and experienced people. If I were a decade younger, I could be attracted to the wide open spaces of Montana as a relief from the congestion of Boston. But I’m not the kind of hire Montana needs or wants. Stay tuned for when I check back in Spring 2021.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at http://technologywriter.com/ 

2020 IBM Quantum Gains

January 13, 2020

IBM returned from the holidays announcing a flurry of activity around quantum computing. Specifically, it has expanded its set of Q Network partners, including a range of commercial, academic, startup, government, and research entities.

IBM Qiskit screen

The Q Network now includes over 100 organizations across multiple industries, including airlines, automotive, banking and finance, energy, insurance, materials, and electronics. Specifically, Anthem, Delta Air Lines, Goldman Sachs, Wells Fargo, and Woodside Energy are among the latest organizations to begin to explore practical applications using quantum computing.

In addition to these industry leaders, a number of academic, government research labs and startups have also joined the IBM Q Network, including the Georgia Institute of Technology (Georgia Tech), Stanford University, Los Alamos National Laboratory, AIQTech, Beit, Quantum Machines, Tradeteq, and Zurich Instruments.

These organizations join over 200,000 users, who have run hundreds of billions of executions on IBM’s quantum systems and simulators through the IBM Cloud. This has led to the publication of more than 200 third-party research papers on practical quantum applications.

More quantum: IBM also recently announced the planned installation of the first two IBM Q System One commercial universal quantum computers outside the US – one with Europe’s leading organization for applied research, Fraunhofer-Gesellschaft, in Germany; another with The University of Tokyo. Both are designed to advance country-wide research and provide an education framework program to engage universities, industry, and government to grow a quantum computing community and foster new economic opportunities.

Growing a quantum computing community should quickly become a critical need and, more likely, a major headache. My own cursory search of employment sites revealed no quantum computing openings listed. Just a few casual inquiries suggest curiosity about quantum computing but not much insight, readiness, actual skills, or openings to generate action.

Still, even at this early stage things already are happening.

Anthem, Inc., a leading health benefits company is expanding its research and development efforts to explore how quantum computing may further enhance the consumer healthcare experience. For Anthem, quantum computing offers the potential to analyze vast amounts of data inaccessible to classical computing while also enhancing privacy and security. It also brings the potential to help individuals through the development of more accurate and personalized treatment options while improving the prediction of health conditions.

Delta Air Lines joined the IBM Q Hub at North Carolina State University to embark on a multi-year collaborative effort with IBM to explore the potential of quantum computing to transform experiences for customers and employees as they encounter challenges throughout the travel day.

Quantum Machines (QM), a provider of control and operating systems for quantum computers, brings customers among the leading players in the field, including multinational corporations, academic institutions, startups, and national research labs. As part of the IBM and QM collaboration, a compiler between IBM’s quantum computing programming languages, like Qiskit (see graphic above), and those of QM is being developed for use by QM’s customers. Such development should lead to increased adoption of IBM’s open-source programming languages across the industry.
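Whatever the language, a quantum program ultimately describes linear algebra on a state vector, which is what a compiler between two quantum languages must preserve. A minimal stdlib-only sketch (illustrative only, not Qiskit or QM code): apply a Hadamard gate to a qubit starting in |0⟩ and read out the measurement probabilities.

```python
import math

def apply_gate(gate, state):
    """Multiply a 2x2 gate matrix by a 2-element state vector."""
    return [gate[0][0] * state[0] + gate[0][1] * state[1],
            gate[1][0] * state[0] + gate[1][1] * state[1]]

# Hadamard gate: maps |0> to an equal superposition of |0> and |1>
H = [[1 / math.sqrt(2),  1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

state = apply_gate(H, [1.0, 0.0])   # start in |0>
probs = [amp ** 2 for amp in state] # Born rule: probability = |amplitude|^2
print(probs)  # an equal superposition, ~[0.5, 0.5]
```

Frameworks like Qiskit wrap exactly this kind of math (at much larger scale, plus hardware control), which is why translating circuits between vendors’ languages is tractable.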

The Los Alamos National Laboratory also has joined as an IBM Q Hub, which will greatly help the lab’s research efforts, including developing and testing near-term quantum algorithms and formulating strategies for mitigating errors on quantum computers. A 53-qubit system will also allow Los Alamos to benchmark its ability to perform quantum simulations on real quantum hardware and perhaps finally push beyond the limits of classical computing.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at http://technologywriter.com/ 

A Blockchain Feast

January 6, 2020

Hope everybody had wonderful holidays and feasted to your heart’s content. Did you, by chance, think about the data involved in that feast? No, not the calories you consumed but the data that tracked the food you consumed from the farm or ranch through numerous processing and shipping steps to finally arrive at your table. Well, maybe next time.

Apple Pie: Courtesy of IBM

The IBM Food Trust, which is built on blockchain, enables sellers and consumers to trace their food from farm to warehouse to kitchen, explains IBM. For more eco- and safety-conscious diners, IBM continues, this information is crucial for ensuring a safer, smarter, more transparent and sustainable food ecosystem. The company, unfortunately, hasn’t yet said anything about Food Trust counting calories consumed.

As IBM describes it, the Food Trust is a collaborative network of growers, processors, wholesalers, distributors, manufacturers, retailers, and others, enhancing visibility and accountability across the food supply chain. Built on IBM Blockchain, this solution connects participants through a permissioned, immutable, and shared record of food provenance, transaction data, processing details, and more.
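The core idea behind a “permissioned, immutable, and shared record” is a hash chain: each entry embeds the hash of the previous one, so tampering with any earlier record invalidates everything after it. A hedged, stdlib-only sketch of that mechanism (this is not IBM Food Trust’s actual implementation; the record fields are invented for illustration):

```python
import hashlib
import json

def add_record(chain, data):
    """Append a record whose hash covers both its data and the previous hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = {"data": data, "prev": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append({**body, "hash": digest})

def verify(chain):
    """Recompute every hash; any tampering breaks the chain."""
    prev = "0" * 64
    for rec in chain:
        body = {"data": rec["data"], "prev": rec["prev"]}
        expected = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if rec["prev"] != prev or rec["hash"] != expected:
            return False
        prev = rec["hash"]
    return True

chain = []
add_record(chain, {"step": "farm", "item": "chicken", "packed": "2019-12-20"})
add_record(chain, {"step": "warehouse", "item": "chicken"})
print(verify(chain))             # True: the chain is intact
chain[0]["data"]["step"] = "?"   # tamper with the first record...
print(verify(chain))             # ...and verification fails: False
```

A production network adds permissioning, consensus across participants, and off-chain storage, but the tamper-evidence all rests on this chaining of hashes.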

To date, IBM reports more than 200 companies participate in Food Trust, the first network of its kind to connect participants across the food supply chain through a permanent and shared record of data. The result, according to the company, is a suite of solutions that improve food safety and freshness, unlock supply chain efficiencies, minimize waste, and empower consumers who care about where their food comes from.

Take chicken, for example. If you shop at the European grocery chain Carrefour, chicken is being tracked by IBM Food Trust alongside a mix of other foods, like eggs, milk, oranges, pork, and cheese. This selection of foods will grow by more than 100 items over the next year, says the company. And so popular is the blockchain-tracked chicken, claims IBM, that the grocer reports sales growth exceeding that of non-blockchain poultry.

Carrefour shoppers just use their smartphones to scan QR codes on the chicken’s packaging. There they will find the chicken’s date of birth, nutrition information, and packing date. Sounds interesting until my wife feels obligated to send the chicken a birthday gift. Customers also learn about the food’s journey from farm to store, providing additional transparency about the life and times of this chicken. It said nothing, however, about whether it lived a wild youth.

Maybe you wonder if your seafood is correctly labeled and sustainably caught. IBM is turning to blockchain to bring more trust and transparency to the supply chain of the fish and seafood we consume. Specifically, the Sustainable Shrimp Partnership now uses blockchain to trace the journey of Ecuadorian farmed shrimp and attest to the highest social and environmental standards.

Similarly, the seafood industry in Massachusetts is tracing the provenance of fresh scallops. It also allows consumers in restaurants to use a QR code to learn about the seafood’s quality and origin. That’s something I might actually do. Finally, the National Fisheries Institute has joined the Food Trust Network in an effort to trace multiple seafood species.

IBM is trying to do the same with coffee, pasta, mashed potatoes, and more. This is something that I might actually grow to rely on if it were readily available and dead simple. One question is how accessible this information will be when a shopper or diner really needs it. OK, we can all use QR codes as long as they are right in front of us. But beyond that, as a diner I’m too impatient to bother to do much more.

This blog has periodically been following blockchain for years, always expecting the technology to take off imminently. Maybe with Food Trust the technology will finally pick up some traction.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at http://technologywriter.com/ 

IBM Cloud Pak–Back to the Future

December 19, 2019

It had seemed that IBM was in a big rush to get everybody to cloud and hybrid cloud. But then, in a recent message, it turned out there was maybe not such a rush.

What that means is the company believes coexistence, with new and existing applications working together, will continue for some time to come. Starting at any point, new features may be added to existing applications. Eventually a microservices architecture should be exposed to new and existing applications. Whew; this is not something you should feel compelled to do today or next quarter or in five years, maybe not even in ten.

Here is more from the company earlier this month. Introducing its latest Cloud Paks as enterprise-ready cloud software, the company presents them as containerized software packaged with open source components, pre-integrated with common operational services (logging, monitoring, security, and identity access management) on a secure-by-design container platform. DancingDinosaur tried to keep up for a couple of decades but in recent years has given up. Thankfully, no one is counting on me to deliver the latest code fast.

IBM has been promoting packaged software and hardware for as long as this reporter has been following the company, which was when my adult married daughters were infants. (I could speed them off to sleep by reading them the latest IBM white paper I had just written for IBM or other tech giants. Don’t know if they retained or even appreciated any of that early client/server stuff but they did fall asleep, which was my intent.)

Essentially, IBM is offering Cloud Paks as enterprise-ready packages, already integrated with hardware and software and ready to deploy. It worked back then, and it will work now, I suspect, with the latest containerized systems, because systems are more complex than ever before, not less by a long shot. Unless you have continuously retained and retrained your best people while continually refreshing your toolset, you’ll find it hard to keep up. You will need pre-integrated, packaged, containerized cloud offerings that work right out of the box.

This is more than just selling you a pre-integrated bundle. This is back to the future; I mean way back. Called Cloud Pak for Data System, IBM is offering what it describes as a fusion of hardware and software. The company chooses the right storage and hardware, all purpose-built by IBM in one system. That amounts to convergence of storage, network, software, and data in a single system, all taken care of by IBM and deployed as containers and microservices. As I noted above, a deep trip back to the future.

IBM has dubbed it cloud-in-a-box. In short, this is an appliance. You can start very small, paying for what you use now. If later you want more, just expand it then. Am sure your IBM sales rep will be more than happy to provide you with the details. It appears from the briefing that there is an actual base configuration consisting of two enclosures with 32 or 128 TB. The company promises to install this and get you up and running in four hours, leaving only the final provisioning for you.

This works for existing mainframe shops too, at least those running Linux on the mainframe. LinuxONE shops are probably ideal. It appears all z shops will need is DB2 and maybe Netezza. Much of the work will be done off the mainframe, so at least you should save some MIPS.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at http://technologywriter.com/ 

This is the last appearance of DancingDinosaur this year. It will reappear in the week of Jan. 6, 2020. Best wishes for the holidays.

Syncsort Acquires Pitney Bowes Software & Data

December 10, 2019

It is easy to forget that there are other ISVs who work with the z. A recent list of z ISVs ran to over a dozen, including Rocket Software, Compuware, GT Software, and Syncsort, among others.

Syncsort has grabbed some attention of late by announcing the completion of an agreement with Pitney Bowes, the postal metering company, to take over its software and data operations. As a result, Syncsort claims a position as one of the leading data management software companies in the world, serving more than 11,000 primarily z customers.

The combined portfolio brings together capabilities in location intelligence, data enrichment, customer information management, and engagement solutions with powerful data integration and optimization software. About the only thing they haven’t listed is AI.

Over the coming months, teams will be working to combine the Syncsort-Pitney Bowes organizations and portfolios. While there may be some changes within the Syncsort organization, not much will change for its customers immediately. They can still expect to receive the same level of service they have received to support their everyday needs.

Syncsort’s acquisition of the Pitney Bowes software and data business creates a data management software company with more than 11,000 enterprise customers, $600 million in revenue, and 2,000 employees worldwide. Although modest in comparison with today’s Internet tech giants and even IBM, the resulting company brings sufficient scale, agility, and breadth of portfolio to enable leading enterprises to gain a competitive advantage from their data, Syncsort noted in its announcement.

“Enterprises everywhere are striving to increase their competitiveness through the strategic use of data…”  As a result, “organizations must invest in next-generation technologies like cloud, streaming, and machine learning, while simultaneously leveraging and modernizing decades of investment in traditional data infrastructure,” said Josh Rogers, CEO, Syncsort. Now “our increased scale allows us to expand the scope of partnerships with customers so that they can maximize the value of all their data,” he added.

According to Paige Bartley of 451 Research accompanying Syncsort’s announcement:  “The ability to derive actionable human intelligence from data requires ensuring that it has been integrated from all relevant sources, is representative and high quality, and has been enriched with additional context and information. Syncsort, as a longtime player in the data management space, is further addressing these issues with the acquisition of Pitney Bowes Software Solutions’ assets – technology that complements existing data-quality capabilities to provide additional context and enrichment for data, as well as leverage customer data and preferences to drive business outcomes.” 

These end-to-end capabilities, Syncsort adds, will empower organizations to overcome ever-increasing challenges around the integrity of their data, so their IT and business operations can easily integrate, enrich, and improve data assets to maximize insights.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at http://technologywriter.com/ 

IBM Suggests Astounding Productivity with Cloud Pak for Automation

November 25, 2019

DancingDinosaur thought IBM would not introduce another Cloud Pak until after the holidays, but I was wrong. Last week IBM launched Cloud Pak for Security. According to IBM, it helps an organization uncover threats, make more informed risk-based decisions, and prioritize its team’s time.

More specifically, it connects the organization’s existing data sources to generate deeper insights. In the process you can access IBM and third-party tools to search for threats across any cloud or on-premises location, then quickly orchestrate actions and responses to those threats while leaving your data where it is.

DancingDinosaur’s only disappointment in IBM’s new security Cloud Pak, as with other IBM Cloud Paks, is that it runs only on Linux. That means it doesn’t work with RACF, the legendary IBM access control tool for z/OS. IBM’s Cloud Paks reportedly run on z systems, but only those running Linux. Not sure how IBM can finesse this particular issue.

Of the five original IBM Cloud Paks (application, data, integration, multicloud management, and automation), only one offers the kind of payback that will wow top C-level execs: automation. Find Cloud Pak for Automation here.

To date, IBM reports over 5,000 customers have used IBM Digital Business Automation to run their digital business. At the same time, IBM claims successful digitization has increased organizational scale and fueled growth of knowledge work.

McKinsey & Company notes that such workers spend up to 28 hours each week on low value work. IBM’s goal with digital business automation is to bring digital scale to knowledge work and free these workers to work on high value tasks.

Such tasks include collaborating and using creativity to come up with new ideas, meeting and building relationships with clients, or resolving issues and exceptions. By applying intelligent automation to the low-value work, says IBM, the payoff can be staggering.

“We can reclaim 120 billion hours a year spent by knowledge workers on low value work by using intelligent automation,” declares IBM. So what value can you reclaim over the course of a year for your operation with, say, 100 knowledge workers earning maybe $22 per hour, or maybe 1,000 workers earning $35 an hour? You can do the math.

As you would expect, automation is the critical component of this particular Cloud Pak. The main targets for enhancement or assistance among the rather broad category of knowledge workers are administrative/departmental work and expert work, which includes cross-enterprise work. IBM offers vendor management as one example.

The goal is to digitize core services by automating at scale and building low code/no code apps for your knowledge workers. For what IBM refers to as digital workers, who are key to this plan, the company wants to free them for higher value work. IBM’s example of such an expert worker would be a loan officer. 

Central to IBM’s Cloud Pak for Automation is what IBM calls its Intelligent Automation Platform. Some of this is here now, according to the company, with more coming in the future. Here now is the ability to create apps using low code tooling, reuse assets from business automation workflow, and create new UI assets.

Coming up in some unspecified timeframe is the ability to enable digital workers to automate job roles, to define and create content services that enable intelligent capture and extraction, and finally to envision and create decision services that offload and automate routine decisions.

Are your current and would-be knowledge workers ready to contribute to or participate in this scheme? Maybe for some; it depends for others. To capture those billions of hours of increased productivity, however, they will have to step up to it. But you can be pretty sure IBM will do it for you if you ask.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at http://technologywriter.com/ 

 

IBM Cloud Pak Rollouts Continue

November 14, 2019

IBM Cloud Paks have emerged as a key strategy by the company to grow not just its cloud, but more importantly, its hybrid cloud business. For the past year or so, IBM shifted its emphasis from clouds to hybrid clouds. No doubt this is driven by its realization that its enterprise clients are adopting multiple clouds, necessitating the hybrid cloud.

The company is counting on success in hybrid clouds. For years IBM has scrambled to claw out a place for itself among the top cloud players, but in the time DancingDinosaur has tracked IBM’s cloud presence it has never risen higher than third. In 2019, the top cloud providers are AWS, Microsoft, Google, IBM, Oracle, and Alibaba, with IBM slipping to fourth in one analyst’s ranking.

Hybrid clouds, over time, can change the dynamics of the market. They have not, however, changed things much according to a ranking from Datamation. “There are too many variables to strictly rank hybrid cloud providers,” notes Datamation. That said, Datamation still ranked them, starting with Amazon Web Services (AWS), which remains the unquestioned leader of the business with twice the market share of its next-closest competitor, Microsoft Azure, followed by IBM. The company is counting on its Red Hat acquisition, which includes OpenShift along with Enterprise Linux, to alter its market standing.

The hybrid cloud segment certainly encompasses a wider range of customer needs, so there are ways IBM can work Red Hat to gain advantages in pricing and packaging, which it has already signaled it can and will do, starting with OpenShift. DancingDinosaur doubts it will overtake AWS outright, but as noted above, hybrid clouds are a different beast. So don’t rule out IBM in the hybrid cloud market.

Another thing that may give IBM an edge in hybrid clouds among its enterprise customers is its Cloud Paks. As IBM describes them, Cloud Paks are enterprise-ready, containerized software that gives organizations an open, faster, and more secure way to move core business applications to any cloud. Each IBM Cloud Pak runs on Red Hat OpenShift, IBM Cloud, and Red Hat Enterprise Linux.

Each pak includes containerized IBM middleware and common software services for development and management. Also included is a common integration layer designed to reduce development time by up to 84 percent and operational expenses by up to 75 percent, according to IBM.

Cloud Paks, IBM continues, let you easily deploy modern enterprise software on-premises, in the cloud, or on pre-integrated systems, and quickly bring workloads to production by leveraging Kubernetes as the container management framework, with production-level qualities of service and end-to-end lifecycle management. This gives organizations an open, faster, more secure way to move core business applications to any cloud.

When IBM introduced Cloud Paks a few weeks ago, it planned a suite of five:  

  • Application
  • Data
  • Integration
  • Automation
  • Multicloud Management

Don’t be surprised as hybrid cloud usage evolves if even more Cloud Paks eventually appear. It becomes an opportunity for IBM to bundle together more of its existing tools and products and send them to the cloud too.


Has Google Achieved Quantum Supremacy?

October 31, 2019

Google said it did last week. “If true, it is big news,” writes Chelsea Whyte in New Scientist. Quantum computers have the potential to change the way organizations design new materials, work out logistics, build artificial intelligence, and break encryption. That is why firms like Google, Intel, and IBM, along with plenty of start-ups, have been racing to reach this crucial milestone, Whyte continued.

Or maybe Google hasn’t. Google claims its 53-qubit computer performed, in 200 seconds, an arcane task that would take 10,000 years on Summit, currently the world’s fastest supercomputer, which IBM built for the Department of Energy.


As Adrian Cho reports in Science Magazine, IBM announced on 21 October that, by tweaking the way Summit approaches the task, it can do it far faster: in 2.5 days. Therefore, IBM insists, the threshold for quantum supremacy (doing something a classical computer cannot) has still not been met.

Somehow, when the industry reaches the point of quantum supremacy, DancingDinosaur suspects, it won’t be with a 53-qubit device. That’s easily within range of the biggest quantum computers already available. The problem Google solved involved random numbers; specifically, it tackled a random sampling problem, that is, checking that a set of numbers has a truly random distribution. Reportedly, this is very difficult for a traditional computer when there are a lot of numbers involved. Maybe I’m still a Z big iron bigot, but breaking this threshold should take hundreds if not thousands of qubits, according to published pieces.
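To give a feel for the kind of task described, here is a minimal classical sketch of checking whether a set of numbers looks uniformly random, using a chi-square statistic. This is only an illustration of the general idea; Google’s actual benchmark sampled the output distribution of a quantum circuit, which is a much harder problem:

```python
# Sketch of a classical randomness check: bin a sample of numbers in [0, 1)
# and compute a chi-square statistic against the uniform distribution.
# Large statistics suggest the sample is unlikely to be uniformly random.

import random

def chi_square_uniform(samples, bins=10):
    """Chi-square statistic of `samples` (values in [0, 1)) vs. uniform."""
    counts = [0] * bins
    for x in samples:
        counts[min(int(x * bins), bins - 1)] += 1
    expected = len(samples) / bins
    return sum((c - expected) ** 2 / expected for c in counts)

data = [random.random() for _ in range(10_000)]
# For 10 bins (9 degrees of freedom), values far above ~17 would be
# suspicious; a good generator usually lands near 9.
print(chi_square_uniform(data))
```

The classical cost here grows with the number of samples and bins; the quantum-supremacy variant is hard precisely because the target distribution itself comes from simulating a quantum circuit.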

But this raises an interesting point. If IBM can tweak its best supercomputer to reduce a process that would take years to 2.5 days, why didn’t they do it earlier? And how many other such inefficiencies are lying around that they could streamline right now?

The race for quantum supremacy may not equate with effective quantum computing or even competitively priced systems. Except for a couple of newcomers, nobody is talking about price. IBM offered some free time on its smallest qubit machines in the cloud for those who joined its quantum program, but for how long? With Google, Intel, HP, and IBM, along with a handful of newcomers like D-Wave and Rigetti and any startups that pop up, the race to quantum supremacy surely will be competitive, but who knows what the cost will be.

DancingDinosaur’s guess is that, given the players currently involved, the quantum computing market will look a lot like today’s enterprise computing market minus the startups. As for pricing, beyond IBM’s promotional offer of free time on one of its cloud-based quantum machines, these systems can’t be cheap when they finally become available.

Just the cooling required to keep a quantum machine stable will be costly. IBM is taking the right approach for now: put some machines in the cloud, where it can provide and manage all the infrastructure required to support them. Don’t expect to ever buy one at Best Buy and plug it into your data center.

And then there is the talent problem.  Skilled quantum programmers and technicians probably don’t post their availability on Indeed. You’ll need a few PhDs in mathematics or physics, at the least, to get you started. Suggest you get in line at MIT or Stanford.


