Posts Tagged ‘Linux’

IBM Wazi cloud-native devops for Z

June 12, 2020

In this rapidly evolving world of hybrid and multicloud systems, organizations must quickly evolve their processes and tooling to address business needs. That is especially true for organizations whose development environments include IBM Z as part of their hybrid solution, says Sanjay Chandru, Director, IBM Z DevOps.

IBM’s goal, then, is to provide a cloud-native developer experience for IBM Z that is consistent and familiar to all developers. That requires cross-platform consistency in tooling for application programmers on Z, who will need to deliver innovation faster and without the backlogs that have been expected in the past.

Wazi, along with OpenShift, is another dividend from IBM’s purchase of Red Hat. Here is where IBM Wazi for Red Hat CodeReady Workspaces comes in: an add-on to IBM Cloud Pak for Applications. It allows developers to use an industry-standard integrated development environment (IDE), such as Microsoft Visual Studio Code (VS Code) or Eclipse, to develop and test IBM z/OS applications in a containerized, virtual z/OS environment on Red Hat OpenShift running on x86 hardware. The container creates a sandbox.

The combination of Wazi and IBM Cloud Pak for Applications goes beyond what Zowe, the Open Mainframe Project’s open source framework for z/OS, offers, enabling Z development and operations teams to securely manage, control, script, and develop on the mainframe like any other cloud platform. Developers who are not used to z/OS and IBM Z, which is to say most developers, can now become productive faster in a familiar and accessible working environment, effectively improving DevOps adoption across the enterprise.
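
To make “script on the mainframe like any other cloud platform” concrete, here is a minimal, hypothetical sketch in Python: it lists data sets through the z/OSMF REST interface that Zowe builds on. The host name, credentials, and high-level qualifier are placeholders of my own, not anything from IBM’s announcement.

    # Minimal sketch: list z/OS data sets via the z/OSMF REST files API.
    # Host, credentials, and the high-level qualifier below are placeholders.
    import requests

    ZOSMF_HOST = "https://zosmf.example.com"   # hypothetical z/OSMF endpoint
    AUTH = ("ibmuser", "secret")               # placeholder credentials

    def list_data_sets(hlq):
        """Return the names of data sets under a high-level qualifier."""
        resp = requests.get(
            f"{ZOSMF_HOST}/zosmf/restfiles/ds",
            params={"dslevel": hlq},
            headers={"X-CSRF-ZOSMF-HEADER": ""},   # header required by z/OSMF REST services
            auth=AUTH,
            verify=False,                          # sandbox only; use real certificates in practice
        )
        resp.raise_for_status()
        return [item["dsname"] for item in resp.json().get("items", [])]

    if __name__ == "__main__":
        for name in list_data_sets("IBMUSER"):
            print(name)

The same pattern extends to submitting jobs or pulling members into a Git repository, which is exactly the kind of scripting a Git-based CI/CD pipeline can drive.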

As IBM explained: Wazi integrates seamlessly into a standard, Git-based open toolchain to enable continuous integration and continuous delivery (CI/CD) as part of a fully hybrid DevOps process encompassing distributed and Z systems.

IBM continues: Wazi is offered with deployment choices so that organizations can flexibly rebalance entitlement over time based on their business needs. In short, the organization can protect and leverage its IBM Z investments with robust and standard development capabilities that encompass IBM Z and multicloud platforms.

The payoff comes as developers who are NOT used to z/OS and IBM Z, which is most of the developer world, can become productive faster in a familiar and accessible working environment while improving DevOps adoption across the enterprise. IBM Wazi integrates seamlessly into a standard, Git-based open toolchain to deliver CI/CD and is offered with deployment choices so that any organization can flexibly rebalance over time based on its business needs. In short, you are protecting and leveraging your IBM Z investments with robust and standard development capabilities that encompass the Z and multicloud platforms.

As one large IBM customer put it: “We want to make the mainframe accessible. Use whatever tool you are comfortable with – Eclipse / IDz / Visual Studio Code. All of these things we are interested in to accelerate our innovation on the mainframe.”

An IT service provider added in IBM’s Wazi announcement: “Our colleagues in software development have been screaming for years for a dedicated testing environment that can be created and destroyed rapidly.” Well, now they have it in Wazi.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at http://technologywriter.com/

Apps and Ecosystem Critical for 5G Edge Success

May 18, 2020

According to the gospel of IBM, edge computing with 5G creates opportunities in every industry. It brings computation and data storage closer to where data is generated, enabling better data control, reduced costs, faster insights and actions, and continuous operations.

Edge computing IBM Cloud Architecture

By 2025, 75% of enterprise data is expected to be processed on devices at the edge, where it can be handled more efficiently, compared to only 10% today. Edge computing eliminates the need to relay data that is acquired, and often used for decision making, in the field back to a data center for processing and storage.

In short, the combination of 5G and smart devices on the edge aids this growing flow of data and processing through the proliferation of a variety of clouds: private, public, multi, and hybrid. But more is needed.

To get things rolling, IBM announced a handful of applications and tools and an edge ecosystem. As IBM notes: organizations across industries can now fully realize the benefits of edge computing, including running AI and analytics at the edge to achieve insights closer to where the work is done and the results applied. These new solutions include:

  • IBM Edge Application Manager – an autonomous management tool to enable AI, analytics and IoT enterprise workloads to be deployed and remotely managed, delivering real-time analysis and insight at scale. It aims to enable the management of up to 10,000 edge nodes simultaneously by a single administrator. It is the first to be powered by Open Horizon, which has been folded into the Linux Foundation. 
  • IBM Telco Network Cloud Manager – runs on Red Hat OpenShift and Red Hat OpenStack, a cloud computing platform that virtualizes resources from industry-standard hardware, organizes them into clouds, and manages them to provide new services now and going forward as 5G adoption expands.
  • A portfolio of edge-enabled applications and services, including IBM Visual Insights, IBM Production Optimization, IBM Connected Manufacturing, IBM Asset Optimization, IBM Maximo Worker Insights and IBM Visual Inspector. All aim to deliver the flexibility to deploy AI and cognitive applications and services at the edge and at scale. 
  • Red Hat OpenShift, which manages containers with automated installation, upgrades, and lifecycle management throughout the container stack (the operating system, Kubernetes cluster services, and applications) on any cloud; a minimal sketch of querying such a cluster appears after this list.
  • Dedicated IBM Services teams for edge computing and telco network clouds that draw on IBM’s expertise to deliver 5G and edge-enabled capabilities across all industries.
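
As a small illustration of the OpenShift/Kubernetes management angle in the list above, the sketch below uses the official kubernetes Python client to summarize the state of a cluster that could be running at an edge site. The kubeconfig context and cluster details are assumptions of mine, not anything specific to IBM’s announced products.

    # Minimal sketch: summarize an OpenShift/Kubernetes edge cluster using the
    # official kubernetes Python client and whatever kubeconfig is active.
    from kubernetes import client, config

    def summarize_cluster():
        config.load_kube_config()              # reads the current kubeconfig context
        v1 = client.CoreV1Api()

        nodes = v1.list_node().items
        print(f"{len(nodes)} nodes in this cluster")

        pods = v1.list_pod_for_all_namespaces(watch=False).items
        running = sum(1 for p in pods if p.status.phase == "Running")
        print(f"{running}/{len(pods)} pods running")

    if __name__ == "__main__":
        summarize_cluster()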

In addition, IBM is announcing the IBM Edge Ecosystem, through which an increasingly broad set of ISVs, GSIs and more will be helping enterprises capture the opportunities of edge computing with a variety of solutions built upon IBM’s technology. IBM is also creating the IBM Telco Network Cloud Ecosystem, bringing together a set of partners across the telecommunications industry that offer a breadth of network functionality that helps providers deploy their network cloud platforms. 

These open ecosystems of equipment manufacturers, networking and IT providers, and software providers include Cisco, Dell Technologies, Juniper Networks, Intel, NVIDIA, Samsung, Packet (an Equinix company), Hazelcast, Sysdig, Turbonomic, Portworx, Humio, Indra Minsait, Eurotech, Arrow Electronics, ADLINK, Acromove, Geniatech, SmartCone, CloudHedge, Altiostar, Metaswitch, F5 Networks, and ADVA as members. 

Making the promise of edge computing a reality requires an open ecosystem with diverse participants. It also requires open standards-based, cloud native solutions that can be deployed and autonomously managed at massive scale throughout the edge and can move data and applications seamlessly between private data centers, hybrid multiclouds, and the edge. IBM has already enlisted dozens of organizations in what it describes as its open edge ecosystem.  You can try to join the IBM ecosystem or start organizing your own.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at http://technologywriter.com/

5G Joins Edge Technology and Hybrid Multicloud

May 11, 2020

At IBM’s virtual Think Conference the first week in May the company made a big play for edge computing and 5G together. 

From connected vehicles to intelligent manufacturing equipment, the amount of data from devices has resulted in unprecedented volumes of data at the edge. IBM is convinced the data volumes will compound as 5G networks increase the number of connected mobile devices.

z15 T02 and the LinuxONE III LT2

Edge computing and 5G networks promise to reduce latency while improving speed, reliability, and processing. This will deliver faster and more comprehensive data analysis, deeper insights, faster response times, and improved experiences for employees, customers, and their customers.

First gaining prominence with the Internet of Things (IoT) a few years back, edge computing is defined by IBM as a distributed computing framework that brings enterprise applications closer to where data is created and often remains, and where it can be processed. This is where decisions are made and actions taken.

5G stands for the Fifth Generation of cellular wireless technology. Beyond higher speed and reduced latency, 5G standards will have a much higher connection density, allowing networks to handle greater numbers of connected devices combined with network slicing to isolate and protect designated applications.

Today, 10% of data is processed at the edge, an amount IBM expects to grow to 75% by 2025. Specifically, edge computing enables:

  • Better data control and lower costs by minimizing data transport to central hubs and reducing vulnerabilities and costs
  • Faster insights and actions by tapping into more sources of data and processing that data there, at the edge
  • Continuous operations by enabling systems that run autonomously, reduce disruption, and lower costs because data can be processed by the devices themselves on the spot and where decisions can be made

In short, the growing number of increasingly capable devices, faster 5G processing, and increased pressure combine to drive the edge computing market beyond what the initial IoT proponents, who didn’t yet have 5G, envisioned. They also weren’t in a position to imagine the growth in the processing capabilities of edge devices in just the past year or two.

But that is starting to happen now, according to IDC: By 2023, half of the newly deployed on-premises infrastructure will be in critical edge locations rather than corporate datacenters, up from less than 10% today.

Also unimagined was the emergence of the hybrid multicloud, which IBM has only recently started to tout. The convergence of 5G, edge computing, and hybrid multicloud, according to the company, is redefining how businesses operate. As more organizations embrace 5G and edge, modernizing networks to take advantage of the edge opportunity is only now becoming feasible. 

And all of this could play very well with the new z machines, the z15 T02 and LinuxONE III LT2. These appear to be sufficiently capable to handle the scale of business edge strategies and hybrid cloud requirements for now. Or the enterprise-class z15 if you need more horsepower.

By moving to a hybrid multicloud model, telcos can process data at both the core and network edge across multiple clouds, perform cognitive operations and make it easier to introduce and manage differentiated digital services. As 5G matures it will become the network technology that underpins the delivery of these services. 

For enterprises, adopting a hybrid multicloud model that extends from corporate data centers (or public and private clouds) to the edge is critical to unlocking new connected experiences. By extending cloud computing to the edge, enterprises can perform AI/analytics faster, run enterprise apps to reduce impacts from intermittent connectivity, and minimize data transport to central hubs for cost efficiency. 

Deploying a hybrid multicloud model from corporate data centers to the edge is central to capitalizing on new connected experiences. By extending cloud computing to the edge, organizations can run AI/analytics faster while minimizing data transport to central hubs for cost efficiency. By 2023, half of the newly deployed on-premises infrastructure will be in critical edge locations rather than corporate datacenters, up from less than 10% today. It’s time to start thinking about making edge part of your computing strategy. 

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at http://technologywriter.com/ 

Red Hat OpenShift Container Platform on z

February 20, 2020

IBM is finally starting to capitalize, for z shops, on last year’s $34 billion acquisition of Red Hat. If you had a new z and it ran Linux, you would have no problem running Red Hat products, or so the company line went. Well, in mid-February IBM announced that Red Hat’s OpenShift Container Platform is now available on the z and LinuxONE, a z with built-in Linux optimized for the underlying hardware.

OpenShift comes to z and LinuxONE

As the company puts it: The availability of OpenShift for z and LinuxONE is a major milestone for both hybrid multicloud and enterprise computing. OpenShift, a form of middleware for use with DevOps, supports cloud-native applications being built once and deployed anywhere, including on-premises enterprise servers, especially the z and LinuxONE. This new release results from the collaboration between IBM and Red Hat development teams, and discussions with early adopter clients.

As part of its hybrid cloud strategy, the company has created a roadmap for bringing the ecosystem of enterprise software to the OpenShift platform. IBM Cloud Paks containerize key IBM and open source software components to help enable faster enterprise application development and delivery. In addition to the availability of OpenShift for z, IBM also announced that IBM Cloud Pak for Applications is available for the z and LinuxONE. In effect, it supports the modernization of existing apps and the building of new cloud-native apps. In addition, as announced last August, it is the company’s intention to deliver additional Cloud Paks for the z and LinuxONE.

Red Hat is a leader in hybrid cloud and enterprise Kubernetes, with more than 1,000 customers already using Red Hat OpenShift Container Platform. With the availability of OpenShift for the z and LinuxONE, the agile cloud-native world of containers and Kubernetes, which has become the de facto open global standard for containers and orchestration, is now reinforced by the security features, scalability, and reliability of IBM’s enterprise servers.

“Containers are the next generation of software-defined compute that enterprises will leverage to accelerate their digital transformation initiatives,” says Gary Chen, Research Director at IDC, in a published report.  “IDC estimates that 71% of organizations are in the process of implementing containers and orchestration or are already using them regularly. IDC forecasts that the worldwide container infrastructure software opportunity is growing at a 63.9% five-year CAGR and is predicted to reach over $1.5B by 2022.”
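
As a quick, purely illustrative check of those IDC numbers, the snippet below works backward from the roughly $1.5B 2022 figure at a 63.9% compound annual growth rate; the implied starting value and the intermediate years are my own arithmetic, not IDC data.

    # Back-of-the-envelope check of the quoted IDC figures (illustrative only).
    cagr = 0.639            # 63.9% compound annual growth rate
    end_value = 1.5e9       # roughly $1.5B by 2022, per the quoted forecast
    years = 5

    start_value = end_value / (1 + cagr) ** years
    print(f"Implied starting market size: ${start_value / 1e6:.0f}M")

    # Growing the starting value forward at the same rate reproduces the forecast.
    for year in range(years + 1):
        value = start_value * (1 + cagr) ** year
        print(2017 + year, f"${value / 1e9:.2f}B")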

By combining the agility and portability of Red Hat OpenShift and IBM Cloud Paks with the security features, scalability, and reliability of z and LinuxONE, enterprises will have the tools to build new cloud-native applications while also modernizing existing applications. Deploying Red Hat OpenShift and IBM Cloud Paks on z and LinuxONE reinforces key strengths and offers additional benefits:

  • Vertical scalability enables existing large monolithic applications to be containerized, and horizontal scalability enables support for large numbers of containers in a single z or LinuxONE enterprise server
  • Protection of data from external attacks and insider threats, with pervasive encryption and tamper-responsive protection of encryption keys
  • Availability of 99.999% to meet service levels and customer expectations (see the quick downtime arithmetic after this list)
  • Integration and co-location of cloud-native applications on the same system as the data, ensuring the fastest response times
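
For a sense of what that availability bullet means in practice, here is the quick downtime arithmetic: 99.999% ("five nines") availability allows only about five minutes of unplanned downtime per year.

    # Downtime budget implied by a given availability level.
    minutes_per_year = 365.25 * 24 * 60

    for availability in (0.999, 0.9999, 0.99999):
        downtime = minutes_per_year * (1 - availability)
        print(f"{availability:.5%} availability -> {downtime:.1f} minutes of downtime per year")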

IBM z/OS Cloud Broker helps enable OpenShift applications to interact with data and applications on IBM Z. IBM z/OS Cloud Broker is the first software product to provide the broader development community with access to z/OS services.

To more easily manage the resulting infrastructure, organizations can license the IBM Cloud Infrastructure Center. This is an Infrastructure-as-a-Service offering that provides simplified infrastructure management in support of z/VM-based Linux virtual machines on the z and LinuxONE.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at http://technologywriter.com/

Meet IBM’s New CEO

February 6, 2020

Have to admire Ginni Rometty. She survived 19 consecutive losing quarters (one quarter shy of 5 years), which DancingDinosaur and the rest of the world covered with monotonous regularity, and she was not bounced out until this January. Memo to readers: Keep that in mind if you start feeling performance heat from top management. Can’t imagine another company that would tolerate it, but what do I know.

Arvind Krishna becomes the Chief Executive Officer and a member of the IBM Board of Directors effective April 6, 2020. Krishna is currently IBM Senior Vice President for Cloud and Cognitive Software, and was a principal architect of the company’s acquisition of Red Hat. The cloud/Red Hat strategy has only just started to show signs of payback.

As IBM writes: Under Rometty’s leadership, IBM acquired 65 companies, built out key capabilities in hybrid cloud, security, industry and data, and AI both organically and inorganically, and successfully completed one of the largest technology acquisitions in history (Red Hat).  She reinvented more than 50% of IBM’s portfolio, built a $21 billion hybrid cloud business and established IBM’s leadership in AI, quantum computing, and blockchain, while divesting nearly $9 billion in annual revenue to focus the portfolio on IBM’s high value, integrated offerings. Part of that was the approximately $34 billion Red Hat acquisition, IBM’s, and possibly the IT industry’s, biggest to date. Rometty isn’t going away all that soon; she continues in some executive Board position.

It is way too early to get IBM 1Q2020 results, which will be the last quarter of Rometty’s reign. The fourth quarter of 2019, at least, was positive, especially after all those quarters of revenue loss. The company reported $21.8 billion in revenue, up 0.1 percent. Red Hat revenue was up 24 percent. Cloud and cognitive systems were up 9 percent while systems, which includes the z, was up 16 percent. 

Total cloud revenue, the new CEO Arvind Krishna’s baby, was up 21 percent. Even with z revenue up more than cloud and cognitive systems, it is unlikely IBM will easily find a buyer for the z anytime soon. If IBM dumps it, it will probably have to pay somebody to take it, despite the z’s faithful, profitable blue chip customer base. 

Although the losing streak has come to an end, Krishna still faces some serious challenges. For example, although DancingDinosaur has been enthusiastically cheerleading quantum computing as the future, there is no proven business model there. Except for some limited adoption by a few early adopters, there is no widespread groundswell of demand for quantum computing, and the technology has not yet proven itself useful. Also, there is no ready pool of skilled quantum talent. If you wanted to try quantum computing, would you even know what to try or where to find skilled people?

Even in cloud computing, where IBM is finally starting to show some progress, the company has yet to penetrate the top tier of players. These players (Amazon, Google, Microsoft/Azure) are not likely to concede market share.

So here is DancingDinosaur’s advice to Krishna: Be prepared to scrap for every point of cloud share and be prepared to spin a compelling case around quantum computing. Finally, don’t give up the z until the accountants and lawyers force you to, which they will undoubtedly insist on. To the contrary, slash z prices and make the machine an irresistible bargain. 

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at http://technologywriter.com/ 

Montana Sidelines the Mainframe

January 21, 2020

Over the past 20+ years DancingDinosaur has written this story numerous times. It never ends exactly the way they think it will. Here is the one I encountered this past week.

IBM z15

But that doesn’t stop the PR writers from finding a cute way to write the story. This time the writers turned to references to the moon landings and trilby hats (huh?). Looks like a fedora to me, but what do I know; I only wear baseball hats. But they always have to come up with something that makes the mainframe sound completely outdated. In this case they wrote: Mainframe computers, a technology that harkens back to an era of moon landings and men in trilby hats, are still widely used throughout government, but not in Montana for much longer.

At least they didn’t write that the mainframe was dead and gone or forgotten. Usually, I follow up on stories like this months later and call whichever IT person is still there. I congratulate him or her and ask how it went. That’s when I usually start hearing ums and uhs. It turns out the mainframe is still there, handling those last few jobs they just can’t replace yet.

Depending on how playful I’m feeling that day, I ask him or her what happened to the justification presented at the start of the project. Or I might ask what happened to the previous IT person. 

Sometimes, I might even refer them to a recent DancingDinosaur piece that explains Linux on the mainframe or Java, or describes mainframes running the latest Docker container technology or microservices. I’m not doing this for spite; I’m just trying to build up my readership. DancingDinosaur hates losing any reader, even if it’s late in their game. So I always follow up with a link to DancingDinosaur.

In an interview published by StateScoop, Chief Information Officer Tim Bottenfield described how, for the last several years, the last remaining agencies using the state’s mainframe have migrated their data away from it and are now developing modern applications that can be moved to the state’s private and highly virtualized cloud environment. By spring 2021, Montana expects to be mainframe-free. I will make a note to call Bottenfield in spring 2021 and see how they are doing. Does anyone want to bet whether the mainframe actually is completely out of service and gone by then?

As you all know, mainframes can be expensive to maintain, particularly if it’s just to keep a handful of applications running, which usually turn out to be mission-critical applications. Of the three major applications Montana still runs on its mainframe, two are used by the Montana Department of Public Health and Human Services, which is in the process of recoding those programs to work on modern platforms, as if the z15 isn’t modern.

They haven’t told us whether these applications handle payments or deliver critical services to citizens. Either way, it will not be pretty if such applications go down. The third is the state’s vehicle titling and registration system, which is being rebuilt to run out of the state’s data center. Again, we don’t know much about the criticality of these systems. But think how you might feel if you can’t get accurate or timely information from one of these systems. I can bet you wouldn’t be a happy camper; neither would I.

Systems like these are difficult to get right the first time, if at all. This is especially true if you will be using the latest hybrid cloud and services technologies. Yes, skilled mainframe people are hard to find and retain but so are any technically skilled and experienced people. If I were a decade younger, I could be attracted to the wide open spaces of Montana as a relief from the congestion of Boston. But I’m not the kind of hire Montana needs or wants. Stay tuned for when I check back in Spring 2021.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at http://technologywriter.com/ 

IBM Cloud Pak–Back to the Future

December 19, 2019

It had seemed that IBM was in a big rush to get everybody to cloud and hybrid cloud. But then, in a recent message, it turned out there is maybe not such a rush after all. 

What that means is the company believes coexistence will involve new and existing applications working together for some time to come. Starting at any point, new features may be added to existing applications. Eventually, a microservices architecture should be exposed to new and existing applications. Whew, this is not something you should feel compelled to do today or next quarter or in five years, maybe not even in 10 years.


Here is more from the company earlier this month. When introducing its latest Cloud Paks as enterprise-ready cloud software, the company presents them as containerized software packaged with open source components, pre-integrated with common operational services, and built on a secure-by-design container platform with operational services consisting of logging, monitoring, security, and identity access management. DancingDinosaur tried to keep up for a couple of decades but in recent years has given up. Thankfully, no one is counting on me to deliver the latest code fast.

IBM has been promoting packaged software  and hardware for as long as this reporter has been following the company, which was when my adult married daughters were infants. (I could speed them off to sleep by reading them the latest IBM white paper I had just written for IBM or other tech giants. Don’t know if they retained or even appreciated any of that early client/server stuff but they did fall asleep, which was my intent.)

Essentially, IBM is offering enterprise-ready Cloud Paks, already packaged and integrated with hardware and software, ready to deploy. It worked back then, and it will work now, I suspect, with the latest containerized systems, because systems are more complex than ever before, not less by a long shot. Unless you have continuously retained and retrained your best people while continually refreshing your toolset, you’ll find it hard to keep up. You will need pre-integrated, packaged, containerized cloud offerings that work right out of the box. 

This is more than just selling you a pre-integrated bundle. This is back to the future; I mean way back. Called Cloud Pak for Data System, IBM is offering what it describes as a fusion of hardware and software. The company chooses the right storage and hardware, all purpose-built by IBM in one system. That amounts to a convergence of storage, network, software, and data in a single system, all taken care of by IBM and deployed as containers and microservices. As I noted above, a deep trip back to the future.

IBM has dubbed it Cloud-in-a-box. In short, this is an appliance. You can start very small, paying for what you use now. If later you want more, just expand it then. I am sure your IBM sales rep will be more than happy to provide you with the details. It appears from the briefing that there is an actual base configuration consisting of two enclosures with 32 or 128 TB. The company promises to install this and get you up and running in four hours, leaving only the final provisioning for you.

This works for existing mainframe shops too, at least those running Linux on the mainframe. LinuxONE shops are probably ideal. It appears all z shops will need is DB2 and maybe Netezza. Much of the work will be done off the mainframe, so at least you should save some MIPS.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at http://technologywriter.com/ 

This is the last appearance of DancingDinosaur this year. It will reappear in the week of Jan. 6, 2020. Best wishes for the holidays.

IBM Suggests Astounding Productivity with Cloud Pak for Automation

November 25, 2019

DancingDinosaur thought IBM would not introduce another Cloud Pak until after the holidays, but I was wrong. Last week IBM launched Cloud Pak for Security. According to IBM, it helps an organization uncover threats, make more informed risk-based decisions, and prioritize its team’s time. 

More specifically, it connects the organization’s existing data sources to generate deeper insights. In the process, you can access IBM and third-party tools to search for threats across any cloud or on-premises location and quickly orchestrate actions and responses to those threats while leaving your data where it is.

DancingDinosaur’s only disappointment with IBM’s new security Cloud Pak, as with other IBM Cloud Paks, is that it runs only on Linux. That means it doesn’t run RACF, the legendary IBM access control tool for z/OS. IBM’s Cloud Paks reportedly run on z Systems, but only those running Linux. Not sure how IBM can finesse this particular issue. 

Of the five original IBM Cloud Paks (application, data, integration, multicloud management, and automation), only one offers the kind of payback that will wow top C-level execs: automation. Find Cloud Pak for Automation here.

To date, IBM reports over 5,000 customers have used IBM Digital Business Automation to run their digital business. At the same time, IBM claims successful digitization has increased organizational scale and fueled growth of knowledge work.

McKinsey & Company notes that such workers spend up to 28 hours each week on low value work. IBM’s goal with digital business automation is to bring digital scale to knowledge work and free these workers to work on high value tasks.

Such tasks include collaborating and using creativity to come up with new ideas, meeting and building relationships with clients, or resolving issues and exceptions. By automating the low-value work with intelligent automation, says IBM, the payoff can be staggering.

“We can reclaim 120 billion hours a year spent by knowledge workers on low value work by using intelligent automation,” declares IBM. So what value can you reclaim over the course of the year for your operation with, say, 100 knowledge workers earning, maybe, $22 per hour, or maybe 1,000 workers earning $35/hr? You can do the math, or see the rough sketch below. 
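
Here is one way to do that math. The 28 hours per week of low-value work comes from the McKinsey figure above; the head counts and hourly rates are the ones floated in the paragraph, while the share of those hours actually reclaimed (25%) and the 48 working weeks per year are my own assumptions, so treat the output as illustrative only.

    # Rough illustration only. 28 hours/week of low-value work is the McKinsey
    # figure quoted above; the 25% reclaimed share and 48 working weeks are assumptions.
    def reclaimed_value(workers, hourly_rate, low_value_hours_per_week=28,
                        share_reclaimed=0.25, weeks_per_year=48):
        hours = workers * low_value_hours_per_week * weeks_per_year * share_reclaimed
        return hours * hourly_rate

    print(f"100 workers at $22/hr:   ${reclaimed_value(100, 22):,.0f} per year")
    print(f"1,000 workers at $35/hr: ${reclaimed_value(1000, 35):,.0f} per year")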

As you would expect, automation is the critical component of this particular Cloud Pak. The main targets for enhancement or assistance among the rather broad category of knowledge workers are administrative/departmental work and expert work, which includes cross-enterprise work. IBM offers vendor management as one example.

The goal is to digitize core services by automating at scale and building low-code/no-code apps for your knowledge workers. The company wants to free what it refers to as digital workers, who are key to this plan, for higher-value work. IBM’s example of such an expert worker would be a loan officer. 

Central to IBM’s Cloud Pak for Automation is what IBM calls its Intelligent Automation Platform. Some of this is here now, according to the company, with more coming in the future. Here now is the ability to create apps using low code tooling, reuse assets from business automation workflow, and create new UI assets.

Coming up in some unspecified timeframe is the ability to enable digital workers to automate job roles, to define and create content services that enable intelligent capture and extraction, and finally to envision and create decision services to offload and automate routine decisions.

Are your current and would-be knowledge workers ready to contribute or participate in this scheme? Maybe for some; it depends for others. To capture those billions of hours of increased productivity, however, they will have to step up to it. But you can be pretty sure IBM will do it for you if you ask.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at http://technologywriter.com/ 

 

IBM Cloud Pak Rollouts Continue

November 14, 2019

IBM Cloud Paks have emerged as a key strategy by the company to grow not just its cloud, but more importantly, its hybrid cloud business. For the past year or so, IBM shifted its emphasis from clouds to hybrid clouds. No doubt this is driven by its realization that its enterprise clients are adopting multiple clouds, necessitating the hybrid cloud.

The company is counting on success in hybrid clouds. For years IBM has scrambled to claw out a place for itself among the top cloud players, but in the time DancingDinosaur has tracked IBM’s cloud presence it has never risen higher than third. In 2019, the top cloud providers are AWS, Microsoft, Google, IBM, Oracle, and Alibaba, with IBM slipping to fourth in one analyst’s ranking.

Hybrid clouds, over time, can change the dynamics of the market. They have not, however, changed things much, according to a ranking from Datamation. “There are too many variables to strictly rank hybrid cloud providers,” notes Datamation. With that said, Datamation still ranked them, starting with Amazon’s Amazon Web Services (AWS), which remains the unquestioned leader of the business with twice the market share of its next leading competitor, Microsoft/Azure, followed by IBM. The company is counting on its Red Hat acquisition, which includes OpenShift along with Enterprise Linux, to alter its market standing. 

The hybrid cloud segment certainly encompasses a wider range of customer needs, so there are ways IBM can put Red Hat to work to give it some advantages in pricing and packaging, which it has already signaled it can and will do, starting with OpenShift. DancingDinosaur doubts it will overtake AWS outright, but as noted above, hybrid clouds are a different beast. So don’t rule out IBM in the hybrid cloud market.

Another thing that may give IBM an edge in hybrid clouds among its enterprise customers is its Cloud Paks. As IBM describes them, Cloud Paks are enterprise-ready, containerized software that gives organizations an open, faster, and more secure way to move core business applications to any cloud. Each IBM Cloud Pak runs on Red Hat OpenShift, IBM Cloud, and Red Hat Enterprise Linux. 

Each pak includes containerized IBM middleware and common software services for development and management. Also included is a common integration layer designed to reduce development time by up to 84 percent and operational expenses by up to 75 percent, according to IBM.

Cloud Paks, IBM continues, enable you to easily deploy modern enterprise software either on-premises, in the cloud, or with pre-integrated systems, and to quickly bring workloads to production by seamlessly leveraging Kubernetes as the container management framework, supporting production-level qualities of service and end-to-end lifecycle management. This gives organizations an open, faster, more secure way to move core business applications to any cloud.

When IBM introduced Cloud Paks a few weeks ago, it planned a suite of five:

  • Application
  • Data
  • Integration
  • Automation
  • Multicloud management

Don’t be surprised as hybrid cloud usage evolves if even more Cloud Paks eventually appear. It becomes an opportunity for IBM to bundle together more of its existing tools and products and send them to the cloud too.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at http://technologywriter.com/ 

