Celonis-IBM SI Deal

April 5, 2021

Is process mining new?  Celonis, like every good marketer on the make, suggests it is. Process mining, the company explains, uses software to identify how work moves through an organization and to suggest more efficient ways of getting the same work done.

Process mining image credits: mbortolino / Getty Images

On April 1 Celonis continued: “Before you can improve a workflow, you have to understand how work advances through a business, which is more complex than one might imagine inside a large enterprise.” That’s where Celonis comes in. It uses software to identify how work moves through an organization and suggests more efficient ways of getting the same work done, an approach it calls process mining.
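At its core, the technique rebuilds a model of how work actually flows from an event log of activities. A minimal, hypothetical sketch of that idea in plain Python (illustrative only, not Celonis code) counts how often one activity directly follows another across process instances:

```python
from collections import Counter

def directly_follows(event_log):
    """Count how often each activity directly follows another,
    across all cases (process instances) in the event log."""
    counts = Counter()
    for case in event_log:
        for a, b in zip(case, case[1:]):
            counts[(a, b)] += 1
    return counts

# Each case is the ordered list of activities one invoice went through.
log = [
    ["receive", "check", "approve", "pay"],
    ["receive", "check", "reject", "check", "approve", "pay"],
    ["receive", "check", "approve", "pay"],
]

dfg = directly_follows(log)
# A ("check", "reject") edge appearing at all reveals a rework loop worth fixing.
print(dfg[("receive", "check")])  # prints 3
```

Real process-mining products layer discovery, conformance checking, and recommendations on top, but the directly-follows counts above are the usual starting point.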

That day the company announced a significant partnership with IBM, by which IBM Global Services will train 10,000 consultants worldwide on Celonis. The deal gives Celonis, a company with around 1,200 employees, access to the massive selling and consulting power of IBM, while IBM gets a deep understanding of a piece of technology that is at the front end of what it describes as the workflow automation trend.

Miguel Milano, the chief revenue officer at Celonis, explains that digitizing processes has been a trend for several years; it has sped up due to COVID, and it’s partly why the two companies have decided to work together. “Intelligent workflows, or more broadly spoken, workflows built to help companies execute better, are at the heart of this partnership, and it’s at the heart of this trend now in the market,” he continues.

IBM’s view looks a little different. IBM now owns Red Hat, which it acquired in a deal announced in 2018 for $34 billion. The two companies believe that by combining the Celonis technology, which is cloud based, with Red Hat, which can span the hybrid world of on-premises and cloud, they can together provide a much more powerful solution to follow work wherever it happens.

“I do think that moving the [Celonis] software into the Red Hat OpenShift environment is powerful because it does allow what’s already a very open solution to now operate across this hybrid cloud world, leveraging the power of OpenShift, which can straddle the worlds of mainframe, private cloud and public cloud,” writes Ron Miller, technology journalist at TechCrunch. “The data straddle those worlds and will continue to straddle those worlds,” adds Mark Foster, senior vice president at IBM Services.

Most importantly, it offers another way to leverage IBM’s stunning investment in Red Hat by creating another opportunity to use OpenShift, which is shaping up as the crown jewel of the Red Hat acquisition.

A lingering question arises: Why didn’t IBM, a multi-billion dollar company, just buy Celonis outright? It probably could have acquired it for what would amount, for IBM, to petty cash.

Or maybe Celonis was not willing to jump for what it considered small money.

The other part of this is that IBM believes that by combining the Celonis technology, which is cloud based, with Red Hat, which can span multiple hybrid worlds including on-premises and multiple clouds, the two together can provide a much more powerful solution to follow work wherever and however it happens.

Anyway, the companies report they had already been working together for some time prior to this formal announcement, and this partnership is the culmination of that. As this firmer commitment to one another goes into effect, the two companies will be working more closely to train thousands of IBM consultants on the technology, while moving the Celonis solution into Red Hat OpenShift in the coming months.

It’s clearly a big deal with the feel of an acquisition, but Milano says that this is about executing his company’s strategy to work with more systems integrators (SIs), and while IBM is a significant partner, it’s not the only one. Oh yeah? With IBM Global Services set to train 10,000 consultants worldwide on Celonis, what SI is going to be bigger?

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog. See more of his work at technologywriter.com.

Qiskit Metal for Quantum Development

March 30, 2021

On Feb. 4, 2021, IBM unveiled its Quantum Development Roadmap, which showcases the company’s integrated vision and timeline for full-stack quantum development, including hardware, software, and applications.

From IBM’s “Roadmap For Scaling Quantum Technology” (Sept. 15, 2020): 2020: 65-qubit IBM Quantum Hummingbird processor; 2021: 127-qubit IBM Quantum Eagle processor; 2022: 433-qubit IBM Quantum Osprey processor.

Last September (2020), IBM shared its roadmap to scale quantum technology, with a clear vision for how to get to its declared inflection point of 1,000+ qubits by 2023 – and quantum systems powerful enough to explore solutions to challenges impossible on classical machines alone. The development roadmap gives millions of professional developers more reason and opportunity to explore quantum computing within their industry and expertise – without the need to learn new tools or languages.

Earlier this month, IBM introduced Qiskit Metal, a quantum computing SDK that promises to be accessible to almost anyone (not me, I briefly tried). As IBM explains it: Qiskit Metal enables chip prototyping in a matter of minutes. 

Just start from a convenient Python Jupyter notebook and take advantage of its user-friendly graphical user interface (GUI). Choose from a library of predefined quantum components, such as transmon qubits and coplanar resonators, and customize their parameters in real time to fit your needs. Use the built-in algorithms to automatically connect components. You can even easily implement new experimental components using Python templates and examples. (As noted above, it’s not as easy as IBM implies.)
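The library-of-components workflow IBM describes can be sketched conceptually. The toy Python classes below are invented purely for illustration (the names `Design`, `Component`, and the parameters are assumptions, not the actual qiskit-metal API):

```python
class Component:
    """Toy parameterized design component, loosely mimicking the
    library-of-components idea. NOT the real qiskit-metal API."""
    def __init__(self, name, **params):
        self.name = name
        self.params = dict(params)

    def update(self, **params):
        # Tweak parameters "in real time"; a real tool would re-render here.
        self.params.update(params)


class Design:
    """Collects components and the connections routed between them."""
    def __init__(self):
        self.components = {}
        self.connections = []

    def add(self, comp):
        self.components[comp.name] = comp
        return comp

    def connect(self, a, b):
        # Stand-in for Metal's built-in auto-connect/routing algorithms.
        self.connections.append((a, b))


design = Design()
q1 = design.add(Component("Q1", pad_width="425um", pocket_height="650um"))
design.add(Component("cpw1", total_length="9mm"))  # a coplanar resonator
design.connect("Q1", "cpw1")
q1.update(pad_width="450um")  # customize a qubit parameter to fit your needs
print(len(design.components), len(design.connections))  # prints: 2 1
```

The real tool goes further, rendering components to chip geometry and handing the design off to electromagnetic analysis; the sketch only captures the add-components-then-connect workflow IBM describes.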

Metal, IBM continues, starts you off with modeling your intended quantum system. Metal helps by automating the quantum electrodynamics modeling of quantum devices to predict their performance and parameters, such as qubit frequencies, harmonics, couplings, and dissipation. Metal’s vision is to provide the abstraction layer needed to seamlessly interconnect with your favorite electromagnetic analysis tool (HFSS, Sonnet, CST, AWR, Comsol, …), dynamically rendering and co-simulating your design, at the whim of a click.

Behind Qiskit Metal is IBM’s vision for quantum development. In short: designing quantum devices is the bedrock of the quantum ecosystem, but it is a difficult, multi-step process that connects traditionally disparate worlds. That’s where Metal comes in. Metal promises to automate and streamline this otherwise complex process. IBM envisions developing a community-driven universal platform capable of orchestrating quantum chip development from concept to fabrication in a simple and open framework.

Specifically, the company wants to accelerate and lower the barrier to innovation in quantum devices. At a recent gathering, quantum physicist Zlatko Minev and other IBM Quantum team members introduced a suite of hardware design automation tools that can be used to devise and analyze superconducting devices, with the goal of integrating the best tools into a quantum hardware designer’s workflow. This is what was just introduced as Qiskit Metal, a tool for quantum hardware development.

IBM hopes the community can bridge the gap between the pieces of superconducting metal on a quantum chip and the computational mathematics of Hamiltonians and Hilbert spaces, making both accessible to anyone with a curious mind and a laptop. The goal ultimately is to make quantum device design a streamlined process that automates the laborious hardware tasks, as conventional electronic design automation does for conventional devices.

To achieve that, IBM designed the software with built-in best practices and cutting-edge quantum analysis techniques, all while leveraging Python and the power of conventional EDA tools. In short: the goal of Qiskit Metal is to allow for easy quantum hardware modeling, with a reduction in design-related errors plus increased speed.

With luck it should get better. By 2023, IBM’s development roadmap fills in the gap to take developers from operating at the kernel level to working with application modules, laying the foundation for quantum model services and frictionless workflows. This includes opening up the variety of quantum circuits to include dynamic circuits, bringing real-time classical computing to quantum circuits to improve accuracy and potentially reduce the required resources.

Developers exploring quantum computing today will be able to do more, faster, as IBM implements technologies designed on OpenShift to work alongside quantum computers. As a result, more developers from different industries will have more reasons and opportunities to explore quantum computing within their workflows, without any need to learn new tools or languages, except maybe Qiskit Metal. IBM is counting on you to be more development-capable than I am.

Linux on Z, LinuxOne, OpenShift

March 22, 2021

Dancing Dinosaur has been writing about Linux on Z for a long time: 20 years on Linux on Z, 5 years on LinuxONE, and 1 year of OpenShift on Z. The last step, OpenShift on Z, cost IBM a mere $34 billion. 

4 LinuxONE Frames

Dancing Dinosaur was thrilled 20 years ago when IBM announced Linux on Z. Little did I realize that it would take quite a few years of tweaking before they got it good enough for non-Linux Z geeks to naturally use it.

With more enthusiasm than was maybe merited at the time, I hoped it could be a PC killer. OK, it’s been 20 years, that still hasn’t happened, and it won’t. IBM, however, hasn’t given up railing against x86 at every chance it gets. Not even IBM’s increasing success with hybrid cloud has managed to distract it.

So why did I welcome Linux on Z then? I thought it would open more software opportunities, offer a wider choice of software options, much as the open software concept in general was promising across industries everywhere, and lower prices on the Z. It didn’t turn out that way, at least not in the immediate aftermath of the initial announcement. Maybe now, 20+ years later, things finally can open up for real.

Supposedly LinuxONE and IBM Z anticipate and respond to enterprise digital transformation demands by partnering with independent software vendors (ISVs) who offer pre-built applications that can accelerate a client’s transformation journey. Ideally that is what should happen, writes Suchitra Joshi, Manager of Go-To-Market for the Linux on Z and LinuxONE ISV Ecosystem at IBM. But is that actually happening, or are the ISVs themselves captive to IBM?

ISV ecosystems—in tandem with a Z shop’s own developers—theoretically can maximize a client’s resources while minimizing disruption during the migration to, say, a hybrid cloud, which is IBM’s latest hot focus. ISVs’ well-supported, industry-standard solutions running on LinuxONE and IBM Z should be able to fast-track a client’s ability to run mission-critical applications in the hybrid cloud, taking each enterprise to next-level data privacy, security, resiliency, and scalability. Have you experienced anything like this?

But IBM is not stopping there. It reports working to expand its Linux on Z ecosystem even further. The company insists it works one-on-one with ISVs on their development strategy and helps them port and test their applications on Linux on Z. IBM also promises to facilitate the containerization of ISV applications that can run on Red Hat OpenShift. 

Similarly, IBM announced the launch of Red Hat Marketplace, an open cloud marketplace for enterprise customers to discover, try, purchase, deploy, and manage certified container-based software across public cloud, private cloud, and on-premises environments – essentially hybrid clouds. In the same way, IBM claims to encourage ISVs to utilize IBM Cloud Hyper Protect Services and port their applications directly to the IBM Cloud. Have you tried it yet?

As the world becomes increasingly digitized, privacy laws such as GDPR are mandating that data remain totally secure, whether at rest or in transit. Meanwhile, new open (or public) APIs are placing increasing demands on core systems in financial services and other industries—requiring IT systems to operate even faster to maintain the response times customers expect, IBM explains.

To manage these challenges, IBM reports, clients are relying on the encryption, security, and reliability strengths of IBM Z, which are formidable. That’s why IBM is excited about the future of the IBM Z/LinuxONE/ISV/OpenShift ecosystem. Through its deployment on the hybrid cloud and the differentiated value it delivers to new and existing clients, security for now should remain solid.

Taking Down the Mainframe

March 15, 2021

DancingDinosaur loves headlines like this: Will the cloud take down the mainframe? This could be a very short answer: No. Or maybe a slightly longer answer: highly unlikely. Or a snide, snarky answer like: not in your lifetime, pal. The author, Tom Taulli, is a well-recognized and highly credible technology writer. So, let’s take this seriously and see where it goes.

Taulli starts right: “If you use a bank account, the healthcare system, various government services and insurance, then there’s a pretty good chance that the transactions were processed through mainframes. The fact is that much of the Global 2000 companies use this technology.” Nothing to quibble about there.

Dancing Dinosaur would take it even further: if you do anything through a smartphone beyond making a basic phone call, by the time you get an acknowledgement of what you just did, there is most likely a mainframe involved.

Taulli then finishes his case: “If you have huge amounts of data that can’t be let offsite for regulatory reasons, you probably need something that looks like a mainframe.” Mike Loukides, VP of Emerging Tech Content at O’Reilly Media embellishes the idea. “Terabytes are easy now, but if you’re storing high resolution medical imagery, you’re talking petabytes fairly quickly.”
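Loukides’ point is just arithmetic. With illustrative assumptions, invented here for the sketch, of roughly 500 MB per high-resolution imaging study and two million studies retained:

```python
# Back-of-envelope storage estimate; both figures are illustrative assumptions.
mb_per_study = 500          # ~500 MB per high-resolution imaging study
studies = 2_000_000         # studies retained on-site
total_tb = mb_per_study * studies / 1_000_000   # MB -> TB (decimal units)
print(total_tb)  # prints 1000.0, i.e. one petabyte
```

Real figures vary widely by modality and retention policy, but the petabyte threshold arrives quickly either way.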

The IBM Z platform continues to experience storage growth. It also is seeing innovations such as cloud-native development and strong improvements in processing power. 

IBM picks up the story at that point: “The IBM Z business isn’t going anywhere,” said Ross Mauri, general manager of IBM Z. Huh, does this mean what I think? What he more likely meant, an IBM spokeswoman replied: “IBM Z is alive and well. MIPS growth is substantial. Customers are leaning heavily on Z during Covid.” So, she concluded, “IBM Z definitely is NOT being discontinued.” To the contrary, she added, IBM already is working on zNext and beyond.

In fact, since announcing the IBM z15 in September 2019, the company reports, 75% of the top 20 global banks have adopted the platform, Mauri continued. He also rattled off growth being driven by Linux and Red Hat OpenShift on IBM Z: installed Linux MIPS increased 55% from 2Q2019 to 2Q2020, while the platform added more than 100 clients ready to start with Red Hat OpenShift.

Finally, while the general business press was bemoaning the decline in business revenue due to the pandemic, even implying a catastrophic downturn, Mauri suggests the Z is weathering the storm:  “COVID-19 has unearthed a renewed focus on IBM Z to help keep the world’s financial trading, retail transactions, insurance claims processing, healthcare IT, and more afloat.”

IBM Z clients activated nearly 4x more general-purpose capacity-on-demand in 2Q2020 compared to 2Q2019. 2019 was the last strong year for any IBM platform and was especially good for the Z, the only platform reporting positive revenue gains that year.

As the IBM 2020 annual statement says: For the year, we generated $73.6 billion of revenue, a decline of 4 percent excluding the impact of currency and divestitures. Much of this reflects the broader uncertainty of the macro environment, which also affects our clients.

Still, it reports cloud-related revenue grew 20 percent to $25.1 billion, excluding the impact of currency and divestitures, and now represents over one-third of IBM’s total revenue. Red Hat was a key driver, with normalized revenue growth of 18 percent in 2020 and a backlog topping $5 billion for the first time at year end. Red Hat, together with IBM’s modernized Cloud Pak solutions, delivered overall software revenue growth. Global Business Services cloud revenue grew at a double-digit rate as it focused on modernizing clients’ applications and reimagining their workflows with AI.

Finally, even with a very successful new product introduction in the second half of 2019, IBM Z revenue still grew in 2020, with the z15 now shipping the largest capacity in the platform’s history. Obviously the Z avoided a case of COVID-19. I only wish the 500,000+ others who died had too.

Power Systems/Red Hat Hybrid Cloud

March 4, 2021

In February, IBM announced the availability of new Red Hat software on IBM Power Systems and new IBM Power Systems hardware.

These announcements further expand IBM’s commitment to help clients modernize by empowering them with the latest technology from Red Hat to develop cloud-native applications and deploy them into hybrid cloud environments. The key, no surprise here, is hybrid cloud.

It starts with expanded Red Hat capabilities on IBM Power Systems, featuring Red Hat OpenShift on IBM Power Virtual Server, leveraging OpenShift’s bare-metal installer, Red Hat Runtimes, and newly certified Red Hat Ansible Content Collections.

Then comes a new IBM Power Private Cloud Rack Solution. This entails delivering to customers, as IBM puts it, an optimized, production-level OpenShift platform to modernize traditional environments with cloud-native applications. The IBM Power Private Cloud Rack combines on-premises hardware, a complete software stack of IBM and Red Hat technology, and installation from IBM Systems Lab Services to deliver 49% lower cost per request as compared to similarly equipped x86-based platforms. Notice that IBM continues to consider x86 as the competition, maybe even the primary competition, as opposed to other alternatives, such as various cloud stacks, of which there are many.

Finally, IBM is offering what it dubs Extended Dynamic Capacity. This amounts to enhancements to IBM Power Systems’ dynamic capacity to quickly scale compute capacity across the hybrid cloud on Linux, IBM i, and AIX. That is a broad set of familiar options for IBM’s primary customer base. Between those options and various cloud stacks there should be no need for x86 at all.

One new element that stands out is the IBM Power Private Cloud Rack. This combines on-premises hardware, a complete software stack of IBM and Red Hat technology, and installation from IBM Systems Lab Services, which, as noted above, is delivered at a competitive price.

“Twelve months ago, IT practitioners faced a vastly different landscape before the world was transformed by the global COVID-19 pandemic,” said Stephen Leonard, general manager of IBM Cognitive Systems. Still, a hybrid cloud approach can offer 2.5x the value derived from a single public cloud, based on an assessment from IBM’s Institute for Business Value. IBM Power Systems, along with the combined IBM and Red Hat portfolio, plays a critical role in this transition to hybrid environments.

IBM recognizes that Red Hat OpenShift with IBM Power Virtual Server can play a part in helping organizations build an agile hybrid cloud by leveraging OpenShift’s bare-metal installer. The IBM Power Virtual Server also can act as an enterprise Infrastructure-as-a-Service offering built around IBM POWER9 while offering access to over 200 IBM Cloud services. In addition, IBM Power Virtual Server clients will now be able to run leading business applications like SAP HANA in an IBM POWER9-based cloud.

To further help organizations and developers create cloud-native applications, Red Hat Runtimes are now supported on IBM Power Systems. Red Hat Runtimes is a set of products, tools and components designed to develop and maintain cloud-native applications. Thus, developers looking to create cloud-native applications on IBM Power Systems have access to leading open source frameworks and runtimes that provide a single development experience for hybrid applications spanning IBM Power Systems and other platforms.

Finally, according to IBM,  the Red Hat Ansible Automation Platform, made available on IBM Power Systems last year, provides an open source platform for simpler automation of common IT tasks, freeing up IT administrator time as well as compute resources to focus on other tasks. In short, IBM has created an extensive set of Ansible modules for the IBM Power Systems user community. 

Since the start of the new year, IBM reports, Power Systems has added 22 new Ansible modules that bring new automation capabilities for common tasks like patch management, security management, OS and application deployment, continuous delivery, centralized backup and recovery, and virtualization management and provisioning. Currently, there are 102 Ansible modules supporting POWER available to the open source community on GitHub, downloaded more than 13,000 times since February. Many of these same modules are available as production-ready, enterprise-hardened, and certified Ansible Collections via the Red Hat Ansible Automation Platform.
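A playbook invoking one of these Power collections might look like the following sketch. The collection name `ibm.power_aix` matches IBM’s published AIX collection, but treat the module and parameters shown here as illustrative assumptions rather than a verified example:

```yaml
---
# Hypothetical sketch: preview AIX patch updates on Power LPARs via Ansible.
# Module/parameter names are assumptions for illustration only.
- name: Patch management on IBM Power (AIX)
  hosts: aix_lpars
  collections:
    - ibm.power_aix
  tasks:
    - name: Preview the latest technology-level updates with SUMA
      suma:
        action: preview
        oslevel: Latest
```

The certified versions of such collections are the ones distributed through the Red Hat Ansible Automation Platform.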

Quantum Networks Proliferate

February 25, 2021

bp, the energy company that previously went by the name British Petroleum, among others, appears to have settled on bp (lowercase) for now. IBM announced a week ago that bp has joined the IBM Quantum Network to advance the use of quantum computing in the energy industry.

By joining the IBM Quantum Network, bp gains access to IBM’s quantum expertise and software, as well as cloud-based access to IBM’s most advanced quantum computers. This includes access to a 65-qubit quantum computer, the largest universal quantum system currently available to industry, IBM boasts. The company considers it a milestone on the way to the 1,000-plus qubit system promised by the end of 2023.

bp will tap these machines to solve actual business and engineering challenges it currently faces around immediate goals such as driving efficiencies and reducing carbon emissions. 

IBM, of course, isn’t the only vendor offering a quantum stack; the quantum stack is proliferating. Microsoft, through Azure, is building a network of quantum partners. Oracle explains its approach to quantum computing and uses Python to program it. Google identifies 18 vendors and partnerships engaged in quantum computing, including IBM and Oracle.

bp’s goals, however, are significantly long term and quite ambitious. It aims to become a net zero company by 2050 or sooner as part of the global drive to help the world get to net zero. 

Quantum computing has the potential to be applied in numerous areas, such as chemistry, finance, logistics, pharmaceuticals, engineering, materials, machine learning, and more.

In 2020, bp announced its net zero objective along with its new strategy. By the end of this decade, it aims to have developed around 50 gigawatts of net renewable-generating capacity (a 20-fold increase). At the same time it aims to have increased its annual low carbon investment 10-fold to around $5 billion and cut its oil and gas production by 40%.

Joining the IBM Quantum Network should further speed bp’s ability to leverage quantum advances and applications as they emerge and then apply those breakthroughs toward achieving its net zero energy objectives.

More specifically, quantum computing can be immediately applied to modelling the chemistry of various types of clay in hydrocarbon wells – a crucial factor in efficient hydrocarbon production. It also may contribute to analyzing and managing the fluid dynamics of wind farms; optimizing autonomous robotic facility inspection; and helping to create opportunities not yet imagined to deliver the clean energy the world wants and needs.

“bp joins a rapidly growing number of companies, more than 130 to date, ready to explore quantum computing to help accelerate the discovery of solutions to some of today’s biggest challenges,” said Dario Gil, Senior Vice President and Director of IBM Research.

In addition, bp becomes part of a global community of Fortune 500 companies, start-ups, academic institutions, and research labs working to advance quantum computing and explore practical applications through the IBM Quantum Network. Together they are researching and exploring how quantum computing will help a variety of industries and disciplines.

A Year of Pandemic

February 19, 2021

It’s only been a year!! It feels like much, much longer. The pandemic brought my long-awaited retirement to a crashing halt when most of the things I wanted to do closed down due to sudden distancing requirements, and a few are only recently being eased.

Courtesy of Council on Foreign Relations

So, how has your year of pandemic been going? Let me guess: more Zoom than you ever expected. A tool I never used before and never expected to use, I mastered almost immediately. I now can share screens, text, graphics, videos – whatever you want. I hope I never need these skills again, but you never know. Guess I should just add it to my resume, which, since retiring, I never expected to use again.

IBM already is turning it into a supercomputing opportunity. Since the start of the COVID-19 pandemic, IBM has been working closely with governments in the U.S. and worldwide to find all available options to put its technology and expertise to work.

The most recent announcement I heard from President Biden was that maybe by Christmas life would start getting back to normal.  I’m just going to have to practice being nice in case it gets even further delayed. If it ends up in Feb, I hope they will, at least, hold off until April or May, when it actually starts warming up, if we’re lucky.

So, what did I do with all this free time now that the movie theaters, restaurants, pool halls, bowling alleys and bars were closed? Eighteen Dancing Dinosaur pieces take me back to Oct. There are five more years of pieces one could scroll through but even Google complains if I want to go back much further.

A couple of my local ski areas, at least, actually have snow and came up with their own asinine versions of social distancing rules. First they cap capacity at 20%.  Then you have to reserve parking spots and pay for tickets in advance. Then switch into your ski boots in your car.  They will have rest rooms available but don’t expect to use the base lodge for anything beyond that. Heck, I could just as well ski a bit off the trail and do my business in the woods (That’s an advantage men have that my wife vehemently complains about.).

OK, forget skiing. I also like to hike.  But the hiking trails have gotten unbelievably crowded. Suddenly you can’t even find a parking spot.

So what’s left? Well, there is still streaming stuff. I can watch what I want, when I want … as long as the telcos can deliver sufficient bandwidth. As a technology writer this is something I’m actually familiar with, but it appears my telco was not prepared for a pandemic, which forced almost everybody to wait to stream content.

But the good news! Our governor announced that Baby Boomers, those of us 65 to 75 can now register to get a vaccine. In this state that’s over 1 million people.  The state’s widely lambasted vaccination website crashed within minutes. Don’t blame me. I wasn’t going to get up that early to even try. Hey, I’m retired, remember. Stay well.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer, formerly briefly retired. Follow DancingDinosaur on Twitter, @mainframeblog. See more of his work at technologywriter.com.

IBM Quantum Roadmap 2021

February 11, 2021

The first IBM roadmap for quantum that DancingDinosaur recalls seeing was dated Sept. 15, 2020. Written by Jay Gambetta, it started with a grandiose reference to the famous Moon landing: Building a device that truly captures the behavior of atoms—and can harness these behaviors to solve some of the most challenging problems of our time—might seem impossible if you limit your thinking to the computational world you know.

But like the Moon landing, Gambetta continued, we have an ultimate objective to access a realm beyond what’s possible on classical (conventional) computers: we want to build a large-scale quantum computer. The future’s quantum computer will pick up the slack where classical computers falter, controlling the behavior of atoms in order to run revolutionary applications across industries, generating world-changing materials, or transforming the way we do business.

IBM Quantum Computing Roadmap

This month, on Feb. 4, 2021, IBM published a more recent roadmap. Last September, IBM shared an earlier roadmap to scale quantum technology to the inflection point of 1,000+ qubits by 2023, making quantum systems powerful enough to explore solutions to challenges impossible on classical machines alone. This roadmap gives millions of professional developers more reasons and opportunities to explore quantum computing within their industry and expertise – without the need to learn new tools or languages!

After first hearing about quantum mechanics, the forerunner of quantum computing, in a high school science class half a century ago, DancingDinosaur has to admit he’s impatient. The insurance industry actuaries apparently are optimistic enough to sell me life insurance, so I should be around to see a 1,000+ qubit quantum machine in 2023.

Before then, the key features IBM highlights on the latest roadmap are Qiskit Runtime in 2021, which will enable a 100X speed-up in program execution, and the capability to run dynamic circuits in 2022, which will allow greater circuit variety as it expands the algorithmic complexity of IBM quantum systems.

IBM’s quantum systems will continue to progress toward that 100X speed-up by the end of 2021, and the company will release examples of this progress over the course of the year. For example, accurately simulating lithium hydride (LiH) can take up to 100 days today; once the 100X speed-up is achieved, it will be possible to simulate LiH in one day. Let’s just hope they don’t catch fire.

IBM continues: after 2023, IBM’s development roadmap fills in the gap to take developers from operating at the kernel level to working with application modules, laying the foundation for quantum services and frictionless workflows. This includes opening up the variety of quantum circuits to include dynamic circuits and bringing real-time classical-like computing to quantum circuits to improve accuracy and potentially reduce required resources.

Developers exploring quantum computing today will be able to do more, faster as IBM implements technologies enabling OpenShift to work alongside quantum computers. And more developers from different industries will have more reasons and opportunities to explore quantum computing within their workflows – no need to learn new tools or languages. That’s why DancingDinosaur wants to hang around.

Note that OpenShift doesn’t work with quantum computers now, but apparently IBM intends to get it there, so that developers from different industries can explore quantum computing within their workflows without needing to learn new tools or languages.

IBM takes quantum even further by envisioning a time when developers won’t need to learn a new programming language or run code separately on a new device. Instead, quantum resources would be integrated into a typical computing workflow, just like a graphics card or any other external computing component. When this is possible, DancingDinosaur likely won’t still be alive, no matter how optimistic the actuaries are; but if true, doesn’t this sound like heaven?


Zowe in 2021

February 8, 2021

It is now 2021, and the mainframe continues to be frustratingly difficult to operate, especially for the new talent the industry desperately needs to attract and put to work … fast. Those aging mainframe pros the industry has long worried would be retiring are actually retiring. Surprise, surprise.

DancingDinosaur first added its two cents to the discussion in 2018 here, with the goal of encouraging growth of the Zowe development environment. Well, it has taken longer than expected, but the Open Mainframe Project is announcing updates for 2021.

As powerful as it is, the mainframe still remains difficult, particularly for newcomers who have grown up with easier technologies. Even Linux on Z is not as simple or straightforward as Linux on other platforms.

According to the Open Mainframe Project, Zowe forms an integrated and extensible open source platform for z/OS with a coherent, curated set of user and programmatic interfaces that satisfy a variety of objectives.

Specifically, as the Open Mainframe Project explains, Zowe incorporates a set of REST APIs that enable mainframe access across distributed platforms. Zowe is also extensible, enabling vendor services to integrate with the platform. Helping it along, the Zowe API Mediation Layer provides a single gateway through which all services are accessed, with a catalog and a discovery service that automatically finds new services.
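As a concrete sketch of what that single-gateway routing looks like, the snippet below builds a gateway URL for a data-set listing call against the z/OSMF files service. The hostname, port, and exact route shape here are illustrative assumptions, not values taken from the Zowe documentation; check your own API Mediation Layer configuration for the real ones.

```python
from urllib.parse import urlencode

def gateway_url(host: str, port: int, service: str, version: str,
                resource: str, **params) -> str:
    """Build a URL for a service behind the Zowe API Mediation Layer.

    The gateway exposes every registered service through one entry
    point; the /<service>/api/<version>/<resource> route shape used
    here is an illustrative assumption.
    """
    url = f"https://{host}:{port}/{service}/api/{version}/{resource}"
    if params:
        url += "?" + urlencode(params)  # encode query parameters
    return url

# Hypothetical example: list data sets under HLQ.* via the z/OSMF
# files service registered with the gateway.
url = gateway_url("gateway.example.com", 7554, "zosmf", "v1",
                  "restfiles/ds", dslevel="HLQ.*")
print(url)
```

Because every service sits behind the one gateway, a client only ever needs one host, one port, and one TLS setup, which is the point of the mediation layer.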

In the first half of 2021, two overarching initiatives will be introduced for development across the full spectrum of the Zowe server-side components. Both initiatives correspond to the varied approaches that are necessary for success in a heterogeneous environment.

  • One initiative addresses the different ways of setting up Zowe with high availability by allowing key Zowe components to be replicated across different operating environments, so these components seamlessly work together as a single unit from the user’s point of view. 
  • The second focuses on different ways to containerize Zowe to improve the flexibility of deploying Zowe in heterogeneous environments.

Parts of Zowe run on the z platform, while other parts run externally in either a private or public cloud. The challenge becomes ensuring all communication is safe and efficient. This requires high availability and containerization for horizontal scaling of specific Zowe components based on load expectations. This also makes it possible to treat the environment as code, with all information necessary to run Zowe stored in version control systems.

When it comes to processing, understanding, and analyzing data, there are a variety of Zowe client technologies users can leverage, such as the Zowe Application Framework, a web UI that provides a virtual desktop containing apps to interact with the mainframe. Available out of the box, these apps include a 3270 terminal and a VT terminal, as well as apps for interacting with JES, MVS Data Sets, and Unix System Services.

Other tools include the Zowe Command Line Interface (CLI), which enables interaction with the mainframe from a distributed environment. It acts as both a user interface and a programmatic interface, enabling quick exploration of z/OS functions in real time as well as simple abstraction of basic commands into more complex and useful automation. The CLI acts as a bridge between the mainframe and the many distributed tools available, such as Jenkins.
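To illustrate that bridge role, here is a minimal sketch of driving the CLI from a Python script of the kind a Jenkins job might run. The `zowe zos-files list data-set` command shape and the `--response-format-json` flag are assumptions based on the CLI's documented style; verify them with `zowe --help` on a real installation.

```python
import json
import shutil
import subprocess

def zowe_list_argv(pattern: str) -> list:
    """Build the argv for a hypothetical data-set listing call.

    Command shape assumed from the Zowe CLI docs; confirm against
    your installed CLI before relying on it.
    """
    return ["zowe", "zos-files", "list", "data-set", pattern,
            "--response-format-json"]

def list_data_sets(pattern: str):
    """Run the CLI and parse its JSON output.

    Returns None when no `zowe` executable is on the PATH, as will
    be the case when running this sketch without the CLI installed.
    """
    if shutil.which("zowe") is None:
        return None
    out = subprocess.run(zowe_list_argv(pattern),
                         capture_output=True, text=True, check=True)
    return json.loads(out.stdout)

print(list_data_sets("HLQ.PROD.*"))
```

In a pipeline, the parsed JSON could gate a deployment step, which is exactly the "abstraction of basic commands into automation" the CLI is meant to enable.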

Zowe Explorer is a Visual Studio Code extension that modernizes the way developers and system administrators interact with z/OS mainframes. Zowe Explorer lets you interact with data sets, USS files, and jobs that are stored on z/OS.

Both Zowe Explorer and Zowe CLI are built on the Zowe Node Client SDK. This SDK provides access to a set of programmatic APIs that you can use to build client applications or scripts that interact with z/OS. Client SDKs for Python and Swift are also under development. If you try any of the tools above, please let DancingDinosaur know.



BMC 2020 mainframe survey

January 28, 2021

In the 15th annual BMC mainframe survey, 90 percent of respondents report regarding the mainframe as a platform for new growth and long-term applications. Optimism about the future of the platform can also be seen in its expanding role, with 68 percent of respondents expecting their MIPS to grow. This, notes BMC, is the highest growth outlook in the 15-year history of the survey.

A look inside the z15

To find anywhere near as positive a platform growth outlook from IBM, you have to go back to the fourth quarter of 2019. There you will find revenues of $3.0 billion, up 16 percent, led by IBM Z, which was up 62 percent. The only other platform winner was Storage Systems, where revenue grew 3 percent. The platform section of the 2020 financials was terrible across the board, except maybe for Red Hat and the hybrid cloud, which we’ve been hearing about for months from IBM.

BMC says this about the results in its report: businesses today understand that the mainframe is a critical component of the modern digital enterprise and an emerging hub of DevOps innovation. At the same time, mainframe operations are changing rapidly.

As experienced mainframe professionals depart, BMC continues, a younger workforce is taking the reins, and the gender gap is closing. Automation and artificial intelligence have emerged as key tools to empower these new mainframe stewards to leverage new technologies for managing the platform. AI/ML also increases the speed and quality of application delivery, strengthens security, and improves business availability and resiliency. In this way, organizations are working to enable the autonomous digital enterprise for better agility, customer centricity, and security.

BMC’s report also notes that workloads are growing, while large organizations continue to host much of their data on the mainframe.

For example, as has been widely noted for several years, experienced mainframe professionals are finally departing as a younger workforce takes the reins, while the gender gap is closing. Automation and artificial intelligence have also finally emerged as practical tools to empower these new mainframe stewards to leverage emerging technologies for managing the platform.

AI/ML also increase the speed and quality of application delivery, strengthen security, and improve business availability and resiliency. In this way, organizations are working to enable the long-awaited autonomous digital enterprise for better agility, customer centricity, and security. In effect, BMC’s survey suggests that even without the retired mainframe veterans, a new generation of IT staff can effectively operate the mainframe data center.

This optimism about the future of the platform is also evident in the expanding role of the mainframe: 68 percent of respondents expect MIPS to grow, again the highest growth outlook in the 15-year history of this survey. Two-thirds of large shops currently have more than half of their data in mainframe environments, further demonstrating, according to BMC, the key role the platform plays in current and future digital business.

This year, the most frequently cited top priority for the mainframe was compliance and security, cited by 63 percent of respondents. That represents a clear shift from previous surveys, in which cost had consistently been the highest-ranked mainframe priority among participants. Ironically, for years mainframe security had frequently been cited as one of the key strengths of the platform. It may be still, but this data suggests that mainframe respondents have more security concerns on their minds than just what RACF addresses.

That, however, should not be taken to mean that cost is no longer a major concern. Sooner rather than later, almost everyone DancingDinosaur speaks with about the mainframe brings up cost. Over the years IBM has gotten quite creative in trying to make mainframes seem less costly or otherwise rationalize the cost, but the topic never goes away. (DancingDinosaur has written numerous reports justifying an organization’s mainframe investment.)


