Get Started With Quantum Computing

January 21, 2021

If you had free access to a quantum computer, would anyone in your organization know what to do with it tomorrow? Probably not.

Roche: Calculating the unimaginable

I first heard about this thing called quantum computing when I was taking high school physics. It might not even have been called quantum computing then. I don’t remember what the teacher called it, probably quantum mechanics. The only thing I clearly remember was the Schrödinger’s cat story: dead or alive? Turns out in the quantum world the cat could be both at the same time, which is probably the only reason I still remember it forty years later. DancingDinosaur probably wrote about the cat when quantum computing first came up here. It’s a good story for us nerdy types: check it out here.

It comes up again here because IBM appears to be putting a renewed push behind it, here. As IBM puts it, IBM Quantum leads the world in quantum computing, which aims to solve complex problems that the world’s most powerful supercomputers cannot solve, and never will. (I’m not so sure about that; don’t dismiss future supercomputers so fast.) And to back that up, IBM promises the widest range of quantum computers for use in the cloud. These are not machines you are going to want to put in your data center. Just keeping them stable at the near-absolute-zero temperatures they require would blow your data center budget.

According to IBM, its full quantum stack allows its partners to fully explore their next solutions with a sufficient level of fidelity and scale. To play with these new machines you will need to join the IBM Quantum Network, which consists of more than 100 Fortune 500 companies, academic institutions, national labs, and startups. There you could gain access to the IBM quantum stack, letting you tackle problems across finance, materials, logistics, and chemistry in ways you never imagined before.

So what can you do with quantum? Try these for starters, suggests IBM:

  • Daimler is exploring how quantum computing can advance the development of new materials for batteries, improve automotive manufacturing techniques and enhance product experience.
  • ExxonMobil is harnessing quantum computing to develop more accurate chemistry simulation techniques in energy technologies and solutions.
  • Mitsubishi Chemical is applying quantum computing to help develop lithium-oxygen batteries with greater energy density.

The company also promises a range of goodies for everyone who might want to get involved. For instance, companies can partner with IBM Quantum to find commercial opportunities as well as learn how organizations are working with IBM now to solve today’s most challenging problems.

In terms of systems, it offers twenty-eight deployed quantum computers—the largest and most powerful set of commercial quantum devices. As part of its roadmap it plans to scale systems to 1,000 qubits and beyond. Today 100 qubits is big.

For developers, IBM lets programmers code quantum algorithms in Python and integrate quantum into their workflows by tapping IBM’s high-level libraries. That might actually help somebody like me, but don’t bet on it.

Then there is Qiskit, IBM’s quantum programming language. Qiskit is an open-source software development kit for quantum computers, used by people hoping to get a taste of quantum computing and its potential benefits. According to IBM, Qiskit has become the largest quantum computing community in the world, with over 15,000 scientists and developers actively working in the field using its tools, and over 300,000 people using Qiskit resources to learn the fundamentals of quantum computation.
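What does Qiskit code even look like? Here is a minimal sketch of my own, not from IBM’s materials, that entangles two qubits into a Bell state (the two-qubit cousin of Schrödinger’s cat) and samples it on the bundled simulator. It assumes Qiskit and its Aer simulator are installed, and uses the API as of the releases current when this was written:

```python
# Minimal Qiskit sketch: entangle two qubits (a Bell state) and sample it.
# Assumes qiskit with the Aer simulator is installed (pip install qiskit);
# this is the API as of early-2021 Qiskit releases.
from qiskit import QuantumCircuit, Aer, execute

qc = QuantumCircuit(2, 2)     # 2 qubits, 2 classical bits
qc.h(0)                       # put qubit 0 into superposition
qc.cx(0, 1)                   # entangle qubit 1 with qubit 0
qc.measure([0, 1], [0, 1])    # read both qubits out

backend = Aer.get_backend("qasm_simulator")
counts = execute(qc, backend, shots=1024).result().get_counts()
print(counts)   # roughly half '00' and half '11', both at once until measured
```

Run it and the counts split roughly evenly between ‘00’ and ‘11’, never ‘01’ or ‘10’; that correlation is the entanglement doing its thing.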

Putting these numbers into perspective, a 2020 survey from the American Physical Society’s Division of Quantum Information shows only 2,732 members conducting research in this area. While this core audience remains small now, the community shows the potential demand and the opportunity to develop an inclusive quantum workforce. If quantum is to take off in business, it needs to cultivate that workforce, and it won’t include me, as much as I’d like it to.

As much as IBM has poured into its Quantum Experience in an attempt to turn Qiskit into a common tool that can be used with any quantum computer, it still admits a bias toward IBM Quantum’s devices. “In 2020, we’ve made improvements to both the providers interface, which serves as the layer between Qiskit and hardware as well as the transpiler to make it easier for people to have any quantum computer work well with Qiskit. I expect this trend will continue forward into 2021,” says Matthew Treinish, software developer on the Qiskit team.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog. See more of his work at technologywriter.com.

IBM Bootstraps Blockchain To Speed Adoption

January 14, 2021

When a promising technology is slow to gain traction, IBM doesn’t hesitate to bootstrap it, building it out itself and recruiting likely customers to join in. This is the proven IBM playbook, which it is now using to ramp up blockchain.

In 2020, IBM claimed to have made big strides toward bringing blockchain into the mainstream by focusing on the benefits it offers enterprise users. Those virtues include the ability to connect with a wider range of partners to securely share data across networks built on trust and transparency. Especially for large enterprises focused on digital transformation, the company envisions blockchain as a powerful tool.

As early as 2016 DancingDinosaur was reporting on blockchain. To summarize: blockchain enables a distributed ledger with the ability to settle transactions in seconds or minutes automatically via networked computers. This is a faster, potentially more secure settlement process than is used today among financial institutions, where clearing houses and other third-party intermediaries validate accounts and identities over a few days. Financial services, as well as other industries, are exploring blockchain for transactions as diverse as trading stock, buying diamonds, and streaming music.
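For us nerdy types, the core mechanism is simpler than the babble suggests: each block carries a hash of its predecessor, so rewriting any settled transaction breaks every later link. A toy sketch of my own (real blockchains add consensus, digital signatures, and peer-to-peer replication on top of this):

```python
# Toy hash-chained ledger: each block commits to its predecessor's hash,
# so altering any settled transaction invalidates every block after it.
# An illustration of the idea only, not any vendor's code.
import hashlib
import json

def block_hash(block: dict) -> str:
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain: list, transactions: list) -> None:
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "transactions": transactions})

def verify(chain: list) -> bool:
    for i in range(1, len(chain)):
        if chain[i]["prev_hash"] != block_hash(chain[i - 1]):
            return False  # some earlier block was tampered with
    return True

ledger: list = []
append_block(ledger, [{"from": "alice", "to": "bob", "amount": 10}])
append_block(ledger, [{"from": "bob", "to": "carol", "amount": 4}])
print(verify(ledger))                                # True
ledger[0]["transactions"][0]["amount"] = 1_000_000   # rewrite history...
print(verify(ledger))                                # False: the chain catches it
```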

IBM, in conjunction with the Linux Foundation’s Hyperledger Project, expects the creation and management of blockchain networks of services to power a new class of distributed ledger applications. With Hyperledger, blockchain developers could create digital assets and the accompanying business logic to more securely and privately transfer assets among members of a permissioned blockchain network running on IBM LinuxONE or Linux on Z.

Even today blockchain can still be shrouded in babble about consensus and cryptocurrency and ledgers, leaving IT managers confused. IBM’s answer in 2020 was to recruit companies to join specialized blockchain networks, connecting them with a wider range of partners to share data across secure networks built on trust and transparency. For large enterprises focused on digital transformation, this approach to blockchain could accelerate their efforts.

“I think almost every CIO that is driving transformation has pivoted to thinking how to attack this issue in a way that’s going to be iterative, agile, fast, and open,” notes IBM blockchain general manager Alistair Rennie.  “If you are trying to do multi-party integration with security and privacy, and you need to do it quickly in a way that is going to have a rapid business impact, then blockchain is a suitable technology to pursue that business goal,” he adds.

Already the company has organized, recruited, and tooled up a number of common blockchain networks, covering supplier management, global trade, international payments, food supply, and more. Or how about identity protection? IBM Blockchain Trusted Identity is joining forces with others to build the Internet’s long-missing, decentralized identity management layer.

Enterprises today, however, don’t have to rely on just one supplier or some cryptocurrency vendor. Gartner identifies dozens of vendors, although many have no reviews. They include the Ethereum Foundation, Microsoft Azure Blockchain, Oracle Blockchain Cloud Service, and more.

DancingDinosaur has been writing about blockchain almost yearly since 2016. Even back then it seemed a natural for large enterprises and the Z. Will this latest IBM push do the trick? Who knows? But I’ll check back again next year.  

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog. See more of his work at technologywriter.com.

2021 IBM Hybrid Cloud Predictions

January 7, 2021

IBM has been singing the advantages of hybrid clouds for several years. For 2021, it is now going so far as to venture some hybrid cloud predictions.  

Courtesy: IBM Newsroom

At the start of 2020 nobody would have anticipated the year we just concluded. Does anyone need to be reminded that by the end of March the COVID-19 pandemic had disrupted operations worldwide, forcing businesses to quickly adapt their technology infrastructures to accommodate all or most of their workforces remotely and to cope with unprecedented levels of long-term uncertainty? Who would have guessed that, for many, revamping their IT infrastructure fast would be needed for so long, or be a key to their survival?

IBM and other enterprise hybrid cloud providers lucked out for sure. I haven’t seen anyone jumping up and saying I predicted it. I’m sure some will, but they will probably wait until it gets closer to bonus and review time.

Not just hybrid clouds but some related technologies will also rise to the forefront. For instance, IBM notes:

  • Wider adoption of and experimentation with new security technologies, including Confidential Computing, quantum-safe encryption, and fully homomorphic encryption, which makes it possible to work with encrypted data on the fly without decrypting it.
  • AI automation is making the shift to hybrid cloud faster and easier.
  • The integration of multiple clouds and on-premise systems into a single hybrid platform
  • The ability to leverage hybrid clouds to push more workloads onto intelligent edge devices.

These trends will continue in 2021 and beyond, especially security technologies such as Confidential Computing, quantum-safe encryption, and fully homomorphic encryption, which allows you to perform calculations on encrypted data without decrypting it first. This should allow even the most regulated industries to move to the hybrid cloud.
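Fully homomorphic encryption is still heavyweight, but you can get the flavor of computing on encrypted data from its simpler cousin, the additively homomorphic Paillier scheme, where multiplying two ciphertexts yields a ciphertext of the sum of the plaintexts. A toy sketch of my own, with tiny and wildly insecure primes, purely to show the arithmetic (Python 3.9+):

```python
# Toy Paillier encryption: multiplying two ciphertexts yields a ciphertext
# of the SUM of the plaintexts, i.e. computation on encrypted data without
# decrypting it. Tiny hardcoded primes; utterly insecure, illustration only.
import math
import random

p, q = 293, 433          # toy primes (real keys use ~1024-bit primes)
n, n2, g = p * q, (p * q) ** 2, p * q + 1
lam = math.lcm(p - 1, q - 1)

def L(x: int) -> int:
    return (x - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)   # precomputed decryption constant

def encrypt(m: int) -> int:
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:        # r must be invertible mod n
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    return (L(pow(c, lam, n2)) * mu) % n

c1, c2 = encrypt(41), encrypt(17)
print(decrypt((c1 * c2) % n2))        # 58 == 41 + 17, computed on ciphertexts
```

Fully homomorphic schemes extend this trick from addition to arbitrary computation, at a much steeper cost.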

It’s already apparent, IBM continues, that companies will continue to decentralize IT operations around hybrid cloud environments in 2021 and probably beyond. But to do that successfully, organizations will want to take further security measures that improve isolation, ensure system and data integrity, and implement zero-trust strategies, all while remaining compliant with tougher data privacy regulations worldwide, which aren’t going away, especially as ever more complex security threats evolve.

Not to be left out, IBM continues, is the Z. In particular, hardware systems that provide these security capabilities will, the company hopes, be widely adopted to protect on-premises and public cloud workloads. These hardware systems, especially LinuxONE and IBM Z, already provide a higher level of security for both open source and traditional workloads. And their capacity, performance, security, scalability, and never-fail availability are ideal for the task.

Now add to that IBM’s industry-specific clouds, such as the IBM Cloud for Financial Services and the IBM Cloud for Telecommunications. These were designed from the start to address the unique challenges and security requirements of those regulated industries. IBM has already attracted Bank of America to its financial services cloud and claims more than 35 partners for its telecommunications cloud.

And it is not just hybrid clouds that have drawn IBM’s attention. The company continues to pioneer quantum computers, which it expects will solve some of industry’s most challenging problems, problems that today’s best-performing supercomputers cannot solve. Those same quantum computers also create risks, such as the potential to break today’s encryption, which is why IBM is working on quantum-safe cryptography for when companies are ready to deploy it. That could not only secure data today but help protect against future threats.

This all sounds wonderful, like your most optimistic New Year’s resolutions. You know, the ones you made a couple of nights ago after three or four drinks. But quantum computing continues to suffer from all the problems that plagued it last month and last year: noise interference, instability, faults, loss of quantum coherence, and more. And we haven’t even mentioned the lack of applications that solve meaningful business problems. Can anyone even describe what those business problems are and what a quantum computing solution might look like?

DancingDinosaur is happy to cheer IBM and any other technology player and tout every quantum or hybrid cloud or technical achievement they make. The IT industry needs advancements this new year and every year going forward. Best wishes and go for it!

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog. See more of his work at technologywriter.com.

Introducing Spin Transfer Torque MRAM

December 18, 2020

Spin transfer torque MRAM may be an obscure topic, but it promises superior performance for artificial intelligence computing workloads on hybrid cloud platforms. And that is exactly what IBM announced earlier this week here.

Specifically, IBM researchers created what they describe as the first embedded Magnetoresistive Random-Access Memory (MRAM) technology built on a 14-nanometer architecture, which the researchers call Spin Transfer Torque MRAM, or STT-MRAM. This new architecture helps solve a memory bottleneck in data-intensive hybrid cloud and AI workloads.

The problem arises, explained the researchers, when enterprises move their most demanding workloads to hybrid cloud platforms and encounter what they call a compute memory bottleneck: a memory shortage resulting from the processors in those systems being faster than the existing memory systems.

DancingDinosaur has been taught that memory eventually would be the bottleneck in any system. Or as TechOpedia explains: a memory bottleneck refers to a memory shortage due to insufficient memory, memory leaks, defective programs, or slow memory used in a fast processor system. A memory bottleneck affects the machine’s performance by slowing the movement of data between the CPU and the RAM. The increased processing times lead to slow computer operations.
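You can feel a memory bottleneck on any machine: run the same arithmetic with an access pattern that defeats the caches, and the CPU sits waiting on RAM. A rough sketch of my own (absolute timings will vary by machine; the gap between the two runs is the point):

```python
# Rough illustration of a memory bottleneck: identical arithmetic, but a
# random access pattern defeats the caches and leaves the CPU waiting on RAM.
import time
import numpy as np

N = 10_000_000
data = np.ones(N, dtype=np.float64)
orders = {"sequential": np.arange(N),
          "random": np.random.permutation(N)}  # same indices, cache-hostile order

for label, idx in orders.items():
    t0 = time.perf_counter()
    total = data[idx].sum()                    # gather, then reduce
    print(f"{label:10s} {time.perf_counter() - t0:.3f}s (sum={total:.0f})")
```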

So DancingDinosaur would always order the most and fastest memory the budget and space would allow. He never regretted doing that. But he never faced a memory shortage that resulted from the processors being so much faster than the existing memory systems. Then again, he never faced data-intensive hybrid cloud and AI workloads either.

You solve this bottleneck between memory and processors, the researchers explain, by enabling faster memory performance. Duh, why didn’t I think of that?

STT-MRAM, the researchers explain, works by using electron spin to store data in magnetic domains, combining the high speed of static RAM (SRAM) with the higher density of traditional DRAM to provide a more reliable memory architecture. As IBM explained, by deploying STT-MRAM in the last-level CPU cache, you can reduce the amount of reading and writing to memory required in data-intensive workloads, thereby reducing system latency and power consumption while increasing bandwidth. OK, that makes sense to DancingDinosaur.
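The last-level cache argument is easy to demonstrate in miniature. With a toy LRU cache model of my own (not IBM’s), growing the cache capacity over a loop-heavy access trace cuts the miss count, and every miss avoided is a trip to main memory that never happens:

```python
# Toy LRU last-level-cache model: doubling cache capacity (as replacing
# SRAM with denser MRAM could allow) cuts misses, and every miss avoided
# is a read or write to main memory that never happens. My illustration.
from collections import OrderedDict
import random

def misses(trace, capacity):
    cache, miss = OrderedDict(), 0
    for line in trace:
        if line in cache:
            cache.move_to_end(line)        # LRU hit: mark most recently used
        else:
            miss += 1
            cache[line] = True
            if len(cache) > capacity:
                cache.popitem(last=False)  # evict least recently used
    return miss

# Synthetic trace: a hot working set plus occasional cold lines.
random.seed(0)
trace = [random.randrange(3_000) if random.random() < 0.9
         else random.randrange(100_000) for _ in range(200_000)]

for cap in (1024, 2048, 4096):
    print(f"capacity {cap:5d} lines -> {misses(trace, cap):6d} memory accesses")
```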

IBM boasts that the 14-nanometer embedded STT-MRAM CMOS technology, described in its white paper, is the most advanced MRAM system ever built. IBM insists it will allow for a “much more efficient, higher-performing system” for AI workloads in hybrid clouds.

The new architecture is enabled by the use of some advanced magnetic materials that are detailed in a second paper. The use of these materials enables greater density in STT-MRAM systems that can store twice as many bits, leading to a significant increase in data retrieval performance, IBM said.

The company also provided an update on its research into “analog in-memory computing,” where compute and memory are combined into a single device for more demanding AI workloads. This involves specialized hardware that has the potential to train increasingly complex AI models with far greater energy efficiency.

One of the challenges in creating specialized analog in-memory compute hardware for AI is known as the “synaptic weight mapping problem,” according to IBM. Synaptic weights indicate the strength of a connection between two nodes in a neural network, and they need to be accurately mapped onto analog nonvolatile memory devices to enable deep learning inference. But doing so is a considerable challenge, the researchers said.
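To see why the mapping is hard, picture squeezing a trained layer’s floating-point weights onto a handful of noisy conductance levels: the rounding error and the device programming noise both perturb the layer’s output. A back-of-the-envelope sketch of my own, not the paper’s method:

```python
# Back-of-envelope look at the synaptic weight mapping problem: trained
# float weights get squeezed onto a few noisy analog conductance levels,
# and both the rounding and the programming noise perturb the layer's
# output. My own illustration, not the IBM paper's method.
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(0, 0.5, size=(256, 256))   # stand-in for a trained layer
x = rng.normal(0, 1.0, size=256)          # one input activation vector

def map_to_device(W, levels, noise_std):
    lo, hi = W.min(), W.max()
    grid = np.linspace(lo, hi, levels)            # programmable levels
    idx = np.abs(W[..., None] - grid).argmin(-1)  # nearest level per weight
    return grid[idx] + rng.normal(0, noise_std, W.shape)  # + device noise

y_ref = W @ x
for levels in (4, 16, 64):
    y_dev = map_to_device(W, levels, noise_std=0.02) @ x
    err = np.linalg.norm(y_dev - y_ref) / np.linalg.norm(y_ref)
    print(f"{levels:3d} levels -> relative output error {err:.1%}")
```

More levels shrink the rounding error until the programming noise dominates; accurate mapping means managing both.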

IBM’s paper, “Precision of Synaptic Weights Programmed in Phase-Change Memory Devices for Deep Learning Inference,” discusses how analog resistance-based memory devices that rely on pulse-code modulation might address the mapping challenge. The paper, written by Abu Sebastian, Griselda Bonilla, and Dan Edelstein, describes a way to map the synaptic weights accurately, both analytically and through array-level experiments.

At the upcoming IEEE International Electron Devices Meeting (IEDM 2020) in December, the IBM researchers will present multiple papers addressing various aspects of the problem, with the 14 nm node embedded MRAM, debuting at IEDM, positioned as the most advanced MRAM demonstrated to date. The advances could soon enable system designers to replace SRAM with twice the amount of MRAM in last-level CPU cache.

This will be the last DancingDinosaur of 2020. Good riddance to an awful year in which over 300,000 people here died. The blog will resume in January 2021. Have great holidays, hope the new year will be better than 2020, wear a mask, and avoid COVID-19.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog. See more of his work at technologywriter.com.

Z and More for Hybrid Clouds

December 9, 2020

DancingDinosaur first began writing about IBM and hybrid clouds in 2016. The topic kept popping up now and then through 2017, 2018, and 2019. In 2020 IBM got serious and began talking about accelerating hybrid clouds in a radical way.

It would do this, IBM explained, by separating the Managed Infrastructure Services unit of its Global Technology Services division into a new public company, called NewCo for now. This creates two companies, each with strategic focus and flexibility to drive client and shareholder value through hybrid clouds. All this restructuring should be finished by the end of 2021.

Today IBM promises to modernize and accelerate the path to the hybrid cloud. To do that, it is making investments in new hybrid offerings for IBM Z, IBM LinuxONE, IBM Power Systems, and IBM Storage. Also included is the bolstering of Red Hat OpenShift and IBM Cloud Paks, along with advances in IBM Storage for container workloads.

The new company (to be named at a later date) will immediately be a leading managed infrastructure services provider. It has relationships with more than 4,600 technology-intensive, highly regulated clients in 115 countries, including more than 75% of the Fortune 100, a backlog of $60 billion, and more than twice the scale of its nearest competitor.

IBM, however, is not the first company to think of this. A recent article in ZDNet identified the 15 most important hybrid cloud vendors. IBM was there, of course, along with many others you would expect, and a few with which you may be less familiar.

That should not be too discouraging to IBM. It recently announced a strategic expansion of capabilities to give clients the flexibility to choose where to deploy workloads, from on-premises to the IBM public cloud, and drive forward their journey to the hybrid cloud. These new and enhanced solutions include containerized software across the IBM IT infrastructure via IBM Cloud Paks (ready-to-deploy integrated software bundles) and advancements in data storage for containers.

According to a recent IBM-commissioned Forrester study, The Key to an Effective Hybrid Multicloud Strategy, 85 percent of respondents are increasing funding for IT infrastructure outside of the public cloud, suggesting that organizations continue to rely on both on-premises systems and private cloud as part of their technology stack.

“The global pandemic and ensuing economic disruption have amplified the need for speed and flexibility and accelerated plans for digital transformation,” said Tom Rosamilia, Senior Vice President of IBM Systems and Chairman, North America. “Our clients are increasing their investments in AI and cloud. They are moving swiftly to reduce costs while improving security with approaches like confidential computing and resiliency, and IBM’s hybrid multi-cloud strategy with Red Hat is at the epicenter of this transformation.”

Signaling strong adoption, many of IBM’s top IBM Z and Power Systems clients are currently running proofs-of-concept with Red Hat OpenShift on Z, and over 100 more are ready to start. In addition, more than 40 IBM clients are already using IBM Storage as a persistent, highly available repository with strong security capabilities for OpenShift. This builds on a longstanding collaboration between IBM IT infrastructure and Red Hat as well as previous investments made across IBM Z, LinuxONE, Power, and Storage offerings to support Red Hat OpenShift, a leading enterprise Kubernetes platform.

Today, more than 100 clients new to LinuxONE and Linux on Z are running mission-critical applications in the hybrid cloud. These clients come in all sizes, from startups to enterprises, in industries ranging from healthcare, retail, transportation, financial services, and technology services to the public sector.

IBM announced new and upcoming capabilities designed to help such clients implement hybrid cloud with IBM IT Infrastructure, including:

  • Infuse AI throughout the business and consolidate databases with IBM Cloud Pak for Data, already available on IBM Power Systems and planned for November on IBM Z
  • Accelerate digital transformation and connect cloud-native applications to existing workloads with IBM Cloud Pak for Integration on IBM Z
  • Speed cloud-native development with integrated runtimes enabled by IBM Cloud Pak for Applications for IBM Z and Power Systems
  • Increase visibility, automation, and governance across the hybrid multicloud with IBM Cloud Pak for Multi-Cloud Management for IBM Z and Power Systems

IBM Storage is also bringing extensive new and future enhancements to its storage and modern data protection solutions. For IBM, hybrid cloud is just ramping up.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog. See more of his work at technologywriter.com.

4 Reasons Z Not Outdated

November 30, 2020

DancingDinosaur continues to get questions about the Z being outdated. That’s surprising because Linux was introduced on the Z over 20 years ago. It was a gutsy, leading-edge move then, and through the use of subsequent Z assist processors the Z has continued to keep pace with the latest technology changes, from containers to Python.

So here are four reasons your Z should not become outdated (unless you allow it to).

  1. Linux on Z
  2. IBM LinuxONE
  3. Red Hat OpenShift on Z
  4. Support for the most popular languages and coding practices

Linux on Z. Some other IT analysts might think otherwise, but for DancingDinosaur, today’s modern Z started with IBM’s introduction of Linux on Z. Finally there was an open source system that I actually understood and could use. Of course it would take a few more years of tweaking by IBM and others before Linux would run seamlessly on the Z, but I was overjoyed anyway. It seemed at the time a great step toward making the Z more accessible to non-technical people and possibly less expensive and more flexible, which would attract smaller companies. The Z needed to address more than just the Fortune 100.

IBM LinuxONE. A smaller, single-frame Z, again poised to attract companies beyond the Fortune 100. IBM also implied that it would be less expensive. DancingDinosaur wrote about this in Sept. 2018 here.

At that time IBM dubbed it the newest generation of the LinuxONE, the IBM LinuxONE Emperor II, built on the same technology as the IBM z14, which DancingDinosaur covered here. The key feature of the new LinuxONE Emperor II is IBM’s Secure Service Container, presented as an exclusive LinuxONE technology representing a significant leap forward in data privacy and security capabilities. With the z14 the key capability was pervasive encryption. This time the Emperor II promised very high levels of security and data privacy assurance while rapidly addressing unpredictable data and transaction growth. And as a LinuxONE it would be cheaper, right?

IBM still sees itself in a titanic struggle with Intel’s x86 platform. With the LinuxONE Emperor II, IBM thought it had a chance to change some minds. That doesn’t mean, however, that the Emperor II is a Linux no-brainer, even for shops facing pressure around security compliance, never-fail mission-critical performance, high capacity, and high performance. Change is hard, and there remains a cultural mindset based on the lingering myth of the cheap PC of decades ago. IBM wasn’t likely to cut prices that low or offer deals companies couldn’t refuse. But the machine still has great security, capacity, and performance specs, and IBM promises Linux is seamlessly built in this time. Even I might be able to get it up and running.

Red Hat OpenShift on Z. According to Kavita Sehgal, an IBM expert in designing and deploying hybrid cloud solutions on Z mainframes, OpenShift gives developers agility on a platform that’s modern, scalable, automated, secure, reliable, and compliant with the standards that governments and regulated industries require. These differentiators are essential to any company that needs to run mission-critical workloads on the hybrid cloud and needs visibility into those workloads, whether on premises, on a private cloud, or on a public cloud (in effect, a hybrid cloud). OpenShift also facilitates the mixing of different development technologies.

Support for the most popular languages and coding practices. Through OpenShift on Z, together with the other Z assist processors, you can combine C++, Java, Python, Perl, and other common Linux languages, expedite the use of containers, and more.

So, there is enough flexibility above to ensure your Z won’t get outdated unless you want it to.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog. See more of his work at technologywriter.com.

The Pandemic and Z

November 19, 2020

DancingDinosaur really wants the pandemic to end. Being such a good Do-B, he often feels he is the only one still wearing a mask, using gloves, staying home, not physically shopping in stores etc. etc. It will be eight months in December and I go almost nowhere. My car gets about 3 weeks to the gallon. I start it up every few days just to see what’s camping out at the foot of the driveway. 

Pandemic virus: courtesy Getty

Back in April, when the pandemic was still novel and state governors were just realizing that they would have to step up and manage the situation with a few regs, since the federal government apparently had punted on it, IEEE Spectrum came out with a glowing article on the Z headlined Mainframes Are Having a Moment. Nine months later, it is turning into much longer than a moment.

This is not going to end in 2020 and probably not even 2021 since millions of people will have to get vaccinated first (2 doses each, spread a couple of weeks apart) before it will be safe to act like we used to, if we can still remember what that was. So many delivery drivers have been dropping stuff at my door that I’m wondering if I should give them a holiday tip. 

Back in April, Spectrum wrote that there’s a silver lining to the failings of state unemployment insurance systems caused by the COVID-19 crisis: people who can program mainframes are urgently needed. Seems the unemployment systems were choking as millions of people, week after week, month after month, filed claims and brought the systems to their knees. Many, sadly, still are.

The Open Mainframe Project put out a desperate call for COBOL programmers at the state level, and DancingDinosaur ran the announcement in mid-April. As the announcement said: more than 10 million people in the United States have filed for unemployment amid the COVID-19 global pandemic and the financial crisis that ensued. The ranks of the unemployed continued to grow for months afterward; the growth has only recently started to moderate slightly.

Most colleges had dropped the mainframe ball years before in their mad chase after the sexy distributed systems their students wanted, cutting mainframe programming from the curriculum to focus on more modern languages and technologies. Ironically, only now have faculty and staff started to report an uptick in interest in COBOL.

The increase actually began around the time pandemic-related layoffs inundated state unemployment agency computer systems, causing government officials to put out the urgent call for programmers who know COBOL to step in and help. DancingDinosaur learned COBOL decades earlier in college but never used it after squeaking by with a B-. By the time I went to grad school for communications, I peeked into the computer science department and they weren’t even offering COBOL.

DancingDinosaur started this blog years ago after writing, at an editor’s request, a series of freelance articles that said the mainframe was dead. The editor would get a press release saying this or that company was eliminating its mainframe in favor of new distributed systems, meaning PCs.

Being a hungry freelancer on the make for another gig, I called back those same companies that months earlier had told me how much they would gain by getting rid of the mainframe. After an embarrassing silence punctuated with a lot of ums and uhs, they would sheepishly admit the mainframe was still there, followed by a slew of excuses for why it wasn’t working out as planned.

I contacted the editors who initially assigned me the mainframe-is-dead stories and told them what I learned. My big hoped-for exposé never followed. They had lost interest in the topic. So I started DancingDinosaur, which never replaced my freelance writing income, but it is a lot more fun.

So how does one have fun with DancingDinosaur? By following any vaguely related topic that you want, like the pandemic.  

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog. See more of his work at technologywriter.com.

RPA Leads Revival of Z Terminal Emulation

November 13, 2020

When Rocket Software announced its acquisition of ConnectiQ and the WebConnect mainframe terminal emulation product this past October, the obvious question was why. Mainframe terminal emulation isn’t exactly new. But digging more closely into the announcement, it became clear: this is about Robotic Process Automation (RPA).

As it turns out, RPA is just one of many forms of business process automation software. It is based on metaphorical software robots or AI digital workers. By 2022, 65% of organizations that have deployed robotic process automation will also introduce AI, including machine learning and natural language processing algorithms. So RPA encompasses more than terminal emulation for the Z.

RPA, courtesy of Enterprisers Project

Gartner, in a recent report, actually takes it even further, calling it hyperautomation, which it describes as an effective combination of complementary sets of tools that integrate functional and process silos to automate and augment business processes.

As Rocket sees it, IBM Z customers have faced ongoing pressure to re-platform, even as enhancement and integration consistently prove to be the faster, less painful, and more cost-effective approach. With 30 years of experience optimizing, enhancing, integrating, and strengthening legacy platforms, Rocket is committed to delivering the broadest set of options for IBM Z and IBM i because no single approach fits every circumstance. Hence its latest Z terminal emulation acquisitions, which let customers delegate repetitive, lengthy tasks to efficient mainframe RPA.
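To get a feel for what delegating a green-screen task to a bot involves, here is a sketch using py3270, an open-source Python wrapper around the s3270 terminal emulator (not Rocket’s product); the host name, screen coordinates, and field layout are all invented for illustration:

```python
# Sketch of a terminal-emulation bot: log in to a 3270 session, key a value
# into a field, and scrape the result -- the kind of repetitive task RPA
# delegates. Uses the open-source py3270 wrapper (pip install py3270,
# requires the s3270 binary); host, coordinates, and fields below are
# hypothetical placeholders, not any real system's layout.
from py3270 import Emulator

em = Emulator()                          # headless s3270 session
em.connect("mainframe.example.com")      # hypothetical host
em.wait_for_field()

em.fill_field(4, 20, "CLAIM12345", 10)   # key a claim number at row 4, col 20
em.send_enter()
em.wait_for_field()

status = em.string_get(6, 20, 16)        # scrape the 16-char status field
print(f"claim status: {status.strip()}")
em.terminate()
```

Wrap that in retry logic and scheduling and you have the skeleton of a production bot.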

“Automation will take on an even more critical role in a post-pandemic world as cost takeout and business resilience become chief destinations on the technology roadmap,” said Craig Le Clair and Leslie Joseph of Forrester Research in a report entitled Ten Golden Rules For RPA Success. “RPA will be the first stop along the path to intelligent automation.”  

Gartner further raises the ante with hyperautomation, which it has already pegged as one of the top 10 strategic trends for 2020.

Rocket is also acquiring WebConnect, an enterprise-class terminal emulation solution that will continue the company’s 15-year investment in the critical domain of host access for mainframe systems. The company’s ongoing innovation ensures that users can access their IBM Z, IBMi, and other VT-based systems in any way they require.   

Process automation, robotic or something else, is attracting much attention as organizations start to figure out how they will function in the post pandemic world that will eventually arrive in 2021 or 2022, we hope. Like Gartner, Forrester too has chimed in.

Forrester notes RPA and similar automation still present some pitfalls:

  • Scale remains its Achilles’ heel. More than half of all RPA programs worldwide employ fewer than 10 bots. Moreover, less than 19% of RPA installations are at an advanced stage of maturity. Fragmented automation initiatives, a patchwork of vendors, incomplete governance models, and attempts to automate overly complex tasks stall efforts. 
  • Enterprise programs lack the momentum needed to meet ROI targets. At least 25% of companies struggle to meet their ROI targets. To these firms, scale means finding and automating more tasks.
  • Finding enough tasks to automate is the biggest scale issue. A bot needs repetitive tasks that occur in high enough volume to justify its build cost, but often there is too much variation in how the tasks are done, even when the outcome is the same. (A quick break-even sketch follows this list.)
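That volume-versus-cost tradeoff reduces to arithmetic you can sanity-check before building anything. A quick sketch, with all figures invented for illustration:

```python
# Quick break-even sanity check for a bot: how many months until the
# build cost is recovered? All figures below are invented for illustration.
def payback_months(build_cost, runs_per_month, minutes_saved_per_run,
                   loaded_hourly_wage, monthly_upkeep):
    gross = runs_per_month * minutes_saved_per_run / 60 * loaded_hourly_wage
    net = gross - monthly_upkeep             # savings after bot maintenance
    return build_cost / net if net > 0 else float("inf")

# A high-volume, low-variation task pays back in under two months...
print(payback_months(30_000, 5_000, 6, 40, 1_000))   # ~1.6
# ...while a low-volume task never even covers its own upkeep.
print(payback_months(30_000, 200, 6, 40, 1_000))     # inf
```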

“At Rocket, we work with customers to identify the best path to achieve their concrete outcomes,” said Christopher Wey, President of the Rocket IBMi business unit. “We’re also excited to bring the transformative force of RPA with ConnectiQ to IBM Z customers.” 

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog. See more of his work at technologywriter.com.

Edge Computing Meets 5G and Hybrid Clouds

November 4, 2020

We’ve known for several years that IBM adores hybrid clouds. It was just about a month ago that IBM restructured almost its entire business in an effort to pursue what it perceives as a $1 trillion hybrid cloud opportunity. DancingDinosaur covered it here.

Courtesy: PCMag.com

The company didn’t stop there. Rather, it leveraged two key enabling technologies, edge computing and 5G high-bandwidth, low-latency, edge-based wireless networks, to enable any business to speed its digital journey. In fact, Gartner predicts that by 2025, 75% of enterprise-grade data will be created and processed by 5G devices at the edge, and every industry will be impacted by this shift.

With that in mind, IBM teamed up with AT&T to deliver global access to 5G devices. The combination of edge computing and 5G makes it possible for businesses of all sorts to boost processing efficiency by handling data close to where it is created and used.

Enterprises can capture the value of these technologies simply by bringing hybrid cloud computing to a low-latency edge environment, enabling them to more quickly and securely build new applications at the edge or even on-premises. That’s the theory, at least.

That’s why IBM and AT&T announced their collaboration in the first place. AT&T brought the global network and IBM brought its open hybrid cloud platform built on Red Hat OpenShift. This should make it easier for enterprises to manage a heterogeneous hybrid cloud computing environment in a low-latency, private cellular network edge environment.

Doing processing at the edge simply is faster and more efficient than shipping the data elsewhere. Red Hat OpenShift facilitates consistency in running workloads on a range of edge devices across different environments.

For telcos, as IBM puts it, this open approach is critical. IBM has been working with companies all over the world to do just that. In fact, 83% of the world’s largest telcos are IBM clients—including Vodafone, Verizon, Bharti Airtel and others.

Making it easier for businesses to manage open hybrid cloud computing in a low-latency, private cellular network edge environment like AT&T’s will help businesses across a variety of industries quickly and securely build applications using regional or on-premises edge computing. Better still, building on Red Hat OpenShift and IBM Cloud Satellite gives clients the flexibility to bring their applications to any environment where their data and processing may reside, while leveraging the efficiency of IBM’s open hybrid cloud-based edge processing.

As new hybrid cloud services emerge, enterprises can tap into the power of 5G for a wide range of uses, such as factory safety and efficiency, real-time health monitoring, or autonomous vehicle operation. And because they are being handled at the edge, these processes avoid the inefficiency of even the millisecond latency of sending workloads to a centralized cloud.

At the same time, companies need a secure environment in which to run and interconnect their critical workloads, from the tiniest wireless monitoring device on the network’s farthest edge to the central cloud—as well as all on- and off-premises points in between. 

Will this pay off? IBM reports it has been working with telcos on this for a while. Now all that’s needed are actual customers putting processing and applications on the edge to take advantage of it.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog. See more of his work at technologywriter.com.

BizTransformation and the CHRO

October 27, 2020

If you haven’t had to write a proposal to win approval for a new technology project in the past year or so, make sure to include business transformation as the primary objective. Don’t worry if you haven’t a clue what you will actually be transforming. In the last eight or nine months, since the pandemic first made itself known, every business has been forced to change in multiple ways, in ways they never would have guessed. So relax, make business transformation your objective and you’ll have no shortage of projects to fill it in with when somebody requests documentation.

Just think about it. How often have you had to adjust to deal with customers who are suddenly working from home, or staff who are now working from home while also helping their children with a Zoom substitute for attending school? Believe me, you will have no shortage of transformations that your business is unexpectedly facing.

Look at your supply chain: has that changed in any way? Or your customer service or sales processes? Any transformation going on there? Or product development, design, and engineering? These should be able to keep your business transforming until the end of time, at which point the pandemic might just be winding down, regardless of what the President says.

The pandemic is paving the way for one change that human resources (HR) people have long been salivating over: elevating HR into the C-suite. DancingDinosaur has never been hired for a salaried job over his 30-year career as a technology writer, working instead as an independent contractor. To DancingDinosaur, HR was somebody who periodically bugged him to fill out a 1099 form, which he was always happy to do because he liked getting paid.

But the pandemic wreaked changes on the millions of salaried people who suddenly found themselves utterly unprepared to work from home for what has turned out to be months. These people often lacked even a suitable ergonomic desk chair, working instead at the kitchen table for extended periods.

Now the HR folks are not just blank fillers or form chasers. They have to counsel and help people organize themselves to do serious work productively for an extended period of time, like maybe forever.

Suddenly, as “C-suite leaders look to rapidly transform to meet new customer needs and overhaul business models, they report inadequate skills among their biggest hurdles to progress,” writes Amy Wright, IBM Managing Partner for Talent and Transformation. The needs include both technical skills to work with technology and behavioral skills like agility and the ability to collaborate effectively. She continues: “At the same time, our consumer research shows there has been a permanent shift in the expectations employees have of their employers, including better support for their physical and emotional health or skills training.”

The result is a new C-suite title, the Chief Human Resources Officer (CHRO). HR proponents argue that this is the moment to evolve, shifting away from a process-oriented function to an agile consulting arm, and in doing so drive engagement and productivity, foster trust in uncertain times, cultivate resilient workforces, and add some strategic perspective.

Not everybody may be ready to elevate the HR director to CHRO, but the position got a big boost from an article in Forbes. Now, more than ever, organizations require HR to set the tone for good workplace culture, excellent employee experiences, high retention, succession planning, change management, and other strategic business objectives, not just to clean up awkward messes. Just don’t forget distributing 1099 forms to people like me.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog. See more of his work at technologywriter.com and here.

