GSMA Post-Quantum Telco Network

October 3, 2022

In September the GSMA announced the formation of the GSMA Post-Quantum Telco Network Taskforce with IBM and Vodafone as initial members. The goal is to help define policy, regulation, and operator business processes for the enhanced protection of telecommunications in a future of advanced quantum computing.

GSMA Post-Quantum Telco Network

The industry needs this because unlike today’s computers that rely on bits for calculation, quantum computers harness the exponential power of quantum bits (qubits). The result can be a complicated, simultaneous mix of 1s and 0s, creating the potential to solve extremely complex problems that challenge even the most powerful supercomputers today. 
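
To make the exponential claim concrete, here is a minimal Python sketch (mine, not GSMA’s or IBM’s) showing why classical machines can’t keep up: describing n qubits classically takes 2^n complex amplitudes, so the bookkeeping doubles with every qubit added.

# Each added qubit doubles the number of amplitudes needed to describe
# the quantum state classically.
BYTES_PER_AMPLITUDE = 16  # one double-precision complex number

for n in (10, 20, 30, 40):
    amplitudes = 2 ** n
    size_gb = amplitudes * BYTES_PER_AMPLITUDE / 1e9
    print(f"{n} qubits -> {amplitudes:,} amplitudes (~{size_gb:,.2f} GB to simulate)")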

According to the announcement, the GSMA Post-Quantum Telco Network Taskforce expects to help define requirements, identify dependencies, and create a roadmap to implement quantum-safe networking, mitigating the risks associated with future, more-powerful quantum computers. Without quantum-safe controls in place, sensitive information such as confidential business information and consumer data could be at risk from attackers who harvest present-day data for later decryption. 

The group noted that the World Economic Forum recently estimated that more than 20 billion digital devices will need to be either upgraded or replaced in the next 10-20 years to use the new forms of quantum-safe encrypted communications.

“Only by working together to establish consistent policies can we define quantum-safe approaches that protect critical infrastructure and customer data, and complement our ongoing security efforts to increase resiliency in future networks,” said Alex Sinclair, the GSMA’s Chief Technology Officer.

To address the challenges presented by emerging quantum technology, the U.S. National Institute of Standards and Technology (NIST) announced in July 2022 that it had chosen the first four post-quantum cryptography algorithms to be standardized for cybersecurity in the quantum computing era.
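
One of the four, CRYSTALS-Kyber, is a key-encapsulation mechanism (KEM) for establishing shared secrets. As a hedged illustration of what a quantum-safe key exchange looks like in code, here is a minimal sketch using the open-source liboqs-python bindings; this is my example, and it assumes a liboqs build that includes Kyber512, nothing specific to NIST’s or IBM’s own code.

# Quantum-safe key exchange sketch with CRYSTALS-Kyber via liboqs-python.
# Assumes: pip install liboqs-python, with liboqs built with Kyber enabled.
import oqs

with oqs.KeyEncapsulation("Kyber512") as receiver:
    # Receiver generates a post-quantum keypair and publishes the public key.
    public_key = receiver.generate_keypair()

    with oqs.KeyEncapsulation("Kyber512") as sender:
        # Sender encapsulates a fresh shared secret against that public key.
        ciphertext, secret_sender = sender.encap_secret(public_key)

    # Receiver recovers the same shared secret from the ciphertext.
    secret_receiver = receiver.decap_secret(ciphertext)
    assert secret_sender == secret_receiver  # both sides now share a key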

IBM, a leader in cryptography and pioneer in quantum technology – with the world’s largest fleet of cloud-accessible quantum computers – contributed to the development of three of NIST’s four chosen post-quantum algorithms.

“Given the accelerated advancements of quantum computing, data and systems secured with today’s encryption could become insecure in a matter of years. IBM is pleased to work with the GSMA Post-Quantum Telco Network Taskforce members to prioritize the telco industry’s move to adopt quantum-safe technology,” said Scott Crowder, Vice President of IBM Quantum Adoption and Business Development.

“In a modern hybrid cloud world, communications services and compute technologies are interconnected and underpin all industries, which means the adoption of quantum-safe cryptography in telecom will affect all enterprises and consumers. This taskforce will support the telco industry by creating a roadmap to secure networks, devices and systems across the entire supply chain,” adds Steve Canepa, General Manager, Global Industries, IBM.

The GSMA Post-Quantum Telco Network Taskforce will convene to drive consensus and adoption in this new field, and it will be organized around three areas:

  • Strategy – to integrate quantum-safe capabilities into telco network operators’ technology, business processes and security.
  • Standardization – to identify the needs and common alignments for the integration of quantum-safe capabilities into existing telco networks.
  • Policy – to advise on telco network public policy, regulation and compliance and to ensure scale across the industry.

Ironically, future quantum computing could inherently undermine the very cryptographic principles we rely on today. I guess that’s what the GSMA task force is expected to resolve.

Alan Radding is DancingDinosaur, a veteran information technology analyst, writer, and ghostwriter. Follow DancingDinosaur on Twitter.

IBM Optica zVT System

September 8, 2022

IBM reports the Optica zVT system now offers a standard set of advanced features that protect your data against cyber-attacks and ransomware while enabling an efficient recovery methodology if your data is compromised or corrupted, which inevitably will occur sooner or later.

Courtesy of IBM

The Optica zVT system can be architected using standard features to automate the creation of air-gapped, immutable copies of your backup data to protect from increasingly common modern cyber vulnerabilities.

IBM lays out the various architectures for implementing the necessary data security features and recovery steps to mitigate your risk. You can, in short, learn from IBM’s experts how efficient and affordable cyber protection can be with its latest mainframe virtual tape.

DancingDinosaur is primarily interested in IBM’s newest mainframe tape systems. That’s the latest zVT, the Optica zVT system that can be architected using standard features to automate the creation of air-gapped, immutable copies of your backup data, providing protection from modern cyber vulnerabilities, which invariably crop up due to human and system errors or deliberate attacks.

IBM already has consolidated its options to the high-end TS7760, retiring the TS7720 and TS7740 models. Similarly, Oracle has delivered significant performance enhancements to its VSM offering, where the latest VSM 7 delivers significant resources compared with the VSM 6 and older predecessors.

Not to be outdone, Optica Technologies, also a leader in delivering high-quality storage and connectivity solutions for the mainframe, recently announced a relationship with IBM around enhancements to its zVT.

The zVT comes configured and ships with license support for 16 virtual tape drives, 8TB of RAID-6 protected storage, and hardware compression in an economical 2U appliance. 

In short, welcome to this new world of mainframe tape backup.

To take advantage of this new tape backup world IBM offers three suggestions (a conceptual sketch of the first follows the list):

1–Automate the isolation of your data in addition to standard DR copies and processes

2–Protect your data on-premises and via public-private cloud

3–Efficiently recover via a clean copy of your backup data in any location
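
How the zVT automates that isolation is Optica’s and IBM’s proprietary sauce, but conceptually an air-gapped, immutable copy is simply a copy that nobody, not even an administrator, can alter or delete during its retention window. As a stand-in illustration of suggestion 1 (my example using AWS S3 Object Lock via boto3, not Optica’s or IBM’s actual mechanism; the file and bucket names are hypothetical):

# Write a backup copy that cannot be deleted or overwritten for 90 days.
from datetime import datetime, timedelta, timezone
import boto3

s3 = boto3.client("s3")
with open("backup-20220908.vtape", "rb") as f:           # hypothetical backup file
    s3.put_object(
        Bucket="vault-bucket",                           # bucket created with Object Lock on
        Key="backups/backup-20220908.vtape",
        Body=f,
        ObjectLockMode="COMPLIANCE",                     # no early deletion, even by root
        ObjectLockRetainUntilDate=datetime.now(timezone.utc) + timedelta(days=90),
    )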

Are you ready? This sounds far simpler than when I first encountered enterprise data center tape backup decades ago. If you’re in need of simplified data center tape backup, this is probably as good as it gets until the next major advance, which could be coming next year for all anyone can tell.

Alan Radding is DancingDinosaur, a veteran information technology analyst, writer, and ghostwriter. Follow DancingDinosaur on Twitter.

Platform to build for tomorrow now

August 26, 2022

So what do you need to build your business for tomorrow, one starting right now? To begin, it needs to be cloud-ready from Day 1. That means multiple clouds, hybrid clouds, mixed clouds, any clouds, however and whenever and wherever you find them.


IBM z16 (courtesy of IBM)

Why clouds? In case you haven’t noticed, clouds are increasingly the way, how, and where you will find the critical IT resources and capabilities and even the partners you will need. These won’t be only in your data center or some partner’s data center. You will need to pull the resources and capabilities you require from wherever they are, whenever you need them. And, hopefully, they will be right there for you, in the cloud. All you need is to be ready and able to find them when you want them.

Of course, it would be good if you took a moment right now to look around the cloud landscape, identify the clouds that look most promising to you, and start the process of connecting to them.

Just in case you missed it, digital transformation already is accelerating. In the process, it is creating new opportunities for IT, your business, you and your customers, and your customers’ customers and their partners too. You probably haven’t even noticed or considered them yet, so you better get started.

Don’t be surprised if your head is spinning. So is mine. And so are the heads of your customers and partners, along with competitors and the partners you haven’t even met or considered yet. Naturally, all of this change brings new challenges.

As these cloud-driven challenges transform you and everyone else, businesses should seek ways to leverage the potential of AI across the organization to proactively address the changes already underway as well as those they haven’t yet encountered.

You will need the AI because there will be a lot of stuff to keep up with: partners, customers, potential customers and partners. These can include everything from unplanned events, both wanted and unwanted, to increases in cyberattacks. As you can surely imagine, the impact on business resiliency will come fast and last long. Like we noted above, you’ll need AI.

And businesses will also seek greater agility to capture more and more of the value swirling around them while modernizing and protecting their hybrid cloud investments, which we hope you have already started to build and deploy as you guide the future of your business.

Maybe the scenario portrayed above caught you by surprise, or maybe it is something you are only just starting to prepare for. Don’t worry. It is not happening tomorrow, but assuredly, it will happen, probably sooner than you or I think.

That’s where the IBM z16 comes in. Innovation lies at the heart of the new IBM z16 platform. Designed with breakthrough technologies built in, the IBM z16 can help you realize the untapped, even unknown, potential of your digital transformation.

As IBM puts it, the z16 features the IBM Telum processor at its core, with industry-first on-chip integrated accelerators to predict and automate with AI at speed and scale and with extremely low latency. This industry-first quantum-safe system proactively protects against harvest-now, decrypt-later schemes on a single system that can process 25 billion secure transactions per day.
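
What does predicting “at speed and scale” mean in practice? Here is a deliberately tiny Python sketch of in-transaction fraud scoring (my conceptual illustration, not IBM’s software). The z16 pitch is that the matrix math inside such a model runs on Telum’s on-chip accelerator, inside the transaction path, instead of on a separate scoring server.

import numpy as np

# A pretrained toy logistic-regression fraud model (the weights are made up).
weights = np.array([0.8, -1.2, 2.5])
bias = -1.0

def fraud_score(features: np.ndarray) -> float:
    # On z16, this matrix math is what the Telum accelerator would speed up.
    return 1.0 / (1.0 + np.exp(-(features @ weights + bias)))

txn = np.array([0.3, 1.1, 0.9])  # illustrative amount/velocity/risk features
if fraud_score(txn) > 0.5:
    print("flag for review before committing the transaction")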

The cyber resilience at its core, IBM continues, extends to innovation in automated compliance that saves time and resources, as well as new flexible capacity options that can proactively avoid disruptions while managing workloads across different sites, even residing on different clouds, in seconds. And the platform promises to be a catalyst for digital transformation through open standards, flexible consumption models, and easy, rapid integration as part of any or all hybrid clouds.

In the process, the company adds, it can increase decision-making velocity with the industry’s first integrated on-chip AI acceleration designed for analyzing real-time transactions at scale. It also can help protect and future-proof your data with an industry-first, quantum-safe system. Similarly, it enables cyber resiliency through automated compliance and capacity shifts in seconds, which IBM says can unlock 2.5 times more value at a lower TCO than the public cloud.

Not every organization wants or needs any or all of this now. Does this sound like something your organization can benefit from at some point? At least you should start thinking about some of it. 

Alan Radding is DancingDinosaur, a veteran information technology analyst, writer, and ghostwriter. Follow DancingDinosaur on Twitter.

Bias in Tech Advertising

August 8, 2022

Are you surprised? You, like me, probably have been on the receiving end of more bias-laden ads than you ever cared to receive, and you probably sent more than a few yourself. Still, IBM appears to be serious about this issue.

IBM first announced this initiative at the Cannes Lions International Festival of Creativity 2022, which brought together agencies, brands, and other leaders to generate awareness and take action toward mitigating bias in advertising technology. The committing organizations include IBM, Delta Air Lines, WPP, Mindshare, 4A’s, IAB, and the Ad Council.

This action is the most recent effort by IBM to drive education and awareness around the impact of bias in advertising technology. In 2021, the company launched a research initiative to explore the hypothesis that bias can exist in ad technology, which initial findings subsequently confirmed. Was anyone surprised?

Toward that effort, IBM also announced the release of its gratis (I think that means free) Advertising Toolkit for AI Fairness 360, an open-source solution deploying 75 fairness metrics and 13 state-of-the-art algorithms to help identify and mitigate bias in discrete data sets.
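
AI Fairness 360 is a real open-source Python package, so you can kick the tires yourself. Here is a minimal sketch that computes two of its fairness metrics; the toy ad-click data and the column names are my assumptions for illustration.

# pip install aif360 pandas
import pandas as pd
from aif360.datasets import BinaryLabelDataset
from aif360.metrics import BinaryLabelDatasetMetric

# Synthetic ad data: 'clicked' is the outcome, 'group' the protected attribute.
df = pd.DataFrame({
    "group":   [0, 0, 0, 0, 1, 1, 1, 1],
    "clicked": [1, 0, 1, 1, 0, 0, 1, 0],
})
dataset = BinaryLabelDataset(
    df=df, label_names=["clicked"], protected_attribute_names=["group"]
)
metric = BinaryLabelDatasetMetric(
    dataset,
    unprivileged_groups=[{"group": 0}],
    privileged_groups=[{"group": 1}],
)
# Disparate impact near 1.0 suggests parity; far from 1.0 flags possible bias.
print("Disparate impact:", metric.disparate_impact())
print("Statistical parity difference:", metric.statistical_parity_difference())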

“Through WPP’s GroupM,” the announcement continued, “we developed the Data Ethics Compass to help clients navigate the challenges of using datasets, while IBM’s new Advertising Toolkit for AI Fairness 360 will help us to better understand the potential impact of bias. Consumers rightly expect brands to use their information in a fair way and for the industry to tackle data bias collectively, which can ultimately result in increased engagement and commercial outcomes.”

Bias may be unintentional, but it still matters to customers. According to Salesforce’s 2022 State of the Connected Customer survey, nearly 62 percent of consumers surveyed reported they are concerned about bias in AI, up from just 54 percent two years prior, emphasizing the imperative for brands and agencies to better understand its impacts.

“As technology and data prevalence accelerates, the risk for bias in advertising compounds. It is our duty to address this head-on,” said Adam Gerhart, Global CEO of Mindshare. “We believe the industry needs to take clear and intentional action, which is why we are committing to leverage the Advertising Toolkit for AI Fairness 360.”

Nearly $1 trillion was spent on digital advertising globally in 2021, much of which flows through programmatic engines that segment and target specific audiences, sometimes missing large consumer groups in the process. While meeting increasing consumer demands for transparency in how their data is used, marketers must also look for new ways to remain effective. And, of course, mitigate any inherent bias. It is a tough balancing act.

“As a global brand, we know that every decision we make, whether it’s about a supplier, an employee, or an ad campaign, is a reflection of our values and the change we want to see in the world,” said Emmakate Young, Delta’s Managing Director of Brand Marketing. “We’ve long been focused on inclusive representation in our campaign creative. This effort allows us to go a step further to bring more inclusive representation to our campaign delivery.” All DancingDinosaur, which, admittedly, has written its share of deceptive copy, can say is “Good luck.”

Alan Radding is DancingDinosaur, a veteran information technology analyst, writer, and ghostwriter. Follow DancingDinosaur on Twitter.

Right Architecture for Uncertain Times

July 22, 2022

Are these uncertain times? Maybe. People are unsure where, when, how, or if they should report to work, and who they will report to. If I were a typical employee, these might appear to be uncertain times. FYI, DancingDinosaur has never been an employee, so everything about work was always uncertain from Day One.

People I meet with are rapidly automating their operations with software to improve productivity by automating manual infrastructure tasks. They want to leverage any increased productivity to enable even higher-value projects. All the new IBM Power10 processor-based servers, for instance, have consistent automation and management to optimize workload deployments across hybrid clouds that are built on Red Hat OpenShift, managed with IBM Cloud Pak technology and automated with Red Hat Ansible.

Organizations can also separately purchase enterprise hybrid cloud application monitoring and observability with Instana on Power to anticipate issues, and leverage resource optimization with Turbonomic (IBM AI enabled) on Power.

IBM has also added advancements that expand and simplify its pay-as-you-go offerings, providing clients more flexible consumption choices for their infrastructure. It offers built-in cost optimization so you can take advantage of cloud economics on-prem. In addition, the Power S1022, Power S1024, and Power E1050 all have Power Private Cloud capabilities, allowing server resources to be shared in pools during spikes in demand, increasing flexibility.

Metered capacity with monthly invoicing is part of Power Private Cloud for pay-per-use consumption so the payment experience mirrors how you pay for public cloud. You maintain control of your critical applications and data wherever they reside. That’s the IBM way, right?
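
To make pay-per-use concrete, here is a toy Python sketch of how a metered monthly invoice might be computed. The formula and the rates are my assumptions for illustration only, not IBM’s actual billing terms.

def monthly_invoice(metered_core_hours: float,
                    committed_core_hours: float,
                    base_rate: float,
                    overage_rate: float) -> float:
    """Pay for a committed capacity floor, plus metered use above it."""
    overage = max(0.0, metered_core_hours - committed_core_hours)
    return committed_core_hours * base_rate + overage * overage_rate

# e.g., 1,000 committed core-hours plus a 200 core-hour month-end spike
print(monthly_invoice(1200, 1000, base_rate=0.50, overage_rate=0.65))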

The recently introduced Power10 processor-based systems are optimized to run mission-critical workloads (like core business applications and databases) and maximize the efficiency of containerized and cloud-native applications. An ecosystem with Red Hat OpenShift also enables incremental innovation. Meanwhile, cloud-native development is available with applications on AIX, IBM i, and Linux to run any of the organization’s other activities.

For this, each Power10 processor core has four Matrix Math Accelerators designed to improve performance of AI models and lower latency by running inferencing on the same server, so data can be accessed and analyzed faster.

Cybersecurity invariably comes up early. One hallmark of the new Power10 processor-based systems is platform integrity from the processor to the cloud. The new Power S1014, Power S1022, Power S1024 and Power E1050 platforms support transparent memory encryption, enhanced isolation and Trusted Boot to help prevent emerging side-channel attacks from hackers.

They also were designed to enhance security across hybrid cloud environments without impacting performance of business-critical applications. To that end, they include security standards intended to support cryptography advancements, such as quantum-safe cryptography and fully homomorphic encryption, to help protect today’s data from both expected and unexpected bad actors.

According to IBM the market reception of Power10 has been good. The company feels these are the right servers for these uncertain times with a broad family of options.

Alan Radding is DancingDinosaur, a veteran information technology analyst, writer, and ghostwriter. Follow DancingDinosaur on Twitter.

Wazi as a Service for z/OS

July 7, 2022

IBM’s Wazi-as-a-Service (WaaS), covered by Michael Cooney at the end of June, brings hybrid-cloud app services to z/OS mainframes. The company introduced Wazi-as-a-Service for its mainframe customers to create a cloud environment for developing and testing applications.

As such, Wazi-as-a-Service can be used to create z/OS infrastructure instances for developing and testing z/OS application components in a virtualized, containerized sandbox. The instances run on Red Hat OpenShift on x86 hardware. The service also includes access to z/OS systems and integrates with modern source-code management platforms such as GitHub and GitLab.

However, Wazi-as-a-Service cannot be used for refactoring or re-engineering existing z/OS based applications with the intention to re-platform them, and cannot be used for production workloads, IBM said. Although that does sound tempting to me and probably to many DancingDinosaur readers.

The service does support self-provisioning of z/OS systems in virtual server instances (VSIs) in the IBM Cloud Virtual Private Cloud (VPC) that can be used for development and testing. It provides the ability to manage compute, storage, and networking resources, so maybe it is not as useless as it might seem at first.

The service can spin up z/OS virtual servers in six minutes or less, Alan Peacock, general manager of IBM Cloud Delivery & Operations, wrote in a blog. “You can either use a pre-installed stock image or extract components from an on-prem system and deploy a custom image onto the virtual server using IBM Wazi Image Builder in IBM Cloud’s Virtual Private Cloud environment,” he wrote. “This environment is a logically isolated, highly secured private space running in the IBM Cloud, thereby eliminating the wait times involved in getting access to resources.”
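
For the curious, self-provisioning a VSI happens through the IBM Cloud VPC REST API. Here is a hedged Python sketch of what the call might look like; the request shape follows IBM’s published VPC API, but the region, zone, profile name, and all of the IDs below are placeholder assumptions you would replace with values from your own account.

import requests

IAM_TOKEN = "<bearer token from IBM Cloud IAM>"
resp = requests.post(
    "https://us-east.iaas.cloud.ibm.com/v1/instances",
    params={"version": "2022-07-01", "generation": "2"},
    headers={"Authorization": f"Bearer {IAM_TOKEN}"},
    json={
        "name": "wazi-dev-test-01",
        "zone": {"name": "us-east-1"},
        "profile": {"name": "mz2-2x16"},            # assumed z/OS VSI profile name
        "image": {"id": "<z/OS stock image id>"},   # or a custom Wazi Image Builder image
        "vpc": {"id": "<vpc id>"},
        "primary_network_interface": {"subnet": {"id": "<subnet id>"}},
        "keys": [{"id": "<ssh key id>"}],
    },
    timeout=30,
)
print(resp.status_code, resp.json().get("id"))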

Isolated development environments combined with DevSecOps testing tools enable developers to start testing at a much earlier stage of the development lifecycle, Peacock stated. “The virtual server running on real zSystems hardware provides up to 15x better performance than comparable x86 offerings. This virtual server enables you to accelerate software release cycles and improve software quality.”

The service is not generally available, and use of it requires customers to be allow-listed by IBM sales. Wazi-as-a-Service is part of a recent package of offerings from IBM to better integrate Big Iron into the cloud. The company recently rolled out the IBM Z and Cloud Modernization Stack 2022.1.1 service, which offers industry-standard tools to modernize z/OS applications on a pay-per-use basis. The service supports z/OS Connect, which uses a JavaScript Object Notation (JSON) interface to tie into and link with existing applications to make Z applications and data part of a hybrid-cloud strategy.
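
Because z/OS Connect exposes mainframe assets as plain JSON over REST, calling them from a cloud application looks like calling any other web API. A hedged Python sketch follows; the host, path, and credentials are hypothetical, for illustration only.

import requests

resp = requests.get(
    "https://zosconnect.example.com:9443/accounts/12345/balance",  # hypothetical API
    headers={"Accept": "application/json"},
    auth=("apiuser", "secret"),  # real deployments typically use TLS plus SAF or JWT
    timeout=10,
)
resp.raise_for_status()
print(resp.json())  # JSON mapped from the backend CICS/IMS/Db2 program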

Multi-cloud solutions provide opportunities for IT organizations to drive value – but they must be set up and managed closely to deliver speed, flexibility, cost, and operational efficiency.

WAZI-as-a-service also supports z/OS Cloud Broker, which integrates z/OS-based services and resources into Red Hat OpenShift to create, modernize, deploy, and manage applications, data, and infrastructure. If you can deal with its various constraints, it promises to be a very useful tool for active Z shops.

Alan Radding is DancingDinosaur, a veteran information technology analyst, writer, and ghostwriter. Follow DancingDinosaur on Twitter.

Only Path to Progress?

June 29, 2022

AI, cloud, and quantum computing are revolutionary in their own right, which is why DancingDinosaur continues to look at them ever more closely. But where they converge, IBM believes, lies the potential for a change in computing that could surpass anything before.

Sounds pretty outlandish to me, but IBM insists you could uncover solutions to complex problems. The company calls it the future of solving problems. Even after watching its video, I’m not convinced.

Pic credit: tomwieden/Pixabay

OK, sure it is a tad overdramatic, even hokey, but let’s give IBM, for a moment, enough benefit of the doubt to look at their pitch. Over the years I have used their technologies and have not had any serious complaints yet. (OK, maybe about price and packaging but not about the technology performing as promised.)

The company reports it is working on new types of chips designed specifically for artificial intelligence, quantum computing, and next-generation systems, exactly what we would expect from them. It continues: “We’re also working on software to power those devices and to make it simpler for enterprises to tackle those problems like never before.” OK, so far no quibbles with that.

Yes, AI, cloud, and quantum computing are revolutionary in their own right, but where they converge, the company sees the potential for a step change in computing that could surpass anything the industry has seen before. When taken together, IBM expects them to exponentially alter the speed and scale at which organizations can uncover solutions to complex problems.

It calls this phenomenon accelerated discovery and believes it’s the future of solving problems.

IBM continues: “We’re working with partners around the world to speed up the time it takes to discover new materials; we’re enabling rapid breakthroughs in code migration, chemistry, healthcare, and automation — to name a few areas. We’re working with academia, governments, and industry partners to create discovery accelerators tasked with finding new ways to practically tackle specific problems.”

One of the first accelerators the company launched was in partnership with the Cleveland Clinic, where they advanced pathogen research (while) fostering the next generation of tech workers for healthcare. In another, it worked with STFC Hartree, a division of the UK’s science council, and Unilever, to develop a new molecule that can help the skin boost its natural defense against germs. Nothing to quibble about here.

“We believe that we’ve entered a new era in computing,” IBM says. At least they are targeting the right areas and the right technologies. Does that constitute a new era? Or is it a continuation of the current era? Technology constantly evolves and advances, and we don’t want it to stop, not now, not anytime soon, probably not ever. So when does it become a new era, and how will we know?

From what I’ve seen and read from IBM, it looks like the company is doing good things in all the hot, in-demand upcoming areas. I hope other companies are directing their research budgets to address similar problems in those areas too and come up with better solutions at better prices. Isn’t that what free market competition should be all about?

Alan Radding is DancingDinosaur, a veteran information technology analyst, writer, and ghostwriter. Follow DancingDinosaur on Twitter.

Quantum Secured Cryptography

June 15, 2022

In 2017 IBM declared that its latest mainframe included constant encryption protection. Not sure if DancingDinosaur covered it 5 years ago. From the initial announcement it sounded pretty good. If I didn’t cover it then, let’s cover it now.

IBM still periodically promotes continuous or pervasive encryption with the Z, and there have been plenty of opportunities for mainframe shops to upgrade over the intervening years. Most recently, the z16 has gained considerable attention in that regard.

The z16 (courtesy of IBM)

Since then the Z mainframe technology has evolved dramatically by embracing Linux, open source, container-driven development, and new tools and technologies. Still, securing data has remained a constant challenge.

“The vast majority of stolen or leaked data today is in the open and easy to use (and steal) because encryption has been very difficult and expensive to do at scale,” said Ross Mauri, general manager for the IBM Z, adding “We created a data protection engine for the cloud era to have a significant and immediate impact on global data security.”

Data security remains a serious, ongoing challenge for virtually all enterprises, and the widespread adoption of cloud and mobile technologies has only added to the data security risks. IBM used this product release to underscore a “global epidemic” behind 9 billion data records lost or stolen since 2013.

The cure for this epidemic, IBM believes, is “pervasive encryption.” And yet Big Blue — and many others — acknowledge that encryption is often sparsely applied in corporate and cloud datacenters, because encryption products for x86 environments have tended to degrade performance. And their complexity makes them a pain to manage and expensive to implement.

IBM developed its new system over a three-year period with input from 150 customers, all with data breaches and encryption at the top of their lists of concerns. The resulting IBM Z pervasive encryption capability reflects its call to action on data protection as articulated by Chief Information Security Officers and data security experts worldwide, it added.

The pervasive encryption is built in but designed to extend beyond any new Z, which “really makes this the first system with an all-encompassing solution to the security threats and breaches we’ve been witnessing in the past 24 months,” said Peter Rutten, analyst at IDC’s Servers and Compute Platforms Group.

IBM Z is designed to encrypt data associated with an entire application, cloud service, or database in flight or at rest with one click. This kind of “bulk encryption” is made possible by a 7x increase in cryptographic performance over the previous generation z13, driven by a 4x increase in silicon dedicated to cryptographic algorithms, according to IBM.
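
For readers who want to picture what bulk encryption means at the application level, here is a conceptual Python sketch using authenticated AES-GCM from the cryptography package. To be clear, this is my illustration of encrypting a whole data set in one operation, not IBM’s implementation; IBM’s point is that the equivalent work runs in dedicated silicon without the x86-style performance penalty described above.

import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)  # in practice, held in an HSM/key manager
aesgcm = AESGCM(key)

dataset = b"entire application data set..."  # whole file or dump, not field by field
nonce = os.urandom(12)
ciphertext = aesgcm.encrypt(nonce, dataset, None)

# Authenticated encryption: decryption fails loudly if the ciphertext was tampered with.
assert aesgcm.decrypt(nonce, ciphertext, None) == dataset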

The system also comes with tamper-responding encryption keys. A favorite target of hackers, encryption keys are routinely exposed in memory as they’re used. IBM Z’s key management system includes hardware that causes keys to be invalidated at any sign of intrusion, and can then be restored in safety.

Another capability included is encrypted APIs. IBM z/OS Connect technologies are designed to make it easy for cloud developers to discover and call any IBM Z application or data from a cloud service, or for IBM Z developers to call any cloud service, the company explained. IBM Z allows organizations to encrypt these too.

The IBM Z system can also give companies a means of complying with emerging standards, such as the EU’s General Data Protection Regulation (GDPR), which went into effect recently, or the requirements of the Federal Financial Institutions Examination Council (FFIEC), Singapore and Hong Kong’s similar guidances, and the New York State Department of Financial Services’ newly published Cybersecurity Requirements for Financial Services Companies.

Finally, the company also announced that IBM Z will be providing an encryption engine for IBM cloud services and run IBM Blockchain services “to provide the highest commercially available levels of cryptographic hardware.” The company also announced new blockchain services in centers in Dallas, London, Frankfurt, Sao Paulo, Tokyo and Toronto.

Will all that make you sleep a bit better at night? It should.

Alan Radding is DancingDinosaur, a veteran information technology analyst, writer, and ghostwriter. Follow DancingDinosaur on Twitter.

Protect the Mainframe from Quantum Threats

May 13, 2022

For decades, most large-scale companies have used mainframes to host and run the software applications that make up their core systems. Often these mainframe computers and their applications are inherited from mergers and acquisitions, or from deferred IT investments.

Today, it is estimated that more than half of core business processes may still run on a mainframe system. But maintaining and relying on these now antiquated applications poses cost and, soon, quantum security risks.

IBM’s z16 with built-in inference processing

Organizations are torn between the need to manage costs while maximizing the value of their mainframe. This leads them to ask, “If our system is not broken, why should we invest to fix it?” In DancingDinosaur’s experience covering the mainframe for decades, the mainframe is not actually broken. The real problem is a failure to modernize an increasingly problematic system. And the problem only grows as quantum computing gains traction.

IBM, by the way, does not actively discourage such thinking. It always is eager to tout the latest mainframe with the latest bells and whistles. Today that is the z16, introduced in April and covered here, as an inference processing workhorse. Now the trick is to determine when you actually need that kind of processing and whether it can take on the quantum threats you may face.

Quantum algorithms running on sufficiently powerful quantum computers have the potential to weaken or break the core cryptographic primitives that we currently use to secure systems and communications. The fact that these primitives can be broken leaves the foundation for global digital security at risk, notes IBM. Temporary solutions like increasing RSA or ECC key sizes will only buy a little time — like extra months, not extra years.
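
A back-of-the-envelope way to see why bigger keys buy months rather than years (my sketch of the standard argument, not anything from IBM’s materials): the best known classical factoring attack is sub-exponential in the key length, while Shor’s quantum algorithm is polynomial.

L(N) = \exp\!\left( \left(\tfrac{64}{9}\right)^{1/3} (\ln N)^{1/3} (\ln\ln N)^{2/3} \right) \quad \text{(classical general number field sieve)}

T_{\text{Shor}}(N) = O\!\left( (\log N)^3 \right) \quad \text{(Shor's algorithm on a quantum computer)}

Because Shor’s cost is polynomial, doubling the RSA modulus multiplies the quantum attacker’s work by only about 2^3 = 8, while the classical attacker’s work explodes. Bigger keys barely slow a quantum adversary down, hence months of margin, not years.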

Fortunately, IBM is extremely active in the latest security work addressing quantum computing, which it refers to as the next technology revolution. Whether a revolution or not, a sufficiently powerful quantum computer, when available, will invariably give rise to new security challenges that bad guys can exploit. There are many exciting applications in industries including pharmaceuticals, finance, and manufacturing, but those industries also need to be thinking about quantum security.

Organizations and standards bodies already have started taking action to address the threat. The National Institute of Standards and Technology (NIST) initiated a process to solicit, evaluate and standardize new public-key cryptographic algorithms that can resist threats posed by both the classical computers we have today and quantum computers that will be available soon. 

NIST plans to select a small number of new quantum-safe algorithms this year and have new quantum-safe standards in place by 2024. IBM researchers have also been involved in the development of quantum-safe cryptographic algorithms based on lattice cryptography, which are in the final round of consideration.

Unfortunately, we have only a little time to implement quantum-safe solutions before large-scale quantum computers that can break today’s security arrive. That’s not much time. We don’t know when a large-scale quantum computer capable of breaking public key cryptographic algorithms will be available, but experts predict that this could be possible before the end of the decade.

And sensitive data with a long lifespan is already vulnerable to harvest-now-decrypt-later attacks: hackers can capture encrypted data today and store it until they can decrypt it using a quantum computer.

This wouldn’t be DancingDinosaur if it didn’t note that IBM is already thinking about, planning for, and preparing new quantum-safe cryptographic technology. IBM boasts the z16 as the industry’s first quantum-safe system, protected by quantum-safe technologies across multiple layers of firmware to protect business-critical infrastructure and data from quantum attacks. And it won’t be the last, for sure. In the meantime, stay tuned and keep your fingers crossed.

Alan Radding is DancingDinosaur, a veteran information technology analyst, writer, and ghostwriter. Follow DancingDinosaur on Twitter.

Cloud or Mainframe?

May 3, 2022

Surprise, surprise. IBM’s answer is both. 

To respond to the ongoing pressures of the global pandemic and more, IBM observed that businesses around the world have turbo-charged their digital transformations.  

At least some of these priorities involve companies looking to take advantage of cloud computing. Strategically, they also have concerns about optionality, or lock-in. These realities explain why so few clients have made a wholesale move to the cloud, and many may never.

The unique needs each company faces in their business transformation journey require a diverse mix of applications and environments, including traditional data centers, edge computing, and SaaS. That raises the question: what is the role of the mainframe in today’s IT infrastructure?

According to a recent IBM study, the vast majority (a whopping 71%) of IT executives surveyed from major corporations say critical mainframe-based applications not only have a place in their IT platforms today but are central to their business strategy. And in three years, the percentage of organizations leveraging mainframe assets in a hybrid cloud environment is expected to more than double.

Why? Four of five executives say their organizations need to rapidly transform to keep up with competition, which includes modernizing mainframe-based apps and adopting a more open approach to cloud migration.

A hybrid cloud approach that includes and integrates mainframe computing can drive up to five times the value of a public cloud platform alone. The main sources of that value fall into five categories: 1) increased business acceleration, 2) developer productivity, 3) infrastructure efficiency, 4) risk and compliance management, and 5) long-term flexibility. 

With the billions businesses have invested over the years in business-critical mainframe applications like financial management, customer data, and transaction processing, this strategy holds true both for mainframe customers and for clients of IBM’s global consulting practice. Mainframe customers’ primary goal is to modernize their existing investments and minimize risk while delivering hybrid cloud innovation when they are ready to make that move.

IBM aims to guide its cloud migration clients on their application modernization journey with these recommendations:

Adopt an iterative approach. Many enterprises are experiencing firsthand the complexity of their IT environments. Adding vertical cloud silos undercuts flexibility by making processes related to development, operations, and security even more fragmented than before, in effect making it nearly impossible to achieve the standardization and scale that cloud promises to deliver.

Your plan to integrate new and existing environments must factor in your industry and workload attributes to co-create a business case and road map designed to meet your strategic goals. Adopt an incremental and adaptive approach to modernization as compared to a big bang approach. Leverage techniques such as coexistence architecture to gradually make the transition to the integrated hybrid architecture.

Then, assess your portfolio and build your roadmap. To understand your desired future state, assess your current state. Examine the capabilities that define the role of the mainframe in your enterprise today and how those capabilities tie into your hybrid cloud technology. BTW, the mainframe is an ideal partner for hosting clouds. Finally, take stock of your existing talent and resources and determine what changes are needed.

IBM, don’t be surprised, also suggests the new IBM z16 can perform many of the critical functions underpinning an open and secure hybrid cloud environment while closing some gaps. This includes accessing storage of unstructured on-premises data across a hybrid cloud platform, scaling and automating data-driven insights with AI, and being sufficiently agile to process critical apps and data in real time, all the while assessing security risks.

Storing data across multiple clouds and moving it between partners and third parties can leave companies more vulnerable to security issues such as data breaches. Just remember to assess infrastructure solutions that support the ability to protect data, especially when it leaves your platform.

Then leverage multiple modernization strategies and enable easy access to existing mainframe applications and data by using APIs. This means providing a common developer experience by integrating open-source tools and a streamlined process for agility, in addition to developing cloud-native applications on the mainframe and containerizing those applications.

IT executives expect significant usage increases in both mainframe (35%) and cloud-based applications (44%) over the next two years. So consider how you can extract more value from both your mainframe and cloud investments. Blending mainframe power into the cloud landscape helps achieve the enterprise-wide agility and capability required to keep pace with changing business needs.

Alan Radding is DancingDinosaur, a veteran information technology analyst, writer, and ghostwriter. Follow DancingDinosaur on Twitter, @mainframeblog.