IBM Cloud Pak–Back to the Future

December 19, 2019

It had seemed that IBM was in a big rush to get everybody to cloud and hybrid cloud. But a recent message from the company suggests maybe not such a rush.

What that means is the company believes coexistence will involve new and existing applications working together for some time to come. New features may be added to existing applications at any point. Eventually a microservices architecture should be exposed to both new and existing applications. Whew, this is not something you should feel compelled to do today or next quarter or in five years, maybe not even in 10 years.

Here is more from the company earlier this month. Introducing its latest Cloud Paks as enterprise-ready cloud software, the company presents them as containerized software packaged with open source components, pre-integrated with common operational services, and running on a secure-by-design container platform with operational services consisting of logging, monitoring, security, and identity access management. DancingDinosaur tried to keep up for a couple of decades but in recent years has given up. Thankfully, no one is counting on me to deliver the latest code fast.

IBM has been promoting packaged software  and hardware for as long as this reporter has been following the company, which was when my adult married daughters were infants. (I could speed them off to sleep by reading them the latest IBM white paper I had just written for IBM or other tech giants. Don’t know if they retained or even appreciated any of that early client/server stuff but they did fall asleep, which was my intent.)

Essentially IBM is offering Cloud Paks as enterprise-ready bundles, already packaged and integrated with hardware and software, ready to deploy. It worked back then and I suspect it will work now with the latest containerized systems, because systems are more complex than ever before, not less by a long shot. Unless you have continuously retained and retrained your best people while continually refreshing your toolset, you'll find it hard to keep up. You will need pre-integrated, packaged, containerized cloud packages that work right out of the box.

This is more than just selling you a pre-integrated bundle. This is back to the future; I mean way back. Called Cloud Pak for Data System, IBM is offering what it describes as a fusion of hardware and software. The company chooses the right storage and hardware, all purpose-built by IBM in one system. That amounts to a convergence of storage, network, software, and data in a single system, all taken care of by IBM and deployed as containers and microservices. As I noted above, a deep trip back to the future.

IBM has dubbed it Cloud-in-a-box. In short, this is an appliance. You can start very small, paying for what you use now. If later you want more, just expand it then. Am sure your IBM sales rep will be more than happy to provide you with the details. It appears from the briefing that there is an actual base configuration consisting of two enclosures with 32 or 128 TB. The company promises to install this and get you up and running in four hours, leaving only the final provisioning for you.

This works for existing mainframe shops too, at least those running Linux on the mainframe. LinuxONE shops are probably ideal. It appears all z shops will need is DB2 and maybe Netezza. Much of the work will be done off the mainframe, so at least you should save some MIPS.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at http://technologywriter.com/ 

This is the last appearance of DancingDinosaur this year. It will reappear in the week of Jan. 6, 2020. Best wishes for the holidays.

Syncsort Acquires Pitney Bowes Software & Data

December 10, 2019

It is easy to forget that there are other ISVs  who work with the z. A recent list of z ISVs ran to over a dozen, including Rocket Software, Compuware, GT Software, and Syncsort, among others.  

Syncsort has grabbed some attention of late by announcing the completion of an agreement with Pitney Bowes, the postal metering company, to take over its software and data operations. As a result, Syncsort claims a position as one of the leading data management software companies in the world, serving more than 11,000 primarily z customers.

The combined portfolio brings together capabilities in location intelligence, data enrichment, customer information management, and engagement solutions with powerful data integration and optimization software. About the only thing they haven’t listed is AI.

Over the coming months, teams will be working to combine the Syncsort-Pitney Bowes organizations and portfolios. While there may be some changes within the Syncsort organization, not much will change for its customers immediately. They can still expect to receive the same level of service they have received to support their everyday needs.

Syncsort’s acquisition of the Pitney Bowes software and data business creates a data management software company with more than 11,000 enterprise customers, $600 million in revenue, and 2,000 employees worldwide. Although modest in comparison with today’s Internet tech giants and even IBM, the resulting company brings sufficient scale, agility, and breadth of portfolio to enable leading enterprises to gain a competitive advantage from their data, Syncsort noted in its announcement.

“Enterprises everywhere are striving to increase their competitiveness through the strategic use of data…”  As a result, “organizations must invest in next-generation technologies like cloud, streaming, and machine learning, while simultaneously leveraging and modernizing decades of investment in traditional data infrastructure,” said Josh Rogers, CEO, Syncsort. Now “our increased scale allows us to expand the scope of partnerships with customers so that they can maximize the value of all their data,” he added.

According to Paige Bartley of 451 Research accompanying Syncsort’s announcement:  “The ability to derive actionable human intelligence from data requires ensuring that it has been integrated from all relevant sources, is representative and high quality, and has been enriched with additional context and information. Syncsort, as a longtime player in the data management space, is further addressing these issues with the acquisition of Pitney Bowes Software Solutions’ assets – technology that complements existing data-quality capabilities to provide additional context and enrichment for data, as well as leverage customer data and preferences to drive business outcomes.” 

These end-to-end capabilities, Syncsort adds, will empower organizations to overcome ever-increasing challenges around the integrity of their data so their IT and business operations can easily integrate, enrich, and improve data assets to maximize insights.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at http://technologywriter.com/ 

IBM Suggests Astounding Productivity with Cloud Pak for Automation

November 25, 2019

DancingDinosaur thought IBM would not introduce another Cloud Pak until after the holidays, but I was wrong. Last week IBM launched Cloud Pak for Security. According to IBM it helps an organization uncover threats, make more informed risk-based decisions, and prioritize its team's time.

More specifically, it connects the organization's existing data sources to generate deeper insights. In the process you can access IBM and third-party tools to search for threats across any cloud or on-premises location, then quickly orchestrate actions and responses to those threats, all while leaving your data where it is.
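
IBM hasn't published the plumbing behind that federated search, but the pattern itself is simple enough to sketch. Below is a minimal, hypothetical illustration in Python; every class and field name is invented, and a real product would add authentication, schema mapping, and much more. The point is the shape: each connector queries a source where the data lives and returns only the hits, so the raw data never moves.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical connectors: each one knows how to query a single data
# source (a cloud SIEM, an on-prem log store, etc.) in place.
class Connector:
    def __init__(self, name, records):
        self.name = name
        self.records = records  # stand-in for a remote data source

    def search(self, indicator):
        # Only matching events leave the source; raw data stays put.
        return [{"source": self.name, "event": r}
                for r in self.records if indicator in r]

def federated_search(connectors, indicator):
    # Fan the query out to every source in parallel, then merge the hits.
    with ThreadPoolExecutor() as pool:
        results = pool.map(lambda c: c.search(indicator), connectors)
    return [hit for hits in results for hit in hits]

if __name__ == "__main__":
    sources = [
        Connector("aws-logs", ["login failed 10.0.0.5", "login ok 10.0.0.9"]),
        Connector("onprem-siem", ["malware beacon 10.0.0.5"]),
    ]
    for hit in federated_search(sources, "10.0.0.5"):
        print(hit)
```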

DancingDinosaur's only disappointment in IBM's new security Cloud Pak, as with the other IBM Cloud Paks, is that it runs only on Linux. That means it doesn't run RACF, the legendary IBM access control tool for z/OS. IBM's Cloud Paks reportedly run on z systems, but only those running Linux. Not sure how IBM can finesse this particular issue.

Of the five original IBM Cloud Paks (application, data, integration, multicloud management, and automation) only one offers the kind of payback that will wow top c-level execs: automation. Find Cloud Pak for Automation here.

To date, IBM reports  over 5000 customers have used IBM Digital Business Automation to run their digital business. At the same time, IBM claims successful digitization has increased organizational scale and fueled growth of knowledge work.

McKinsey & Company notes that such workers spend up to 28 hours each week on low value work. IBM’s goal with digital business automation is to bring digital scale to knowledge work and free these workers to work on high value tasks.

Such high-value tasks include collaborating and using creativity to come up with new ideas, meeting and building relationships with clients, or resolving issues and exceptions. By automating the low-value work instead, the payoff, says IBM, can be staggering.

“We can reclaim 120 billion hours a year spent by knowledge workers on low value work by using intelligent automation,” declares IBM. So what value can you reclaim over the course of the year for your operation with, say, 100 knowledge workers earning maybe $22 per hour, or maybe 1,000 workers earning $35/hr? You can do the math.
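
Here is that math, worked out under the post's own hypothetical numbers. McKinsey's up-to-28-hours figure is a ceiling, so assume automation reclaims only five of those hours a week:

```python
# Back-of-envelope value of reclaimed knowledge-worker time.
# Inputs mirror the hypothetical figures in the text above.
def reclaimed_value(workers, hourly_rate, hours_reclaimed_per_week,
                    weeks_per_year=48):
    return workers * hourly_rate * hours_reclaimed_per_week * weeks_per_year

# Suppose automation reclaims just 5 of the up-to-28 low-value hours a week.
print(reclaimed_value(100, 22, 5))    # 100 workers at $22/hr  -> $528,000/yr
print(reclaimed_value(1000, 35, 5))   # 1000 workers at $35/hr -> $8,400,000/yr
```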

As you would expect, automation is the critical component of this particular Cloud Pak. The main targets for enhancement or assistance among the rather broad category of knowledge workers are administrative/departmental work and expert work, which includes cross-enterprise work. IBM offers vendor management as one example.

The goal is to digitize core services by automating at scale and building low-code/no-code apps for your knowledge workers. Digital workers, as IBM calls them, are key to this plan; the company wants to use them to free people for higher value work. IBM's example of such an expert worker is a loan officer.

Central to IBM’s Cloud Pak for Automation is what IBM calls its Intelligent Automation Platform. Some of this is here now, according to the company, with more coming in the future. Here now is the ability to create apps using low code tooling, reuse assets from business automation workflow, and create new UI assets.

Coming up in some unspecified timeframe is the ability to enable digital workers to automate job roles, to define and create content services that enable intelligent capture and extraction, and finally to envision and create decision services that offload and automate routine decisions.

Are your current and would-be knowledge workers ready to contribute or participate in this scheme? Maybe for some; it depends for others. To capture those billions of hours of increased productivity, however, they will have to step up to it. But you can be pretty sure IBM will do it for you if you ask.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at http://technologywriter.com/ 

IBM Cloud Pak Rollouts Continue

November 14, 2019

IBM Cloud Paks have emerged as a key strategy by the company to grow not just its cloud, but more importantly, its hybrid cloud business. For the past year or so, IBM shifted its emphasis from clouds to hybrid clouds. No doubt this is driven by its realization that its enterprise clients are adopting multiple clouds, necessitating the hybrid cloud.

The company is counting on success in hybrid clouds. For years IBM has scrambled to claw out a place for itself among the top cloud players, but in the time DancingDinosaur has tracked IBM's cloud presence it has never risen higher than third. In 2019 the top cloud providers are AWS, Microsoft, Google, IBM, Oracle, and Alibaba, with IBM slipping to fourth in one analyst's ranking.

Hybrid clouds, over time, can change the dynamics of the market. They have not, however, changed things much according to a ranking from Datamation. “There are too many variables to strictly rank hybrid cloud providers,” notes Datamation. With that said, Datamation still ranked them, starting with Amazon Web Services (AWS), which remains the unquestioned leader of the business with twice the market share of its next leading competitor, Microsoft Azure, followed by IBM. The company is counting on its Red Hat acquisition, which includes OpenShift along with Enterprise Linux, to alter its market standing.

The hybrid cloud segment certainly encompasses a wider range of customer needs, so there are ways IBM can work Red Hat to give it some advantages in pricing and packaging, which it has already signaled it can and will do, starting with OpenShift. DancingDinosaur doubts it will overtake AWS outright, but as noted above, hybrid clouds are a different beast. So don’t rule out IBM in the hybrid cloud market.

Another thing that may give IBM an edge in hybrid clouds among its enterprise customers is its Cloud Paks. As IBM describes them, Cloud Paks are enterprise-ready, containerized software that give organizations an open, faster, and more secure way to move core business applications to any cloud. Each IBM Cloud Pak runs on Red Hat OpenShift, IBM Cloud, and Red Hat Enterprise Linux.

Each pak includes containerized IBM middleware and common software services for development and management. Also included is a common integration layer designed to reduce development time by up to 84 percent and operational expenses by up to 75 percent, according to IBM.

Cloud Paks, IBM continues, enable you to easily deploy modern enterprise software on-premises, in the cloud, or with pre-integrated systems, and quickly bring workloads to production by leveraging Kubernetes as the container management framework, supporting production-level qualities of service and end-to-end lifecycle management. This gives organizations an open, faster, more secure way to move core business applications to any cloud.
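
IBM doesn't spell out what leveraging Kubernetes looks like at the API level, but the mechanics are standard. The sketch below uses the official Kubernetes Python client to create a Deployment; the image name and labels are placeholders, and a real Cloud Pak layers its own certified containers and operators on top of this plumbing.

```python
from kubernetes import client, config

def deploy(name, image, replicas=2, namespace="default"):
    # Load credentials from the local kubeconfig (an OpenShift cluster
    # works the same way, since OpenShift is Kubernetes underneath).
    config.load_kube_config()

    container = client.V1Container(
        name=name, image=image,
        ports=[client.V1ContainerPort(container_port=8080)])
    template = client.V1PodTemplateSpec(
        metadata=client.V1ObjectMeta(labels={"app": name}),
        spec=client.V1PodSpec(containers=[container]))
    spec = client.V1DeploymentSpec(
        replicas=replicas,
        selector=client.V1LabelSelector(match_labels={"app": name}),
        template=template)
    deployment = client.V1Deployment(
        api_version="apps/v1", kind="Deployment",
        metadata=client.V1ObjectMeta(name=name), spec=spec)

    client.AppsV1Api().create_namespaced_deployment(
        namespace=namespace, body=deployment)

if __name__ == "__main__":
    # Placeholder image; requires a reachable cluster in your kubeconfig.
    deploy("demo-app", "registry.example.com/demo-app:1.0")
```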

When IBM introduced Cloud Paks a few weeks ago, it planned a suite of five:

  • Application
  • Data
  • Integration
  • Automation
  • Multicloud management

Don’t be surprised as hybrid cloud usage evolves if even more Cloud Paks eventually appear. It becomes an opportunity for IBM to bundle together more of its existing tools and products and send them to the cloud too.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at http://technologywriter.com/ 

Has Google Achieved Quantum Supremacy?

October 31, 2019

Google said they did last week. “If true, it is big news,” writes Chelsea Whyte in New Scientist. Quantum computers have the potential to change the way organizations design new materials, work out logistics, build artificial intelligence, and break encryption. That is why firms like Google, Intel and IBM – along with plenty of start-ups – have been racing to reach this crucial milestone, Whyte continued.

Or maybe Google hasn't. It claims its 53-qubit computer performed, in 200 seconds, an arcane task that would take 10,000 years for Summit, currently the world's fastest supercomputer, which IBM built for the Department of Energy.

As IBM puts it, writes Adrian Cho in Science Magazine: On 21 October, IBM announced that, by tweaking the way Summit approaches the task, it can do it far faster: in 2.5 days. Therefore, notes IBM, the threshold for quantum supremacy, doing something a classical computer cannot, has still not been met.

Somehow, when the industry reaches the point of quantum supremacy, DancingDinosaur suspects, it won't be with a 53-qubit device. That's easily within range of the biggest quantum computers already available. The problem Google solved involved random numbers; specifically it tackled a random sampling problem, that is, checking that a set of numbers has a truly random distribution. Reportedly, this is very difficult for a traditional computer when there are a lot of numbers involved. Maybe I'm still a Z big iron bigot, but breaking this threshold should take hundreds if not thousands of qubits, according to published pieces.
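
For the curious, the published way to check such samples is not a classic randomness test but cross-entropy benchmarking: you verify the device's bitstrings are biased toward the outcomes the ideal circuit says are most probable. Here is a toy version of the linear cross-entropy fidelity estimate, assuming you already have the ideal output probabilities (computing those is the part that costs a supercomputer):

```python
# Toy linear cross-entropy benchmarking (XEB) fidelity estimate:
#   F ~ 2^n * mean(p_ideal(sampled bitstring)) - 1
# F near 1: samples track the ideal distribution; near 0: pure noise.
def xeb_fidelity(n_qubits, ideal_probs, samples):
    d = 2 ** n_qubits
    mean_p = sum(ideal_probs[s] for s in samples) / len(samples)
    return d * mean_p - 1

# Tiny 2-qubit illustration with made-up ideal probabilities.
ideal = {"00": 0.50, "01": 0.10, "10": 0.15, "11": 0.25}
good = ["00", "00", "11", "00", "10", "00", "11", "00"]
print(xeb_fidelity(2, ideal, good))  # well above 0: device tracks the ideal
```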

But this raises an interesting point. If IBM can tweak its best supercomputer to reduce a process that would take years to 2.5 days, why didn’t they do it earlier? And how many other such inefficiencies are lying around that they could streamline right now?

The race for quantum supremacy may not equate with effective quantum computing or even competitively priced systems. Except for a couple of newcomers, nobody is talking about price. IBM offered some free time on its smallest qubit machines in the cloud for those who joined its quantum program, but for how long? With Google, Intel, HP, and IBM, along with a handful of newcomers like D-Wave and Rigetti and any startups that pop up, the race to quantum supremacy surely will be competitive, but who knows what the cost will be.

DancingDinosaur's guess is that, given the players currently involved, the quantum computing market will look a lot like today's enterprise computing market minus the startups. As for pricing, beyond IBM's promotional offer of free time on one of its cloud-based quantum machines, these systems can't be cheap when they finally become available.

Just the cooling required to keep a quantum machine stable will be costly.  IBM is taking the right approach for now; put some machines in the cloud where it can provide and manage all the infrastructure required to support the machines. Don’t expect to ever buy one at Best Buy and plug it into your data center.

And then there is the talent problem.  Skilled quantum programmers and technicians probably don’t post their availability on Indeed. You’ll need a few PhDs in mathematics or physics, at the least, to get you started. Suggest you get in line at MIT or Stanford.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at http://technologywriter.com/ 

IBM Introduces Cloud Paks

October 23, 2019

IBM is counting on success in the cloud, especially hybrid clouds, but only about 20 percent of enterprise workloads have moved to the cloud yet. The problem is an old familiar one: moving applications, especially vital enterprise applications to a new platform, any platform, is difficult and, as a result, risky. 

IBM Multicloud Manager Dashboard

When the new platform is the cloud or, just as likely, a hybrid cloud, the perceived risk, and the corresponding fear among managers, is that much greater. Already many managers, both IT and business, consider the cloud, any cloud but especially a public cloud, particularly risky. In their minds it represents a sea full of sharks just waiting to grab your valuable data along with anything else they can reach. And every news report of a data breach, especially a large one, only confirms their worst fears.

Before you get too nervous, IBM is jumping in with a ready-made, packaged set of services, IBM Cloud Paks, which will initially come in five flavors, starting with the IBM multicloud pak. As IBM explains it: IBM Cloud Paks are enterprise-ready, containerized software solutions that give clients an open, faster, and more secure way to move core business applications to any cloud. Each IBM Cloud Pak runs on Red Hat OpenShift on the IBM Cloud and Red Hat Enterprise Linux and includes containerized IBM middleware and common software services for development and management, all running on top of a common integration layer. This, IBM claims, can reduce development time by up to 84 percent and operational expenses by up to 75 percent. No details on how they calculate that, however.

And to some extent, these fears are justified, making the idea of Cloud Paks that much more appealing. If public clouds and hybrid clouds are not configured correctly, with security and safety foremost in mind, things can go wrong. Not always, of course, but the possibility always seems to be there.

Furthermore, making hybrid clouds particularly scary are all the new technologies they entail, things you never heard of just a few years ago. Suddenly, you are dealing with containers, Kubernetes, agile integration, DevOps, microservices, DevSecOps, multicloud and application-centric management, integrated agile management, site reliability engineering, and much more. How many tools and technologies do you want to acquire, learn, manage, maintain, and upgrade? And at what cost? And then revisit it all again as soon as the earth spins a few more times.

Simply from the deployment and management aspect, these new containerized, hybrid clouds can appear intimidating. To address that, each flavor of Cloud Pak presents the same common architecture. Even in these early stages, IBM already is seeing what amounts to a mix of everything: apps running on-premises and others in the cloud, many mixing multiple clouds. It quickly can amount to a vast landscape of micro-apps comprising the most hybrid of hybrid worlds.

With its Cloud Paks, IBM truly is trying to leverage its Red Hat acquisition. As IBM sees it: Red Hat incorporates openness, starting with enterprise Linux. Then throw in containers, Kubernetes, and OpenShift, which provides a solid platform for Kubernetes, and you now have a truly common and mostly open platform.

And IBM is not leaving out its Power and z platforms. It will work with any Power or Z system that incorporates a current version of Linux. That means any z14, z15, LinuxONE or Power System running Linux. 

IBM also is working with Red Hat to ensure good pricing for Cloud Paks that include various software. Customers get embedded licensing. You pay based on a specific capacity need and get the OpenShift licensing for the Cloud Pak along with support from both IBM and Red Hat.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at http://technologywriter.com/ 

z15 LinuxONE III for Hybrid Cloud

October 8, 2019

It didn't take long following the introduction of the z15 for a LinuxONE to arrive. Meet the LinuxONE III, a z15 machine dedicated to Linux. And it comes with the primary goodies that the z15 offers: automatic pervasive encryption of everything along with a closely related privacy capability, Data Privacy Passports.

3-frame LinuxONE III

Z-quality security, privacy, and availability, it turns out, have become central to the mission of the LinuxONE III. The reason is simple: cloud. According to IBM, only 20% of workloads have been moved to the cloud. Why? Companies need assurance that their data privacy and security will not be breached. To many IT pros and business executives, the cloud remains the wild, wild west, where bad guys roam looking to steal whatever they can.

IBM is touting the LinuxONE III, which is built on its newly introduced z15, for hybrid clouds. The company has been preaching the gospel of clouds and, particularly, hybrid clouds for several years, which was its primary reason for acquiring Red Hat. Red Hat Linux is built into the LinuxONE III, probably its first formal appearance since IBM closed its acquisition of Red Hat this summer.

With Red Hat and the z15, IBM is aiming to cash in on what it sees as a big opportunity in hybrid clouds. The cloud brings the promise of flexibility, agility, and openness, but the privacy and security concerns noted above have kept the other 80% of workloads off it. The LinuxONE III also promises cloud-native development.

Integrating the new IBM LinuxONE III as a key element in an organization's hybrid cloud strategy adds another level of security, stability, and availability to its cloud infrastructure. It gives the organization both agile deployment and unbeatable levels of uptime, reliability, and security. While the cloud already offers appealing flexibility and costs, those last three capabilities, uptime, reliability, and security, are not usually associated with cloud computing. By security, IBM means 100% of data encrypted automatically, from the moment it arrives or is created. And it remains encrypted for the rest of its life, at rest or in transit.

Are those capabilities important? You bet. A Harris study commissioned by IBM found that 64 percent of all consumers have opted not to work with a business out of concerns over whether that business could keep their data secure. However, that same study found 76 percent of respondents would be more willing to share personal information if there was a way to fully take back and retrieve that data at any time. Thus the importance of the z15’s pervasive encryption and the new data passports.

IBM has previously brought out its latest z running dedicated Linux. Initially it was a way to expand the z market through a reduced-cost z. DancingDinosaur doesn't know the cost of the LinuxONE III. In the past these machines have been discounted, but given the $34 billion IBM spent to acquire Red Hat, the new machines might not be such a bargain this time.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at http://technologywriter.com/ 

IBM Introduces 53 Qubit Quantum Machine

September 23, 2019

IBM made two major system announcements within just a couple of weeks: On Sept. 18 IBM announced a 53-qubit quantum machine. The week before, IBM introduced its latest mainframe, the z15. Already buzz is circulating of a z16 in two years, about a normal release cycle for the next generation of an IBM mainframe.

Quantum computer up close
IBM’s largest quantum machine at 53 qubits

Along with the 53 qubit machine IBM announced the opening of a Quantum Computation Center in New York state. The new center expands, according to IBM, its fleet of quantum computing systems for commercial and research activity that exist beyond the confines of experimental lab environments. IBM’s offerings run from 5 to 10 to 20 to, now, 53 qubits. These are actual quantum machines hosted by IBM in the cloud, not just simulations. 

The IBM Quantum Computation Center will support the growing needs of a community of over 150,000 registered users and nearly 80 commercial clients, academic institutions, and research laboratories to advance quantum computing and explore practical applications. To date, notes IBM, this global community of users has run more than 14 million experiments on IBM's quantum computers through the cloud since 2016 and published more than 200 scientific papers. To meet growing demand for access to real quantum hardware, ten quantum computing systems are now online through IBM's Quantum Computation Center. The fleet is composed of five 20-qubit systems, one 14-qubit system, and four 5-qubit systems. Five of the systems now have a quantum volume of 16, a measure of the power of a quantum computer used by IBM, demonstrating a new sustained performance milestone.
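
Running an experiment on one of these machines takes only a few lines of Qiskit. The sketch below, written against the Qiskit API of that era, builds a two-qubit Bell-state circuit and runs it on the bundled simulator; at the time, swapping in a backend from an IBMQ provider account sent the identical circuit to real hardware in IBM's cloud.

```python
from qiskit import QuantumCircuit, BasicAer, execute

# Two-qubit Bell state: the "hello world" of cloud quantum computing.
qc = QuantumCircuit(2, 2)
qc.h(0)        # put qubit 0 into superposition
qc.cx(0, 1)    # entangle qubit 1 with qubit 0
qc.measure([0, 1], [0, 1])

# Local simulator here; circa 2019 a real device was reached by using
#   IBMQ.load_account().get_backend("ibmq_...") in place of BasicAer.
backend = BasicAer.get_backend("qasm_simulator")
counts = execute(qc, backend, shots=1024).result().get_counts()
print(counts)  # expect roughly half "00" and half "11"
```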

IBM’s quantum systems are optimized for the reliability and reproducibility of programmable multi-qubit operations. Due to these factors, the systems enable state-of-the-art quantum computational research with 95 percent availability, according to the company.

Within one month, IBM’s commercially available quantum fleet will grow to 14 systems, including the new 53-qubit quantum computer, the single largest universal quantum system made available for external access in the industry to date. The new system offers a larger lattice and gives users the ability to run even more complex entanglement and connectivity experiments. Industry observers note that serious work requires a minimum of 200 qubits, probably just a couple more product intros away. 

Advances in quantum computing could open the door to future scientific discoveries such as new medicines and materials, vast improvements in the optimization of supply chains, and new ways for computers to model financial data to make better investments. Examples of IBM's work with clients and partners include:

  • J.P. Morgan Chase and IBM posted on arXiv Option Pricing using Quantum Computers, a methodology to price financial options and portfolios of such options on a gate-based quantum computer. This resulted in an algorithm that provides a quadratic speedup: where classical computers need millions of samples, this methodology requires only a few thousand samples to achieve the same result. It allows financial analysts to perform option pricing and risk analysis in near real time. The implementation is available as open source in Qiskit Finance. (A sketch of the classical baseline it displaces appears after this list.)
  • Mitsubishi Chemical, Keio University and IBM simulated the initial steps of the reaction mechanism between lithium and oxygen in lithium-air batteries. Also available on arXiv,  this represents a first step in modeling the entire lithium-oxygen reaction on a quantum computer. Better understanding of this interaction could lead to more efficient batteries for mobile devices or automotive vehicles.
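
To see why the quadratic speedup in the first bullet matters, consider the classical baseline it displaces. Below is a minimal classical Monte Carlo pricer for a European call, not the Qiskit Finance implementation; classically the estimate's error shrinks as 1/sqrt(M) with M samples, while quantum amplitude estimation, the method in the paper, shrinks it as roughly 1/M, which is why a few thousand quantum samples can stand in for millions of classical ones.

```python
import math, random

def mc_european_call(s0, strike, rate, vol, maturity, samples):
    # Classical Monte Carlo under Black-Scholes dynamics.
    # Standard error falls as 1/sqrt(samples); quantum amplitude
    # estimation achieves roughly 1/samples for the same task.
    payoffs = 0.0
    for _ in range(samples):
        z = random.gauss(0.0, 1.0)
        st = s0 * math.exp((rate - 0.5 * vol**2) * maturity
                           + vol * math.sqrt(maturity) * z)
        payoffs += max(st - strike, 0.0)
    return math.exp(-rate * maturity) * payoffs / samples

# A million classical samples for a tight estimate...
print(mc_european_call(100, 105, 0.02, 0.2, 1.0, 1_000_000))
```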

In the meantime IBM continues to simulate quantum algorithms on conventional supercomputers. According to one two-year-old report, at roughly 50 qubits, existing methods for calculating quantum amplitudes require either too much computation to be practical, or more memory than is available on any existing supercomputer, or both. You can bet that IBM or somebody else will push beyond 53 qubits pretty quickly. Google already claims a 72-qubit device, but it hasn't let outsiders run programs on it. IBM has been making quantum computers available via the cloud since 2016. Other companies putting quantum computers in the cloud include Rigetti Computing and Canada's D-Wave.
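
The memory wall that report describes is simple arithmetic: a full state vector for n qubits holds 2^n complex amplitudes. Assuming 16 bytes per amplitude (double-precision complex):

```python
# Memory needed to hold a full n-qubit state vector in simulation.
BYTES_PER_AMPLITUDE = 16  # complex128: two 8-byte floats

def statevector_bytes(n_qubits):
    return (2 ** n_qubits) * BYTES_PER_AMPLITUDE

for n in (30, 40, 50, 53):
    gib = statevector_bytes(n) / 2**30
    print(f"{n} qubits: {gib:,.0f} GiB")
# 30 qubits fits on a laptop (16 GiB); 53 qubits needs about
# 134 million GiB (128 PiB), far beyond any supercomputer's memory.
```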

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at http://technologywriter.com/ 

New z15 for Multicloud, Cloud-Native, and Instant Recovery

September 16, 2019

The z14 was introduced in the summer of 2017. So, it’s been a little over two years since IBM released a new top end mainframe, which seems about the right time for a new z. Unlike the z14, this machine is not focusing on the speeds and feeds, although they appear quite robust. With the z15, the focus is on multicloud environments.  No surprise there. IBM has been singing the gospel of multicloud and hybrid clouds for over a year.

z15–2-frames 19″ rack each

IBM does not describe the z15 in the usual terms for a top of the line mainframe, the ever-larger speeds and feeds. Rather, IBM describes it as its new enterprise platform delivering the ability to manage the privacy of customer data across hybrid multicloud environments. With z15, clients can manage who gets access to data via policy-based controls, with an industry-first capability to revoke access to data even across the hybrid cloud.

One of the capabilities IBM cites most often about the z15 isn’t even new: it’s pervasive encryption.  This was available on the z14. With pervasive encryption data is encrypted immediately without your having to do anything and it imposes no cost in terms of system overhead. Thank you embedded hardware. The data also is automatically protected both at rest and in flight. The savings in terms of staff time alone should be considerable.

Related to pervasive encryption is a new z15 capability: Data Privacy Passports. This allows you to control how data is stored and shared, with the ability to protect and provision data and revoke access to that data at any time, not only within the z15 environment but across an enterprise's hybrid multicloud environment. Similarly, the z15 can also encrypt data everywhere, across hybrid multicloud environments, to help enterprises secure their data wherever it travels and lands.
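
IBM hasn't published the internals of Data Privacy Passports, but the general pattern behind revoking access wherever the data went is envelope encryption with centrally held keys: data travels only as ciphertext, consumers fetch the key through a policy check, and revocation means the key service simply stops handing the key out. A minimal, hypothetical sketch using the Python cryptography package:

```python
from cryptography.fernet import Fernet

# Hypothetical central key service: data moves only as ciphertext,
# and access is whatever this service is still willing to unlock.
class KeyService:
    def __init__(self):
        self._keys = {}

    def protect(self, passport_id, plaintext):
        key = Fernet.generate_key()
        self._keys[passport_id] = key
        return Fernet(key).encrypt(plaintext)  # safe to copy anywhere

    def read(self, passport_id, ciphertext):
        key = self._keys.get(passport_id)
        if key is None:
            raise PermissionError("access revoked")
        return Fernet(key).decrypt(ciphertext)

    def revoke(self, passport_id):
        # Every copy of the ciphertext, on any cloud, goes dark at once.
        self._keys.pop(passport_id, None)

svc = KeyService()
blob = svc.protect("cust-42", b"account 12345, balance 9,876")
print(svc.read("cust-42", blob))  # policy check passes
svc.revoke("cust-42")
try:
    svc.read("cust-42", blob)
except PermissionError as e:
    print(e)  # revoked everywhere the ciphertext was copied
```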

Another new z15 capability is cloud-native development. This lets you modernize apps in place, build new cloud-native apps, and securely integrate your important workloads across clouds. It expedites what Z shops have long done: leverage existing Z software assets. Now you can more easily build, deploy, and manage next-gen apps and protect data through advanced security and pervasive encryption.

Yet another new capability is Instant Recovery. Here the goal is to limit the cost and impact of planned and unplanned downtime by letting you access full system capacity for a period of time to accelerate shutdown and restart of IBM Z services, providing a temporary capacity boost to rapidly recover from lost time.

The z15 also addresses what has become a recent customer imperative: concern about organizations mishandling private data. IBM commissioned a study by Harris, the polling organization, and released it this week to coincide with the z15 introduction. Harris found that 64 percent of all consumers have opted not to work with a business out of concerns over whether it could keep their data secure. However, that same study found 76 percent of respondents would be more willing to share personal information if there was a way to fully take back and retrieve that data at any time. With z15, pervasive encryption is designed to extend across the enterprise, enforcing data privacy by policy even when data leaves the platform. With this built-in capability, your clients, as IBM explains it, can offer new services and features that give customers stronger control over how their personal data is used.

Along with the new z15, this week IBM also introduced a new mainframe storage array, the DS8900F. In keeping with the hybrid multicloud theme, the new array is specifically designed for mission-critical, hybrid multicloud environments. This new array, according to IBM, promises comprehensive next-level cyber security, data availability, and system resiliency. Paired with the z15, the IBM DS8900F delivers more than 99.99999 percent (seven nines) uptime along with several disaster recovery options designed for near-zero recovery times to ensure protection of data.

Seven nines of availability surpasses any level of availability DancingDinosaur has previously written about, even with the z14. According to an uptime calculator (https://uptime.is/complex?sla=99.99999&dur=24&dur=24&dur=24&dur=24&dur=24&dur=24&dur=24), the maximum downtime comes to (reproduced in the quick calculation after the list):

  • Weekly: 0.1s
  • Monthly: 0.3s
  • Yearly: 3.2s
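
Those figures are easy to reproduce: at 99.99999 percent availability, the unavailable fraction of any period is one part in ten million (the calculator above rounds up slightly):

```python
# Maximum downtime implied by an availability percentage.
def downtime_seconds(availability_pct, period_seconds):
    return (1 - availability_pct / 100) * period_seconds

SEVEN_NINES = 99.99999
for label, secs in [("week", 7 * 24 * 3600),
                    ("month", 30 * 24 * 3600),
                    ("year", 365 * 24 * 3600)]:
    print(f"per {label}: {downtime_seconds(SEVEN_NINES, secs):.2f} s")
# per week: 0.06 s, per month: 0.26 s, per year: 3.15 s
```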

With this level of availability, you don't even have time to dash off to the restroom during an instance of downtime. With the new seven nines (99.99999) of availability, IBM Z clients, says the company, now have a new level of control over how and when they store their data to make the best economic and business sense, while always keeping it resilient and available.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at http://technologywriter.com/ 

TradeLens Gives IBM Blockchain Traction

August 27, 2019

In August IBM and Maersk, the global shipping company, announced the launch of TradeLens, a secure open network for the exchange of shipping information among the multiple parties involved in a shipment. TradeLens is expected to eliminate shipping's information blind spot by delivering end-to-end information about each shipment.

Since the announcement, TradeLens has attracted over 140 participants to the network. By the end of 2019 IBM expects the network to handle over 50% of global shipping volume. Competitors like Oracle, however, have recently jumped in with their own blockchain platforms as cloud services. Expect more to come.

TradeLens uses blockchain technology, specifically Hyperledger, to create an industry standard for the secure digitization and transmission of supply chain documents around the world. TradeLens will generate savings for participants and enhance global supply chain security.

TradeLens connects all participants using public-facing, documented apps that connect into the same API. Third parties can develop their own apps for the network. While blockchain/Hyperledger is the primary technology for IBM and Maersk, it is not the only one. TradeLens can accommodate legacy communication and data formats too.

Global trade is big, generating $16 trillion annually.  Just squeezing some efficiency out of what amounts to a cumbersome process bogged down with paper and handoffs should generate an attractive payback in terms of efficiency and speed. During the 12-month trial, Maersk and IBM worked with dozens of ecosystem partners to identify opportunities to prevent delays caused by documentation errors, information slowdowns, and other impediments.

One example demonstrated how TradeLens can reduce the transit time of a shipment of packaging materials to a production line in the United States by 40 percent, avoiding thousands of dollars in cost. Through better visibility and more efficient means of communicating, some supply chain participants estimate they could reduce the steps taken to answer such basic operation questions as “where is my container” from 10 steps and five people to, with TradeLens, one step and one person.
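
TradeLens documents its API for members; the endpoint and field names below are invented for illustration, but they show how answering that question collapses to a single authenticated request:

```python
import requests

# Hypothetical endpoint and fields; the real TradeLens API is
# documented for members, and its shape is not reproduced here.
API_BASE = "https://api.tradelens.example.com/v1"

def where_is_my_container(container_id, token):
    resp = requests.get(
        f"{API_BASE}/containers/{container_id}/events",
        headers={"Authorization": f"Bearer {token}"},
        params={"latest": "true"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()  # e.g. {"event": "vessel departure", "port": "SGSIN"}

# One step, one person, instead of ten steps and five people.
# Example (requires real credentials and endpoint):
# print(where_is_my_container("MSKU1234567", token="..."))
```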

Apps are the key to using TradeLens, and beyond them participants share data models, supply chain reference models, consignments, and hierarchical models across the network.

Looking ahead, IBM and Maersk envision a roadmap that brings new capabilities. These may include shipping instructions and bills of lading with specific language. As it turns out, you need an original bill of lading to take possession of shipped items. Solving that alone could eliminate a major bottleneck.

Currently, the platform handles 10 million events and more than 100,000 documents every week, and it is growing rapidly, according to IBM. The system now handles about 20% of trade volume; with more shippers joining the network, IBM expects to reach 65%. And it can grow more. TradeLens is not limited to ocean transport only; it could support other transport modes.

Blockchain is ideal for the z, with its security, scalability, and performance. There is no reason that only the biggest shippers should run TradeLens.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at technologywriter.com.

