Archive for the ‘Uncategorized’ Category

IBM Suggests Astounding Productivity with Cloud Pak for Automation

November 25, 2019

DancingDinosaur thought IBM would not introduce another Cloud Pak until after the holidays, but I was wrong. Last week IBM launched Cloud Pak for Security. According to IBM, it helps organizations uncover threats, make more informed risk-based decisions, and prioritize their teams’ time.

More specifically, it connects the organization’s existing data sources to generate deeper insights. In the process you can access IBM and third-party tools to search for threats across any cloud or on-premises location, and quickly orchestrate actions and responses to those threats while leaving your data where it is.

DancingDinosaur’s only disappointment in IBM’s new security Cloud Pak, as with other IBM Cloud Paks, is that it runs only on Linux. That means it doesn’t work with RACF, the legendary IBM access control tool for z/OS. IBM’s Cloud Paks reportedly run on z Systems, but only those running Linux. Not sure how IBM can finesse this particular issue.

Of the five original IBM Cloud Paks (application, data, integration, multicloud management, and automation) only one offers the kind of payback that will wow top c-level execs: automation. Find Cloud Pak for Automation here.

To date, IBM reports over 5,000 customers have used IBM Digital Business Automation to run their digital business. At the same time, IBM claims successful digitization has increased organizational scale and fueled growth of knowledge work.

McKinsey & Company notes that such workers spend up to 28 hours each week on low value work. IBM’s goal with digital business automation is to bring digital scale to knowledge work and free these workers to work on high value tasks.

Such tasks include collaborating and using creativity to come up with new ideas, meeting and building relationships with clients, or resolving issues and exceptions. The payoff from applying intelligent automation to the low value work that crowds out these tasks, says IBM, can be staggering.

“We can reclaim 120 billion hours a year spent by knowledge workers on low value work by using intelligent automation,” declares IBM. So what value can you reclaim over the course of a year for your operation with, say, 100 knowledge workers earning $22 per hour, or 1,000 workers earning $35 per hour? You can do the math.
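Or let Python do it. Here is a minimal sketch of that math; the 28 hours per week comes from the McKinsey figure cited above, the worker counts and wages are the hypothetical examples just mentioned, and the 48 working weeks and 50% automation fraction are illustrative assumptions of mine:

```python
# Rough annual value of low value hours reclaimed through automation.
HOURS_PER_WEEK = 28    # McKinsey: low value work per knowledge worker
WEEKS_PER_YEAR = 48    # assumed: ~4 weeks of vacation and holidays

def reclaimed_value(workers, hourly_wage, fraction_automated=0.5):
    """Annual wage value of low value hours removed by automation.

    fraction_automated hedges the claim: not all 28 hours disappear.
    """
    hours = workers * HOURS_PER_WEEK * WEEKS_PER_YEAR
    return hours * fraction_automated * hourly_wage

print(reclaimed_value(100, 22))    # 100 workers at $22/hr -> $1,478,400
print(reclaimed_value(1000, 35))   # 1,000 workers at $35/hr -> $23,520,000
```

Even at a conservative 50 percent automation rate, the hypothetical 100-worker shop reclaims well over a million dollars a year in wage value.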

As you would expect, automation is the critical component of this particular Cloud Pak. The main targets for enhancement or assistance among the rather broad category of knowledge workers are administrative/departmental work and expert work, which includes cross-enterprise work. IBM offers vendor management as one example.

The goal is to digitize core services by automating at scale and building low code/no code apps for your knowledge workers. Digital workers, as IBM refers to them, are key to this plan; the company wants to free them for higher value work. IBM’s example of such an expert worker is a loan officer.

Central to IBM’s Cloud Pak for Automation is what IBM calls its Intelligent Automation Platform. Some of this is here now, according to the company, with more coming in the future. Here now is the ability to create apps using low code tooling, reuse assets from business automation workflow, and create new UI assets.

Coming up in some unspecified timeframe is the ability to enable digital workers to automate job roles, to define and create content services that enable intelligent capture and extraction, and finally to envision and create decision services that offload and automate routine decisions.

Are your current and would-be knowledge workers ready to contribute or participate in this scheme? Maybe for some; it depends for others. To capture those billions of hours of increased productivity, however, they will have to step up to it. But you can be pretty sure IBM will do it for you if you ask.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at http://technologywriter.com/ 

 

IBM Cloud Pak Rollouts Continue

November 14, 2019

IBM Cloud Paks have emerged as a key strategy by the company to grow not just its cloud, but more importantly, its hybrid cloud business. For the past year or so, IBM shifted its emphasis from clouds to hybrid clouds. No doubt this is driven by its realization that its enterprise clients are adopting multiple clouds, necessitating the hybrid cloud.

The company is counting on success in hybrid clouds. For years IBM has scrambled to claw a place for itself among the top cloud players, but in all the time DancingDinosaur has tracked IBM’s cloud presence it has never risen higher than third. In 2019, the top cloud providers are AWS, Microsoft, Google, IBM, Oracle, and Alibaba, with IBM slipping to fourth in one analyst’s ranking.

Hybrid clouds, over time, can change the dynamics of the market. They have not, however, changed things much, according to a ranking from Datamation. “There are too many variables to strictly rank hybrid cloud providers,” notes Datamation. That said, Datamation still ranked them, starting with Amazon Web Services (AWS), which remains the unquestioned leader of the business with twice the market share of its next leading competitor, Microsoft Azure, followed by IBM. IBM is counting on its Red Hat acquisition, which includes OpenShift along with Enterprise Linux, to alter its market standing.

The hybrid cloud segment certainly encompasses a wider range of customer needs, so there are ways IBM can work Red Hat to gain advantages in pricing and packaging, which it has already signaled it can and will do, starting with OpenShift. DancingDinosaur doubts IBM will overtake AWS outright, but as noted above, hybrid clouds are a different beast. So don’t rule out IBM in the hybrid cloud market.

Another thing that may give IBM an edge in hybrid clouds among its enterprise customers is its Cloud Paks. As IBM describes them, Cloud Paks are enterprise-ready, containerized software that gives organizations an open, faster, and more secure way to move core business applications to any cloud. Each IBM Cloud Pak runs on Red Hat OpenShift, IBM Cloud, and Red Hat Enterprise Linux.

Each pak includes containerized IBM middleware and common software services for development and management. Also included is a common integration layer designed to reduce development time by up to 84 percent and operational expenses by up to 75 percent, according to IBM.

Cloud Paks, IBM continues, enable you to easily deploy modern enterprise software on-premises, in the cloud, or with pre-integrated systems, and to quickly bring workloads to production by seamlessly leveraging Kubernetes as the container management framework, supporting production-level qualities of service and end-to-end lifecycle management. This gives organizations an open, faster, more secure way to move core business applications to any cloud.

When IBM introduced Cloud Paks a few weeks ago, it planned a suite of five:

  • Application
  • Data
  • Integration
  • Automation
  • Multicloud management

Don’t be surprised as hybrid cloud usage evolves if even more Cloud Paks eventually appear. It becomes an opportunity for IBM to bundle together more of its existing tools and products and send them to the cloud too.


Has Google Achieved Quantum Supremacy?

October 31, 2019

Google said they did last week. “If true, it is big news,” writes Chelsea Whyte in New Scientist. Quantum computers have the potential to change the way organizations design new materials, work out logistics, build artificial intelligence, and break encryption. That is why firms like Google, Intel and IBM – along with plenty of start-ups – have been racing to reach this crucial milestone, Whyte continued.

Or maybe Google hasn’t. It claims its 53-qubit computer performed, in 200 seconds, an arcane task that would take 10,000 years on Summit, the supercomputer IBM built for the Department of Energy that is currently the world’s fastest.


As Adrian Cho reports in Science Magazine: on 21 October, IBM announced that, by tweaking the way Summit approaches the task, it can do it far faster: in 2.5 days. Therefore, IBM insists, the threshold for quantum supremacy, doing something a classical computer cannot, has still not been met.

Somehow, when the industry reaches the point of quantum supremacy, DancingDinosaur suspects, it won’t be with a 53-qubit device. That’s easily within range of the biggest quantum computers already available. The problem Google solved involved random numbers; specifically, it tackled a random sampling problem, that is, checking that a set of numbers has a truly random distribution. Reportedly, this is very difficult for a traditional computer when there are a lot of numbers involved. Maybe I’m still a Z big iron bigot, but breaking this threshold should take hundreds if not thousands of qubits, according to published pieces.

But this raises an interesting point. If IBM can tweak its best supercomputer to reduce a process that would take years to 2.5 days, why didn’t they do it earlier? And how many other such inefficiencies are lying around that they could streamline right now?

The race for quantum supremacy may not equate with effective quantum computing or even competitively priced systems. Except for a couple of newcomers, nobody is talking about price. IBM offered some free time on its smallest qubit machines in the cloud for those who joined its quantum program, but for how long? With Google, Intel, HP, and IBM, along with a handful of newcomers like D-Wave and Rigetti and any startups that pop up, the race to quantum supremacy surely will be competitive, but who knows what the cost will be.

DancingDinosaur’s guess is that, given the players currently involved, the quantum computing market will look a lot like today’s enterprise computing market minus the startups. As for pricing, beyond IBM’s promotional offer of free time on one of its cloud-based quantum machines, these systems won’t be cheap when they finally become available.

Just the cooling required to keep a quantum machine stable will be costly. IBM is taking the right approach for now: put some machines in the cloud where it can provide and manage all the infrastructure required to support them. Don’t expect to ever buy one at Best Buy and plug it into your data center.

And then there is the talent problem.  Skilled quantum programmers and technicians probably don’t post their availability on Indeed. You’ll need a few PhDs in mathematics or physics, at the least, to get you started. Suggest you get in line at MIT or Stanford.


IBM Introduces Cloud Paks

October 23, 2019

IBM is counting on success in the cloud, especially hybrid clouds, but only about 20 percent of enterprise workloads have moved to the cloud yet. The problem is an old familiar one: moving applications, especially vital enterprise applications, to a new platform, any platform, is difficult and, as a result, risky.

IBM Multicloud Manager Dashboard

When the new platform is the cloud or, just as likely, a hybrid cloud, the level of risk and corresponding fear among managers is that much greater. Already many managers, both IT and business, consider the cloud, any cloud but especially a public cloud, particularly risky. In their minds it represents a sea full of sharks just waiting to grab your valuable data along with anything else they can reach. And every news report of a data breach, especially a large one, only confirms their worst fears.

Before you get too nervous, IBM is jumping in with a ready-made, packaged set of services, IBM Cloud Paks, which will initially come in five flavors, starting with the IBM multicloud pak. As IBM explains it: IBM Cloud Paks are enterprise-ready, containerized software solutions that give clients an open, faster, and more secure way to move core business applications to any cloud. Each IBM Cloud Pak runs on Red Hat OpenShift on the IBM Cloud and Red Hat Enterprise Linux and includes containerized IBM middleware and common software services for development and management, all running on top of a common integration layer. IBM claims this can reduce development time by up to 84 percent and operational expenses by up to 75 percent. No details on how it calculates that, however.

And to some extent, these fears are justified, making the idea of cloud paks that much more appealing. If public clouds and hybrid clouds are not configured correctly with security and safety foremost in mind things can go wrong. Not always, of course, but the possibility seems always there.

Furthermore, making hybrid clouds particularly scary are all the new technologies they entail, things you never heard of just a few years ago. Suddenly, you are dealing with containers, Kubernetes, agile integration, DevOps, microservices, DevSecOps, multicloud and application-centric management, integrated agile management, site reliability engineering, and much more. How many tools and technologies do you want to acquire, learn, manage, maintain, and upgrade? And at what cost? And then revisit it all again as soon as the earth spins a few more times.

Simply from the deployment and management aspect, these new containerized, hybrid clouds can appear intimidating. To address that, each flavor of cloud pak presents the same common architecture. Now in the early stages, IBM already is seeing what amounts to a mix of everything: apps running on-premises and others in the cloud, many mixing multiple clouds. It quickly can amount to a vast landscape of micro-apps comprising the most hybrid of hybrid worlds.

With its Cloud Paks IBM truly is trying to leverage its Red Hat acquisition. As IBM sees it: Red Hat incorporates openness, starting with enterprise Linux. Then throw in containers, Kubernetes, and OpenShift, which provides a solid platform for Kubernetes, and you now have a truly common and mostly open platform.

And IBM is not leaving out its Power and z platforms. It will work with any Power or Z system that incorporates a current version of Linux. That means any z14, z15, LinuxONE or Power System running Linux. 

IBM also is working with Red Hat to ensure good pricing for Cloud Paks that include various software. Customers get embedded licensing: you pay based on a specific capacity need and get the OpenShift licensing for the Cloud Pak plus support from IBM and Red Hat.


October 8, 2019

z15 LinuxONE III for Hybrid Cloud

 

It didn’t take long following the introduction of the z15 for a LinuxONE to arrive. Meet the LinuxONE III, a z15 machine with dedicated built-in Linux. And it comes with the primary goodies that the z15 offers: automatic pervasive encryption of everything along with a closely related privacy capability, Data Privacy Passport.

3-frame LinuxONE III

Z-quality security, privacy, and availability, it turns out, have become central to the mission of the LinuxONE III. The reason is simple: cloud. According to IBM, only 20% of workloads have been moved to the cloud. Why? Companies need assurance that their data privacy and security will not be breached. To many IT pros and business executives, the cloud remains the wild, wild west where bad guys roam looking to steal whatever they can.

IBM is touting the LinuxONE III, which is built on its newly introduced z15, for hybrid clouds. The company has been preaching the gospel of clouds and, particularly, hybrid clouds for several years, which was its primary reason for acquiring Red Hat. Red Hat Linux is built into the LinuxONE III, probably its first formal appearance since IBM closed its acquisition of Red Hat this spring. 

With Red Hat and the z15, IBM is aiming to cash in on what it sees as a big opportunity in hybrid clouds. While the cloud brings the promise of flexibility, agility, and openness, those same privacy and security worries have kept workloads out of it. LinuxONE III also promises cloud native development.

Integrating the new IBM LinuxONE III as a key element of an organization’s hybrid cloud strategy adds another level of security, stability, and availability to its cloud infrastructure. It gives the organization both agile deployment and unbeatable levels of uptime, reliability, and security. While the cloud already offers appealing flexibility and costs, those last three capabilities, uptime, reliability, and security, are not usually associated with cloud computing. By security, IBM means 100% automatic data encryption from the moment the data arrives or is created. And it remains encrypted for the rest of its life, at rest or in transit.

Are those capabilities important? You bet. A Harris study commissioned by IBM found that 64 percent of all consumers have opted not to work with a business out of concerns over whether that business could keep their data secure. However, that same study found 76 percent of respondents would be more willing to share personal information if there was a way to fully take back and retrieve that data at any time. Thus the importance of the z15’s pervasive encryption and the new data passports.

IBM has previously brought out its latest z running dedicated Linux. Initially it was a way to expand the z market through a reduced cost z. DancingDinosaur doesn’t know the cost of the LinuxONE III. In the past these machines have been discounted, but given the $34 billion IBM spent to acquire Red Hat, the new machines might not be such a bargain this time.


IBM Introduces 53 Qubit Quantum Machine

September 23, 2019

IBM made two major system announcements within just a couple of weeks: on Sept. 18 IBM announced a 53 qubit quantum machine. The week before, IBM introduced its latest mainframe, the z15. Already buzz is circulating of a z16 in two years, about a normal release cycle for the next generation of an IBM mainframe.

Quantum computer up close
IBM’s largest quantum machine at 53 qubits

Along with the 53 qubit machine IBM announced the opening of a Quantum Computation Center in New York state. The new center expands, according to IBM, its fleet of quantum computing systems for commercial and research activity that exist beyond the confines of experimental lab environments. IBM’s offerings run from 5 to 10 to 20 to, now, 53 qubits. These are actual quantum machines hosted by IBM in the cloud, not just simulations. 

The IBM Quantum Computation Center will support the growing needs of a community of over 150,000 registered users and nearly 80 commercial clients, academic institutions, and research laboratories to advance quantum computing and explore practical applications. To date, notes IBM, this global community of users has run more than 14 million experiments on IBM’s quantum computers through the cloud since 2016 and published more than 200 scientific papers. To meet growing demand for access to real quantum hardware, ten quantum computing systems are now online through IBM’s Quantum Computation Center. The fleet is composed of five 20-qubit systems, one 14-qubit system, and four 5-qubit systems. Five of the systems now have a quantum volume of 16, a measure IBM uses of the power of a quantum computer, demonstrating a new sustained performance milestone.
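Quantum volume, as IBM defines it, works out to 2^n, where n is the width (and equal depth) of the largest “square” random model circuit the machine can run successfully. A minimal sketch of that bookkeeping, not of the underlying heavy-output benchmark itself:

```python
def quantum_volume(n: int) -> int:
    """QV = 2**n, where n is the largest width/depth at which the
    machine passes IBM's heavy-output model-circuit test."""
    return 2 ** n

# A quantum volume of 16 means passing at width/depth 4, regardless of
# how many physical qubits the machine has.
print(quantum_volume(4))  # -> 16
```

That is why a 20-qubit system can report a quantum volume of only 16: gate errors, not qubit count, cap the largest square circuit it can run reliably.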

IBM’s quantum systems are optimized for the reliability and reproducibility of programmable multi-qubit operations. Due to these factors, the systems enable state-of-the-art quantum computational research with 95 percent availability, according to the company.

Within one month, IBM’s commercially available quantum fleet will grow to 14 systems, including the new 53-qubit quantum computer, the single largest universal quantum system made available for external access in the industry to date. The new system offers a larger lattice and gives users the ability to run even more complex entanglement and connectivity experiments. Industry observers note that serious work requires a minimum of 200 qubits, probably just a couple more product intros away. 

Advances in quantum computing could open the door to future scientific discoveries such as new medicines and materials, vast improvements in the optimization of supply chains, and new ways for computers to model financial data to make better investments. Examples of IBM’s work with clients and partners include:

  • J.P. Morgan Chase and IBM posted on arXiv Option Pricing using Quantum Computers, a methodology to price financial options and portfolios of such options on a gate-based quantum computer. This resulted in an algorithm that provides a quadratic speedup, i.e., where classical computers need millions of samples, this methodology requires only a few thousand samples to achieve the same result. It allows financial analysts to perform option pricing and risk analysis in near real time. The implementation is available as open source in Qiskit Finance.
  • Mitsubishi Chemical, Keio University and IBM simulated the initial steps of the reaction mechanism between lithium and oxygen in lithium-air batteries. Also available on arXiv,  this represents a first step in modeling the entire lithium-oxygen reaction on a quantum computer. Better understanding of this interaction could lead to more efficient batteries for mobile devices or automotive vehicles.
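The quadratic speedup claimed in the option pricing work comes down to sample counts: classical Monte Carlo error shrinks like 1/√N, while quantum amplitude estimation error shrinks like 1/N, so hitting a target error ε takes roughly 1/ε² classical samples but only roughly 1/ε quantum samples. A rough sketch, with constants and hardware overheads deliberately omitted:

```python
# Samples needed to reach a target pricing error epsilon.
# Monte Carlo:           error ~ 1/sqrt(N)  =>  N ~ 1/epsilon**2
# Amplitude estimation:  error ~ 1/N        =>  N ~ 1/epsilon
def classical_samples(epsilon: float) -> int:
    return round(1 / epsilon ** 2)

def quantum_samples(epsilon: float) -> int:
    return round(1 / epsilon)

eps = 1e-3  # price to roughly three decimal places
print(classical_samples(eps))  # -> 1000000 (millions of samples)
print(quantum_samples(eps))    # -> 1000 (a few thousand)
```

The ratio matches the paper’s claim: millions of classical samples versus a few thousand quantum ones for the same accuracy.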

In the meantime IBM continues to simulate quantum algorithms on conventional supercomputers. According to one two-year-old report: at roughly 50 qubits, existing methods for calculating quantum amplitudes require either too much computation to be practical, or more memory than is available on any existing supercomputer, or both. You can bet that IBM or somebody else will push beyond 53 qubits pretty quickly. Google already claims a 72-qubit device, but it hasn’t let outsiders run programs on it. IBM has been making quantum available via the cloud since 2016. Other companies putting quantum computers in the cloud include Rigetti Computing and Canada’s D-Wave.
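The memory wall that report describes is easy to quantify: simulating n qubits with a full state vector means storing 2^n complex amplitudes, roughly 16 bytes each, so requirements double with every added qubit. A back-of-envelope sketch, ignoring cleverer partitioned or tensor-network simulation methods:

```python
def statevector_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    # 2**n complex amplitudes, 16 bytes each for double-precision complex
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (30, 50, 53):
    gib = statevector_bytes(n) // 2 ** 30
    print(f"{n} qubits: {gib:,} GiB")

# 30 qubits fits on a laptop (16 GiB); 53 qubits needs 2**57 bytes,
# about 128 PiB, beyond the memory of any existing supercomputer.
```

Each added qubit doubles the requirement, which is why full simulation stalls right around the 50-qubit mark the report cites.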


New z15 for Multicloud, Cloud-Native, Instant Recovery

September 16, 2019

The z14 was introduced in the summer of 2017. So, it’s been a little over two years since IBM released a new top end mainframe, which seems about the right time for a new z. Unlike the z14, this machine is not focusing on the speeds and feeds, although they appear quite robust. With the z15, the focus is on multicloud environments.  No surprise there. IBM has been singing the gospel of multicloud and hybrid clouds for over a year.

z15: 2 frames, each a 19″ rack

IBM does not describe the z15 in the usual terms for a top of the line mainframe, the ever-larger speeds and feeds. Rather, IBM describes it as its new enterprise platform delivering the ability to manage the privacy of customer data across hybrid multicloud environments. With z15, clients can manage who gets access to data via policy-based controls, with an industry-first capability to revoke access to data even across the hybrid cloud.

One of the capabilities IBM cites most often about the z15 isn’t even new: pervasive encryption, which was available on the z14. With pervasive encryption, data is encrypted immediately without your having to do anything, and it imposes no cost in terms of system overhead. Thank you, embedded hardware. The data also is automatically protected both at rest and in flight. The savings in terms of staff time alone should be considerable.

Related to pervasive encryption is a new z15 capability: Data Privacy Passport. This allows you to gain control over how data is stored and shared, enabling the ability to protect and provision data and revoke access to that data at any time, not only within the z15 environment but across an enterprise’s hybrid multicloud environment. Similarly, the z15 can also encrypt data everywhere – across hybrid multicloud environments – to help enterprises secure their data wherever it travels and lands.

Another new z15 capability is cloud-native development. This facilitates modernizing apps in place by building new cloud-native apps and securely integrating your important workloads across clouds. This just expedites what Z shops have long done: leverage existing Z software assets. Now you can more easily build, deploy, and manage next-gen apps and protect data through advanced security and pervasive encryption.

Yet another new capability is Instant Recovery. Here the goal is to limit the cost and impact of planned and unplanned downtime by letting you access full system capacity for a period of time to accelerate the shutdown and restart of IBM Z services, providing a temporary capacity boost to rapidly recover lost time.


The new z15 also addresses the rising importance of data privacy to clients. IBM commissioned a study by Harris, the polling organization, and released it this week to coincide with the z15 introduction. The Harris study found that 64 percent of all consumers have opted not to work with a business out of concerns over whether it could keep their data secure. However, the same study found 76 percent of respondents would be more willing to share personal information if there were a way to fully take back and retrieve that data at any time. With z15, pervasive encryption is designed to extend across the enterprise, enforcing data privacy by policy even when data leaves the platform. With this you can offer new services and features that give clients stronger control over how their personal data is used.

Along with the new z15, this week IBM also introduced a new mainframe storage array, the DS8900F. In keeping with the hybrid, multicloud theme, the new array is specifically designed for mission critical, hybrid, multicloud environments. This new array, according to IBM, promises comprehensive next-level cyber security, data availability, and system resiliency. In addition to the z15, the IBM DS8900F delivers more than 99.99999 percent (seven nines) uptime along with several disaster recovery options designed for near-zero recovery times to ensure protection of data.

Seven nines of availability surpasses any level of availability DancingDinosaur has previously written about, even with the z14. According to an online availability calculator (https://uptime.is/complex?sla=99.99999&dur=24&dur=24&dur=24&dur=24&dur=24&dur=24&dur=24), the allowed downtime comes to:

  • Weekly: 0.1s
  • Monthly: 0.3s
  • Yearly: 3.2s
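Those numbers follow directly from the availability percentage; here is a quick check in Python (using a 365.25-day year, so rounding differs slightly from the calculator’s weekly figure):

```python
def downtime_seconds(availability_pct: float, period_seconds: float) -> float:
    """Allowed downtime in a period at a given availability percentage."""
    return (1 - availability_pct / 100) * period_seconds

WEEK = 7 * 24 * 3600
YEAR = 365.25 * 24 * 3600
MONTH = YEAR / 12

for name, period in (("weekly", WEEK), ("monthly", MONTH), ("yearly", YEAR)):
    print(f"{name}: {downtime_seconds(99.99999, period):.2f}s")
# yearly comes to about 3.16 seconds of downtime
```

At seven nines, one missing nine makes a tenfold difference: drop to five nines (99.999 percent) and the yearly figure jumps from about 3 seconds to over 5 minutes.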

With this level of availability, you don’t even have time to dash off to the restroom during an instance of downtime. With the new seven nines (99.99999 percent) of availability, IBM Z clients, says the company, now have a new level of control over how and when they store their data to make the best economic and business sense, while always keeping it resilient and available.


TradeLens Gives IBM Blockchain Traction

August 27, 2019

In August IBM and Maersk, the global shipping company, announced the launch of TradeLens, a secure open network for the exchange of shipping information among the multiple parties involved in a shipment. TradeLens is expected to eliminate shipping’s information blind spot by delivering end-to-end information about each shipment.

Since the announcement, TradeLens has attracted over 140 participants to the network. By the end of 2019 IBM expects the network to handle over 50% of global shipping volume. Competitors like Oracle, however, have recently jumped in with their own blockchain platforms as cloud services. Expect more to come.

TradeLens uses blockchain technology, specifically HyperLedger, to create an industry standard for the secure digitization and transmission of supply chain documents around the world. TradeLens will generate savings for participants and enhance global supply chain security.

TradeLens connects all participants using public facing, documented apps that connect into the same API. Third parties can develop their own apps for the network. While blockchain/Hyperledger is the primary technology for IBM and Maersk, it is not the only one. TradeLens can accommodate legacy communication and data formats too.

Global trade is big, generating $16 trillion annually.  Just squeezing some efficiency out of what amounts to a cumbersome process bogged down with paper and handoffs should generate an attractive payback in terms of efficiency and speed. During the 12-month trial, Maersk and IBM worked with dozens of ecosystem partners to identify opportunities to prevent delays caused by documentation errors, information slowdowns, and other impediments.

One example demonstrated how TradeLens can reduce the transit time of a shipment of packaging materials to a production line in the United States by 40 percent, avoiding thousands of dollars in cost. Through better visibility and more efficient means of communicating, some supply chain participants estimate they could reduce the steps taken to answer such basic operation questions as “where is my container” from 10 steps and five people to, with TradeLens, one step and one person.

Apps are the key to using TradeLens, and beyond the blockchain ledger underneath the network, data models, supply chain reference models, consignments, and hierarchical models all get shared.


Looking ahead, IBM and Maersk envision a roadmap of new capabilities, such as shipping instructions and bills of lading with specific language. As it turns out, you need an original bill of lading to take possession of shipped items; solving that alone could eliminate a major bottleneck.

Currently, the platform handles 10 million events and more than 100,000 documents every week, and it is growing rapidly, according to IBM. The system now handles about 20% of trade volume; with more shippers joining the network, IBM expects to reach 65%, and it can grow from there. Nor is TradeLens limited to ocean transport; it could support other transport modes as well.

Blockchain is ideal for the z, with its security, scalability, and performance. There is no reason that only the biggest shippers should run TradeLens.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at technologywriter.com.

IBM Advances Commercial Quantum Computing

August 7, 2019

The reason IBM and others are so eager for quantum computing is simple: money. Financial markets move massive transaction volumes, nearly $70 trillion last year according to the World Bank, and quantum analytics promises to process the associated calculations faster and more accurately.

“These are enormous amounts of money,” says mathematician Cornelis Oosterlee of Centrum Wiskunde & Informatica, a national research institute in the Netherlands, in a piece in Wired magazine. “Some single trades involve numbers that are scary to imagine,” part of a company’s pension fund, say, or a university endowment, he continues.

Of course, this isn’t exactly new. Large organizations with access to huge amounts of resources devote inordinate quantities of those resources to predicting how much their assets will be worth in the future. If they could do this modeling faster, more accurately, or more efficiently, even just shaving off a few seconds here and there, the savings would add up quickly.

Today these calculations are expensive to run, requiring either an in-house supercomputer or two or a big chunk of cloud computing processors and time. But if or when quantum computing delivers on even some of its theoretical promise to run these analyses faster, more accurately, more efficiently, and cheaper, that’s something IBM could build into the next generation of systems.

And it is not just IBM. From Google on down to startups, developers are working on machines that could one day beat conventional computers at various tasks, such as classifying data through machine learning or inventing new drugs—and running complex financial calculations. In a step toward delivering on that promise, researchers affiliated with IBM and J.P. Morgan recently figured out how to run a simplified risk calculation on an actual quantum computer.

Using one of IBM’s machines, located in Yorktown Heights, New York, the researchers demonstrated they could simulate the future value of a financial product called an option. Currently, many banks use what’s called the Monte Carlo method to simulate prices of all sorts of financial instruments. In essence, the Monte Carlo method models the future as a series of forks in the road. A company might go under; it might not. President Trump might start a trade war; he might not. Analysts estimate the likelihood of such scenarios, then generate millions of alternate futures at random. To predict the value of a financial asset, they produce a weighted average of these millions of possible outcomes.
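Since the story leans on the Monte Carlo method, a minimal classical version helps ground it. The sketch below, with arbitrary illustrative parameter values, prices a European call option by simulating many random terminal stock prices under a standard geometric Brownian motion model and discounting the average payoff:

```python
import numpy as np

def monte_carlo_call_price(s0, strike, rate, sigma, maturity,
                           n_paths=200_000, seed=42):
    """Estimate a European call option's value by averaging simulated payoffs."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n_paths)
    # Each random draw is one "alternate future": a possible terminal stock
    # price under geometric Brownian motion
    st = s0 * np.exp((rate - 0.5 * sigma**2) * maturity
                     + sigma * np.sqrt(maturity) * z)
    # The call pays the amount by which the stock ends above the strike
    payoffs = np.maximum(st - strike, 0.0)
    # Discount the average payoff back to today's dollars
    return np.exp(-rate * maturity) * payoffs.mean()

# Stock at 100, strike 105, 5% rate, 20% volatility, one year out
price = monte_carlo_call_price(100, 105, 0.05, 0.2, 1.0)
```

With these inputs the estimate lands near the Black-Scholes value of roughly 8; the point of the quantum approach is that amplitude estimation promises the same answer with quadratically fewer samples.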

Quantum computers are particularly well suited to this sort of probabilistic calculation, says Stefan Woerner, who led the IBM team. Classical (or conventional) computers—the kind most of us use—are designed to manipulate bits. Bits are binary, having a value of either 0 or 1. Quantum computers, on the other hand, manipulate qubits, which represent an in-between state. A qubit is like a coin flipping in the air—neither heads nor tails, neither 0 nor 1 but some probability of being one or the other. And because a qubit has unpredictability built in, it promises to be a natural tool for simulating uncertain outcomes.
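The coin-flip analogy can be made concrete in a few lines of NumPy. This is a toy classical simulation, not real quantum hardware: a single-qubit state is a pair of amplitudes, and the probability of measuring 0 or 1 is the squared amplitude.

```python
import numpy as np

# State |psi> = cos(theta/2)|0> + sin(theta/2)|1>.
# With theta = pi/3, the qubit is "mostly 0": P(0) = 0.75, P(1) = 0.25.
theta = np.pi / 3
amplitudes = np.array([np.cos(theta / 2), np.sin(theta / 2)])

# Measurement probabilities are squared amplitudes (real here, no conjugate)
probabilities = amplitudes ** 2

# Simulate 10,000 measurements of identically prepared qubits
rng = np.random.default_rng(0)
measurements = rng.choice([0, 1], size=10_000, p=probabilities)
fraction_ones = measurements.mean()  # should hover near 0.25
```

Each measurement collapses the coin in mid-air to heads or tails; only the statistics over many runs reveal the underlying probabilities, which is exactly why the hardware suits Monte Carlo-style problems.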

Woerner and his colleagues ran their Monte Carlo calculations using three of the 20 qubits available on their quantum machine. The experiment was too simplistic to be useful commercially, but it’s a promising proof of concept; once bigger and smoother-running quantum computers are available, the researchers hope to execute the algorithm faster than conventional machines.

But this theoretical advantage is just that: theoretical. Existing machines remain too error-ridden to compute consistently. In addition, financial institutions already have ample computing power available, onsite or in the cloud, and they will have even more as graphics processing units (GPUs), which execute many calculations in parallel, come online. A quantum computer might well be faster than an individual chip, but it’s unclear whether it could beat a fleet of high performance GPUs in a supercomputer.

Still, it’s noteworthy that the IBM team was able to implement the algorithm on actual hardware, says mathematician Ashley Montanaro of the University of Bristol in the UK, who was not involved with the work. Academics first developed the mathematical proofs behind this quantum computing algorithm in 2000, but it remained a theoretical exercise for years. Woerner’s group took a 19-year-old recipe and figured out how to run it on actual quantum hardware.

Now they’re looking to improve their algorithm by using more qubits. The most powerful quantum computers today have fewer than 200 qubits; practitioners suggest it may take thousands to consistently beat conventional methods.

But demonstrations like Woerner’s, even with their limited scope, are useful in that they apply quantum computers to problems organizations actually want to solve. And that is what it will take if IBM expects to build quantum computing into a viable commercial business.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at technologywriter.com. 

IBM teams with Cloudera and Hortonworks 

July 11, 2019

DancingDinosaur has a friend on the West coast who finally left IBM after years of complaining, swearing never to return, and has been happily working at Cloudera ever since. IBM and Cloudera this week announced a strategic partnership to develop joint go-to-market programs designed to bring advanced data and AI solutions to more organizations across the expansive Apache Hadoop ecosystem.

[Graphic: deploy a single solution for big data analytics]

The agreement builds on the long-standing relationship between IBM and Hortonworks, which merged with Cloudera this past January, and extends their integrated data science and data management solutions to include the Cloudera platform. “This should stop the big-data-is-dead thinking that has been cropping up,” says my West coast friend, putting his best positive spin on the situation.

Unfortunately, my West coast buddy may be back at IBM sooner than he thinks. Having just finalized its $34 billion Red Hat acquisition yesterday, IBM could buy the combined Cloudera/Hortonworks for comparatively little additional money and own the whole thing as a solid block of big data and cloud capabilities.

As IBM sees it, the companies have partnered to offer an industry-leading, enterprise-grade Hadoop distribution plus an ecosystem of integrated products and services – all designed to help organizations achieve faster analytic results at scale. As a part of this partnership, IBM promises to:

  • Resell and support Cloudera products
  • Sell and support Hortonworks products under a multi-year contract
  • Provide migration assistance to future unified Cloudera/Hortonworks products
  • Deliver the benefits of the combined IBM and Cloudera collaboration and investment in the open source community, along with a commitment to better support analytics initiatives from the edge to AI

IBM also will resell the Cloudera Enterprise Data Hub, Cloudera DataFlow, and Cloudera Data Science Workbench. In response, Cloudera will begin to resell IBM’s Watson Studio and BigSQL.

“By teaming more strategically with IBM we can accelerate data-driven decision making for our joint enterprise customers who want a hybrid and multi-cloud data management solution with common security and governance,” said Scott Andress, Cloudera’s Vice President of Global Channels and Alliances in the announcement. 

Cloudera enables organizations to transform complex data into clear and actionable insights. It delivers an enterprise data cloud for any data, anywhere, from the edge to AI. One obvious question: how long until IBM wants to include Cloudera as part of its own hybrid cloud? 

But IBM isn’t stopping here. It also just announced new storage solutions across AI and big data, modern data protection, hybrid multicloud, and more. These innovations will allow organizations to leverage more heterogeneous data sources and data types for deeper insights from AI and analytics, expand their ability to consolidate rapidly expanding data on IBM’s object storage, and extend modern data protection to support more workloads in hybrid cloud environments.

The key is IBM Spectrum Discover, metadata management software that provides data insight for petabyte-scale unstructured storage. The software connects to IBM Cloud Object Storage and IBM Spectrum Scale, enabling it to rapidly ingest, consolidate, and index metadata for billions of files and objects. It provides a rich metadata layer that enables storage administrators, data stewards, and data scientists to efficiently manage, classify, and gain insights from massive amounts of unstructured data. Combining that with Cloudera and Hortonworks on IBM’s hybrid cloud should give you a powerful data analytics solution.
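To make the metadata-layer idea concrete, here is a minimal sketch in plain Python, assuming nothing about Spectrum Discover’s actual API: it walks a directory tree, records a few metadata fields per file, and supports the kind of simple query a data steward might run against the index.

```python
import os
import time

def index_metadata(root):
    """Walk a directory tree and collect a simple metadata record per file."""
    index = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                st = os.stat(path)
            except OSError:
                continue  # skip files that vanish or are unreadable mid-walk
            index.append({
                "path": path,
                "size_bytes": st.st_size,
                "modified": time.ctime(st.st_mtime),
                "extension": os.path.splitext(name)[1].lower(),
            })
    return index

def files_by_extension(index, ext):
    """Query the metadata layer, e.g. find every .csv file for classification."""
    return [record for record in index if record["extension"] == ext]
```

The real product does this at petabyte scale across object stores, but the principle is the same: queries run against the lightweight metadata index, not against the data itself.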

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at technologywriter.com. 

 

