Posts Tagged ‘blockchain’

IBM Spotlights Blockchain and Hyperledger Fabric at IBM InterConnect

March 23, 2017

IBM announced earlier this week Hyperledger Fabric v1.0 beta, with security for regulated industries, governance tools, and the potential for more than 1,000 transactions per second. This represents the first enterprise-ready blockchain service based on the Linux Foundation’s open source Hyperledger Fabric version 1.0. The service enables developers to quickly build and host security-rich production blockchain networks on the IBM Cloud, underpinned by IBM LinuxONE.

Maersk and IBM transform global trade with blockchain

LinuxONE, a dedicated z-based Linux system with as much security as any commercial platform is likely to have, should play a central role in blockchain networks. The machine also delivers all the -ities the z is renowned for: scalability, availability, flexibility, manageability, and more.

The Linux Foundation’s open source Hyperledger Fabric v1.0 is being developed by members of the Hyperledger consortium alongside other open source blockchain technologies. The Hyperledger consortium’s Technical Steering Committee recently promoted Fabric from incubator to active state, and it is expected to be available in the coming weeks. It is designed to provide a framework for enterprise-grade blockchain networks that can transact at over 1,000 transactions per second.

Safety and security are everything with blockchain, which means blockchain networks are only as safe as the infrastructures on which they reside, hence the underpinning on LinuxONE. In addition, IBM’s High Security Business Network brings an extremely secure Linux infrastructure that, according to IBM, integrates security from the hardware up through the software stack. It is designed specifically for enterprise blockchains by providing:

  • Protection from insider attacks – helps safeguard entry points on the network and fight insider threats from anyone with system administrator credentials
  • The industry’s highest certified level of isolation for a commercial system – Evaluation Assurance Level (EAL5+) certification is critical in highly regulated industries such as government, financial services, and healthcare to prevent the leakage of information from one party’s environment to another
  • Secure Service Containers – to help protect code throughout the blockchain application, effectively encapsulating the blockchain into a virtual appliance and denying access even to privileged users
  • Tamper-responsive hardware security modules – to protect the storage of cryptographic keys and encrypted data. These modules are certified to FIPS 140-2 Level 4, the highest level of security certification available for cryptographic modules
  • A highly auditable operating environment – comprehensive, immutable log data supports forensics, audit, and compliance

IBM also announced today the first commercially available blockchain governance tools, and new open-source developer tools that automate the steps it takes to build with the Hyperledger Fabric, reportedly speeding the process from weeks to days.

The new blockchain governance tools also make it easy to set up a blockchain network and assign roles and levels of visibility from a single dashboard. They help network members set rules, manage membership, and enforce network compliance once the network is up and running.

This seems straightforward enough. Once setup is initiated, members can determine the rules of the blockchain and share consent when new members request to join the network. In addition, the deployment tool assigns each network a Network Trust Rating of 1 to 100. New network members can view this before joining and determine whether or not they can trust the network enough to participate. Organizations can also take steps to improve their Trust Ratings before moving into production.

To make it easier for developers to translate business needs from concept to actual code, IBM Blockchain includes a new open-source developer tool for the Hyperledger Fabric called Fabric Composer. Fabric Composer promises to help users model business networks, create APIs that integrate with the blockchain network and existing systems of record, and quickly build a user interface. Fabric Composer also automates tasks that traditionally could take weeks, allowing developers to complete them in minutes instead.

IBM Blockchain for Hyperledger Fabric v1.0 is now available through a beta program on IBM Bluemix. Hyperledger Fabric also is available on Docker Hub as an IBM-certified image available for download at no cost.

At this point, IBM has over 25 publicly named blockchain projects underway. They address everything from carbon asset management to consumer digital ID, post-trade derivatives processing, last-mile shipping, supply chain food safety, provenance, securities lending, and more, with new projects seemingly added nearly every week.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Please follow DancingDinosaur on Twitter, @mainframeblog. See more of his IT writing at technologywriter.com and here.

IBM and Northern Trust Collaborate on Blockchain for Private Equity Markets

March 3, 2017

At a briefing for IT analysts, IBM laid out how it sees blockchain working in practice. Surprisingly, the platform for the Hyperledger effort was not x86 but LinuxONE, due to its inherent security. As the initiative grows, the z-based LinuxONE can also deliver the performance, scalability, and reliability the effort eventually will need.

IBM describes its collaboration with Northern Trust and other key stakeholders as the first commercial deployment of blockchain technology for the private equity market. As that market stands now, the infrastructure supporting private equity has seen little innovation in recent years, even as investors seek greater transparency, security, and efficiency. Enter the open LinuxONE platform, the Hyperledger fabric, and Unigestion, a Geneva, Switzerland-based asset manager with $20 billion in assets under management.

IBM Chairman and CEO Ginni Rometty discusses blockchain at Sibos 2016 in Geneva, Switzerland, September 28, 2016 (Feature Photo Service)

The new initiative, as IBM explains it, promises a new and comprehensive way to access and visualize data. Blockchain captures and stores information about every transaction and investment as metadata. It also captures details about relevant documents and commitments. Hyperledger itself is a logging tool that creates an immutable record.

The Northern Trust effort connects business logic, legacy technology, and blockchain technology using a combination of Java/JavaScript and IBM’s blockchain product. It runs on IBM Bluemix (cloud) using IBM’s Blockchain High Security Business Network. It also relies on key management to ensure record/data isolation and enforce geographic jurisdiction. In the end it facilitates managing the fund lifecycle more efficiently than the previous, primarily paper-based process.

More interesting to DancingDinosaur is the selection of the z through LinuxONE and blockchain’s use of storage. To begin with, blockchain is not really a database. It is more like a log file, but even that is not quite accurate because “it is a database you play as a team sport,” explained Arijit Das, Senior Vice President, FinTech Solutions, at the analyst briefing. That means you don’t perform any of the usual database functions; no deletes or updates, just appends.
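
To make the append-only point concrete, here is a minimal sketch of a hash-chained ledger in Python (purely illustrative; it is not IBM’s or Hyperledger’s actual code, and the field names are invented). Each new record carries the hash of the record before it, so the only safe operation is append; altering or deleting an earlier entry would break every hash that follows.

    import hashlib
    import json

    class Ledger:
        """A toy append-only ledger: entries are chained by hash, never updated or deleted."""

        def __init__(self):
            self.entries = []

        def append(self, payload):
            # Link the new entry to the hash of the previous one.
            prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
            entry = {"index": len(self.entries), "prev_hash": prev_hash, "payload": payload}
            entry["hash"] = hashlib.sha256(
                json.dumps(entry, sort_keys=True).encode()
            ).hexdigest()
            self.entries.append(entry)
            return entry["hash"]

    ledger = Ledger()
    ledger.append({"fund": "PE-Fund-1", "action": "capital call", "amount": 500000})
    ledger.append({"fund": "PE-Fund-1", "action": "commitment", "amount": 2000000})
    # There is no update() or delete(): changing entry 0 would invalidate entry 1's prev_hash.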

Since blockchain is an open technology, you actually could do it on any x86 Linux machine, but DancingDinosaur readers probably wouldn’t want to do that. Blockchain essentially ends up being a distributed group activity, and LinuxONE is unusually well optimized for the necessary security. It also brings scalability, reliability, and high performance along with the rock-solid security of the latest mainframe. In general LinuxONE can handle 8,000 virtual servers in a single system and tens of thousands of containers. Try doing that with an x86 machine or even dozens of them. You can read more of what DancingDinosaur wrote about LinuxONE when it was introduced here and here.

But you won’t need nearly that scalability with the private equity application, at least at first. Blockchain gets more interesting when you think about storage. Blockchain has the potential to generate massive numbers of files fast, but that will only happen when it is part of, say, a supply chain with hundreds, or more likely thousands, of participating nodes on the chain and those nodes are very active. For private equity trading, certainly at the start, blockchain will more likely handle gigabytes of data, and maybe only megabytes at first. This is not going to generate much revenue for IBM storage. A little bit of flash could probably do the trick.

The legal and administrative processes that support private equity today are time consuming and expensive, according to Peter Cherecwich, president of Corporate & Institutional Services at Northern Trust. They lack transparency, while inefficient market practices lead to lengthy, duplicative, and fragmented investment and administration processes. Northern Trust’s solution based on blockchain and Hyperledger, however, promises to deliver a significantly enhanced and efficient approach to private equity administration.

Just don’t expect to see overnight results. In fact, you can expect more inefficiency since the new blockchain/Hyperledger-based system is running in parallel with the disjointed manual processes. Previous legacy systems remain; they are not yet being replaced. Still, IBM insists that blockchain is an ideal technology to bring innovation to the private equity market, allowing Northern Trust to improve traditional business processes at each stage to deliver greater transparency and efficiency. Guess we’ll just have to wait and watch.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Please follow DancingDinosaur on Twitter, @mainframeblog. See more of his IT writing at technologywriter.com and here.

 

Arcati 2017 Mainframe Survey—Cognitive a No-Show

February 2, 2017

DancingDinosaur checks into Arcati’s annual mainframe survey every few years. You can access a copy of the 2017 report here.  Some of the data doesn’t change much, a few percentage points here or there. For example, 75% of the respondents consider the mainframe too expensive. OK, people have been saying that for years.

On the other hand, 65% of the respondents’ mainframes are involved with web services. Half also run Java-based mainframe apps, up from 30% last year, while 17% more are planning to run Java on their mainframe this year. Similarly, 35% of respondents report running Linux on the mainframe, up from 22% last year. Again, 13% of the respondents expect to add Linux this year. Driving this are the cost and management benefits that result from consolidating distributed Linux workloads on the z. Yes, things are changing.

The biggest surprise for DancingDinosaur, however, revolved around IBM’s latest strategic initiatives, especially cognitive computing and blockchain. Other strategic initiatives may include, depending on who is briefing you at the moment: security, data analytics, cloud, hybrid cloud, and mobile. These strategic imperatives, especially cognitive computing, are expected to drive IBM’s revenue. In the latest statement, reported last week in DancingDinosaur, strategic imperatives amounted to 41% of revenue. Cloud revenue and cloud-as-a-service also rose considerably, 35% and 61% respectively.

When DancingDinosaur searched the accompanying Arcati vendor report (over 120 vendors with brief descriptions) for cognitive, only GT Software came up. IBM didn’t even mention cognitive in its vendor listing, which admittedly was skimpy. The case was the same with blockchain; only one vendor, Atos, mentioned it, and there was nothing about blockchain in the IBM listing. More vendors, however, noted supporting one or some of the other supposed strategic initiatives.

Overall, the Arcati survey is quite positive about the mainframe. The survey found that 50 percent of sites viewed their mainframe as a legacy system (down from last year’s 62 percent). However, 22 percent (up from 16 percent last year) viewed mainframe as strategic, with 28 percent (up from 22 percent) viewing mainframes as both strategic and legacy.

Reinforcing the value of the mainframe, the survey found 78 percent of sites experienced some kind of increase in capacity. With increased demand for mainframe resources (data and processing), it should not be surprising that 81 percent of respondents report an increase in technology costs. Yet, 38 percent of sites report their people costs have decreased or stayed the same.

Unfortunately, the survey also found that 70 percent of respondents thought there was a cultural barrier between mainframe and other IT professionals. That did not discourage respondents from pointing out the mainframe advantages: 100 percent highlighted the benefit of the mainframe’s availability, 83 percent highlighted security, 75 percent identified scalability, and 71 percent picked manageability as a mainframe benefit.

Also, social media runs on the mainframe. Respondents found social media (Facebook, Twitter, YouTube) useful for their work on the mainframe. Twenty-seven percent report using social (up slightly from 25 percent last year) with the rest not using it at all despite IBM offering Facebook pages dedicated to IMS, CICS, and DB2. DancingDinosaur, only an occasional FB visitor, will check it out and report.

In terms of how mainframes are being used, the Arcati survey found that 25 percent of sites are planning to use Big Data; five percent of sites have adopted DevOps, while 48 percent are planning to use mainframe DevOps going forward. Similarly, 14 percent of respondents already are reusing APIs while another 41 percent are planning to.

Arcati points out another interesting thought: the survey showed a 55:45 percent split in favor of distributed systems. So you might expect the spend on the two types of platform to be similar. Yet the survey found that 87 percent of an organization’s IT spend was going to distributed systems! Apparently mainframes aren’t as expensive as people think. Or to put it another way, the cost of owning and operating distributed systems with mainframe-caliber QoS amounts to a lot more than people are admitting.
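
A quick back-of-the-envelope calculation shows why that 87 percent figure is so striking. Assuming the 55:45 split refers to workload share (distributed versus mainframe), the illustrative arithmetic below compares spend per unit of workload on each platform.

    # Illustrative arithmetic based on the Arcati figures cited above.
    distributed_workload, mainframe_workload = 0.55, 0.45  # assumed workload shares
    distributed_spend, mainframe_spend = 0.87, 0.13        # shares of IT spend

    dist_cost_per_unit = distributed_spend / distributed_workload  # ~1.58
    mf_cost_per_unit = mainframe_spend / mainframe_workload        # ~0.29

    print(round(dist_cost_per_unit / mf_cost_per_unit, 1))         # ~5.5

On those assumptions, each unit of distributed workload soaks up roughly five and a half times the spend of the equivalent mainframe workload, which is the point the survey numbers are making.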

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Please follow DancingDinosaur on Twitter, @mainframeblog. See more of his IT writing at technologywriter.com and here.

 

BMC Mainframe Survey Confirms z System Is Here to Stay

November 11, 2016

No surprise there. BMC’s 11th annual mainframe survey covering 1,200 mainframe executives and tech professionals found 58% of respondents reported usage of the mainframe is increasing as they look to capitalize on every infrastructure advantage it provides and add more workloads. Another 23% consider the mainframe as the best option to run critical work.

IBM z10

Driving the continuing interest in the mainframe are the new demands for data handling, scalable processing, analytics, and more. According to the BMC survey nearly 60% of companies are seeing increased data and transaction volumes. They opt to stay with the mainframe for its highly secure, superior data handling and transaction serving, particularly as digital business adds unpredictability and volatility to workloads.

Overall, respondents fell into three primary groups: 1) entrenched mainframe shops (58%) that are on board for the long haul; 2) shops (23%) that intend to maintain a steady amount of work on the mainframe; and 3) shops (19%) that are moving away from the mainframe. The first two groups, the committed mainframe shops, amount to just over 80% of the survey respondents.

Many companies surveyed are focused on addressing the increased workload demands, especially the rapidly growing demand for new applications. But surprisingly, the survey does not directly touch on hybrid cloud, cognitive computing, or any of the latest technologies IBM has been promoting, not even DevOps, which can streamline mainframe application development and deployment. “We are not hearing much about hybrid cloud environments or blockchain yet. Most companies seem to be in the early tire-kicking stage,” observed John McKenny, BMC Vice President, Strategy and Operations.

Eighty-eight percent of companies in the first group, the entrenched mainframe shops, for example, are looking to increase the workloads they run in Java on the mainframe, primarily to address new application demands. It also doesn’t hurt that Java on the mainframe can help lower data center costs by directing workloads to lower cost assist processors.

Other interesting BMC survey findings:

  • Half of the respondents report keeping 50% of their data on the mainframe and continue to invest in the platform for reasons you already know—security, availability, data serving capability
  • Continued steady growth of Linux in production on the z: 41% in 2014, 48% in 2015, 52% in 2016
  • Increased use of Java on the mainframe, with 67% of respondents citing the need to meet growing application demand

Those looking to reduce mainframe presence cited three reasons: 1) perception of high cost, 2) outdated management understanding, and 3) looking for ways to reduce workloads over time.  DancingDinosaur has spoken with mainframe shops intending to migrate off the z and they cite the usual reasons, especially #1 above.

Top mainframe priorities for 2016, according to the BMC survey: cost reduction/optimization (65%); data privacy, compliance, security (50%); application availability (49%); application modernization (41%). Responses indicated the priorities for next year haven’t changed at all.

Surprisingly, many of the latest technologies for the z that IBM has touted recently have not yet shown up in the BMC survey responses, except maybe Java and Linux. This would include hybrid clouds, blockchain, IoT, and cognitive computing. IDC, for example, already is projecting cognitive computing to grow at a CAGR of 55.1% from 2016 to 2020. For z shops, however, cognitive computing appears almost invisible.

In some cases with surveys like this you need to read between the lines. Where respondents report changes in activity levels driving application growth, growing interest in Java, more frequent application changes, or references to operational analytics, they’re making oblique references to mobile, big data, or even cognitive computing and other recent technologies for the z.

At its best, the BMC survey notes that digital technologies are transforming the ways in which mainframe shops conduct business and interact with their customers. Adds BMC mainframe customer Credit Suisse: “IT departments are moving toward centralized, virtualized, and highly automated environments. This is being pursued to drive cost and processing efficiencies. Many companies realize that the Mainframe has provided these benefits for many years and is a mature and stable environment,” said Frank Cortell, Credit Suisse Director of Information Technology.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Please follow DancingDinosaur on Twitter, @mainframeblog. See more of his IT writing at technologywriter.com and here.

 

 

IBM 3Q16 Results Telegraph a New z System in 2017

October 27, 2016

DancingDinosaur usually doesn’t like to read too much into the statements of IBM suits at financial briefings. This has been especially true since IBM introduced a new presentation format this year to downplay its platform business and emphasize its strategic imperatives. (Disclaimer: DancingDinosaur is NOT a financial analyst but a technology analyst.)

But this quarter the CFO said flat out: “Our z Systems results reflect a product cycle dynamic, seven quarters into the z13 cycle; revenue was down while margins continue to expand. We continue to add new clients to the platform and we are introducing new technologies like block chain. We announced new services to make it easier to build and test block chain networks in a secure environment as we build our block chain platform it’s been engineered to run on multiple platforms but is optimized for scale, security and resilience on both the IBM mainframe and the IBM cloud.”

LinuxONE Emperor

If you parse the first sentence–reflect a product cycle dynamic–he is not too subtly hinting that IBM needs a z System refresh if it wants to stem the revenue decline on the z. You don’t have to be a genius to expect a new z, probably the z14, in 2017. Pictured above is the LinuxONE Emperor, a z optimized to run Linux. The same suit said “We’ve been shifting our platform to address Linux, and in the third quarter Linux grew at a double digit rate, faster than the market.” So based on that we can probably guess that the z14 (or whatever it will be called) will run z/OS, followed shortly by a LinuxONE version to further expand the z System’s Linux footprint.

Timothy Prickett Morgan picked that up too and more. He expects a z14 processor complex will be announced next year around the same time that the Power9 chip ships. In both cases, Power and z customers who can wait will wait, or, if they are smart, will demand very steep discounts on current Power8 hardware to make up for the price/performance improvements that are sure to accompany the upcoming Power9 and z machines.

When it comes to revenue, 3Q16 was at best flat and actually was down again overall. The bright spot again was IBM’s strategic imperatives. As the suit stated: “In total, we continue to deliver double-digit revenue growth in our strategic imperatives, led by our cloud business. Specifically, cognitive solutions were up 5% and, within that, solution software was up 8%.”

Overall, growth in IBM’s strategic imperatives rose 15%. Over the last 12 months, strategic imperatives delivered nearly $32 billion in revenue and now represent 40% of IBM. The suit also emphasized strong performance in IBM’s cloud offerings which increased over 40%, led by the company’s as-a-service offerings. IBM ended the third quarter with an as-a-service run rate of $7.5 billion, up from $6.7 billion last quarter. Most of that was attributed to organic growth, not acquisitions. Also strong was IBM’s revenue performance in security and mobile. In addition, the company experienced growth in its analytic offerings, up 14% this quarter with contributions from the core analytics platform, especially the Watson platform, Watson Health, and Watson IoT.

IBM apparently is convinced that cognitive computing, defined as using data and adding intelligence into products and services to help companies make better decisions, is the wave of the future. As the company sees it, real value lies in providing cognitive capabilities via the IBM cloud. A critical element of its strategy is IBM’s industry focus. Initially industry platforms will address two substantial opportunity areas, financial services and block chain solutions. You can probably add healthcare too.

Blockchain may emerge as the sleeper, although DancingDinosaur has long been convinced that blockchain is ideal for z shops—the z already handles the transactions and delivers the reliability, scalability, availability, and security to do it right.  As IBM puts it, “we believe block chain has the potential to do for trusted transactions what the Internet did for information.” Specifically, IBM is building a complete block chain platform and is now working with over 300 clients to pioneer block chain for business, including CLS, which settles $5 trillion per day in the currency markets, to implement a distributed ledger in support of its payment netting service, and Bank of Tokyo Mitsubishi, for smart contracts to manage service level agreements and automate multi party transactions.

Says Morgan: “IBM is very enthusiastic about using Blockchain in commercial transaction processing settings, and has 40 clients testing it out on mainframes, but this workload will take a long time to grow. Presumably, IBM will also push Blockchain on Power as well.”  Morgan may be right about blockchain coming to Power, but it is a natural for the z right now, whether as a new z14 or a new z-based LinuxONE machine.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghostwriter. Please follow DancingDinosaur on Twitter, @mainframeblog. See more of his IT writing at technologywriter.com and here.

 

OpenCAPI, Gen-Z, CCIX Initiate a New Computing Era

October 20, 2016

The next generation data center will be a more open, cooperative, and faster place judging from the remarkably similar makeup of three open consortia, OpenCAPI, Gen-Z, and CCIX. CCIX allows processors based on different instruction set architectures to extend their cache coherency to accelerators, interconnect, and I/O.

OpenCAPI provides a way to attach accelerators and I/O devices with coherence and virtual addressing to eliminate software inefficiency associated with the traditional I/O subsystem, and to attach advanced memory technologies.  The focus of OpenCAPI is on attached devices primarily within a server. Gen-Z, announced around the same time, is a new data access technology that primarily enables read and write operations among disaggregated memory and storage.

open-power-rethink-datacenter

Rethink the Datacenter

It’s quite likely that your next data center will use all three. The OpenCAPI group includes AMD, Dell EMC, Google, Hewlett Packard Enterprise, IBM, Mellanox Technologies, Micron, NVIDIA and Xilinx. Their new specification promises to enable up to 10X faster server performance with the first products expected in the second half of 2017.

The Gen-Z consortium consists of Advanced Micro Devices, Broadcom, Huawei Technologies, Red Hat, Micron, Xilinx, Samsung, IBM, and Cray. Other founding members are Cavium, IDT, Mellanox Technologies, Microsemi, Seagate, SK Hynix, and Western Digital. They plan to develop a scalable computing interconnect and protocol that will enable systems to keep pace with the rapidly rising tide of data that is being generated and that needs to be analyzed. This will require the rapid movement of high volumes of data between memory and storage.

The CCIX initial members include Amphenol Corp., Arteris Inc., Avery Design Systems, Atos, Cadence Design Systems, Inc., Cavium, Inc., Integrated Device Technology, Inc., Keysight Technologies, Inc., Micron Technology, Inc., NetSpeed Systems, Red Hat Inc., Synopsys, Inc., Teledyne LeCroy, Texas Instruments, and TSMC.

The basic problem all three address revolves around how to make the volume and variety of new hardware forge fast communications and work together. In effect each group, from its own particular perspective, aims to boost the performance and interoperability of data center servers, devices, and components engaged in generating and handling myriad data and tasked with analyzing large amounts of that data. This will only be compounded as IoT, blockchain, and cognitive computing ramp up.

To a large extent, this results from the inability of Moore’s Law to continue doubling the number of transistors on a chip indefinitely. Future advances must rely on different sorts of hardware tweaks and designs to deliver greater price/performance.

Then in Aug. 2016 IBM announced a related chip breakthrough.  It unveiled the industry’s first 7 nm chip that could hold more than 20 billion tiny switches or transistors for improved computing power. The new chips could help meet demands of future cloud computing and Big Data systems, cognitive computing, mobile products and other emerging technologies, according to IBM.

Most chips today in servers and other devices use microprocessors between 14 and 22 nanometers (nm). The 7nm technology represents at least a 50 percent power improvement. IBM intends to apply the new chips to analyze DNA, viruses, and exosomes. IBM expects to test this lab-on-a-chip technology starting with prostate cancer.

The point of this digression into chips and Moore’s Law is to suggest the need for tools and interfaces like OpenCAPI, Gen-Z, and CCIX. As the use cases for ultra-fast data analytics expand along with the expected proliferation of devices, speed becomes critical. How long do you want to wait for an analysis of your prostate or breast cells? If the cells are dear to you, every nanosecond matters.

For instance, OpenCAPI provides an open, high-speed pathway for different types of technology – advanced memory, accelerators, networking, and storage – to more tightly integrate their functions within servers. This data-centric approach to server design puts the compute power closer to the data and removes inefficiencies in traditional system architectures, helping to eliminate system bottlenecks and significantly improve server performance. In some cases OpenCAPI enables system designers to access memory with sub-500 nanosecond latency.

IBM plans to introduce POWER9-based servers that leverage the OpenCAPI specification in the second half of 2017. Similarly, expect other members of the OpenPOWER Foundation to introduce OpenCAPI-enabled products in the same time frame. In addition, Google and Rackspace’s new server under development, codenamed Zaius and announced at the OpenPOWER Summit in San Jose, will leverage POWER9 processor technology and plans to provide the OpenCAPI interface in its design. Also, Mellanox plans to enable the new specification capabilities in its future products, and Xilinx plans to support OpenCAPI-enabled FPGAs.

As reported at the Gen-Z announcement, “The formation of these new consortia (CCIX, OpenCAPI, and Gen-Z), backed by more than 30 industry-leading global companies, supports the premise that the datacenter of the future will require open standards. We look forward to collaborating with CCIX and OpenCAPI as this new ecosystem takes shape,” said Kurtis Bowman, Gen-Z Consortium president. Welcome to the 7nm computing era.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghostwriter. Please follow DancingDinosaur on Twitter, @mainframeblog. See more of his IT writing at technologywriter.com and here.

 

Put your z System at the Center of Blockchain

October 6, 2016

The z System has been a leading platform for the world’s top banks for decades, and with blockchain the z could capture even more banking and financial services data centers. Two recent IBM Institute for Business Value (IBV) studies show commercial blockchain solutions are being adopted throughout banking and financial markets dramatically faster than initially expected, according to an IBM announcement late in September. Of course, not every blockchain deployment runs on the z, but more should.

Copyright William Mougayer

According to an IBV study, more than 70 percent of early adopters are prioritizing blockchain efforts in order to break down current barriers to creating new business models and reaching new markets. IBV analysts report these respondents are better positioned to defend themselves against competitors, including untraditional disruptors like non-bank startups. The majority of respondents are focusing their blockchain efforts on four areas: clearing and settlement, wholesale payments, equity and debt issuance, and reference data.

But blockchain isn’t just a financial services story. Mougayer identifies government services, healthcare, energy, supply chains, and world trade as blockchain candidates. IoT will also be an important area for blockchain, according to a new book on IoT by Maciej Kranz, an IoT pioneer.

As Kranz explains: blockchain has emerged as a technology that allows a secure exchange of value between entities in a distributed fashion. The technology first appeared on most IT radar screens a few years ago in the form of Bitcoin, a virtual currency that relies on blockchain technology to ensure its security and integrity. Although Bitcoin’s future is still uncertain, blockchain is a completely different story.

Blockchain is attracting considerable attention for its ability to ensure the integrity of transactions over the network between any entities. Automobile companies are considering the technology to authenticate connected vehicles in the vehicle-to-vehicle (V2V) environment, notes Kranz. Still others are looking at blockchain to trace the sources of goods, increase food safety, create smart contracts, perform audits, and do much more. Blockchain also provides a natural complement to IoT security in a wide variety of use cases.

The z, and especially the newest generation of z Systems, is ideal for blockchain work. Zero downtime, industry-leading security, massive I/O, flexibility, high performance at scale, and competitive price/performance, along with its current presence in the middle of most transactions, especially financial transactions, make the z a natural for blockchain.

A key driver for blockchain, especially in the banking and financial services segment, is the Linux Foundation’s Hyperledger project. This entails a collaborative, open source effort to establish an open blockchain platform that will satisfy a variety of use cases across multiple industries to streamline business processes. Through a cross-industry, open standard for distributed ledgers, virtually any digital exchange of value, such as real estate contracts, energy trades, even marriage licenses, can securely and cost-effectively be tracked and traded.

According to Linux Foundation documents, “the Hyperledger Project has ramped up incredibly fast, a testament to how much pent-up interest, potential, and enterprise demand there is for a cross-industry open standard for distributed ledgers.” Linux Foundation members of the Hyperledger Project are moving blockchain technology forward at remarkable speed. IBM has been an early and sizeable contributor of code to the project. It contributed 44,000 lines of code as a founding member.

That it is catching on so quickly in the banking and financial services sector shouldn’t be a surprise either.  What blockchain enables is highly secure and unalterable distributed transaction tracking at every stage of the transaction.  Said Likhit Wagle, Global Industry General Manager, IBM Banking and Financial Markets, when ticking off blockchain advantages: To start, first movers are setting business standards and creating new models that will be used by future adopters of blockchain technology. We’re also finding that these early adopters are better able to anticipate disruption and fight off new competitors along the way.

It is the larger banks leading the charge to embrace blockchain technology with early adopters twice as likely to be large institutions with more than a hundred thousand employees. Additionally, 77 percent of these larger banks are retail banking organizations.

As the IBV surveys found, trailblazers expect the benefits from blockchain technology to impact several business areas, including reference data (83 percent), retail payments (80 percent) and consumer lending (79 percent). When asked which blockchain-based new business models could emerge, 80 percent of banks surveyed identified trade finance, corporate lending, and reference data as having the greatest potential.

IBM is making it easy to tap blockchain by offering it through Docker containers, as a signed and certified distribution of IBM’s code submission to Hyperledger, and through Bluemix services. As noted above, blockchain is a natural fit for the z and LinuxONE. To that end, Bluemix Blockchain Services and a fully integrated DevOps tool are System z- and IoT-enabled.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghostwriter. Please follow DancingDinosaur on Twitter, @mainframeblog. See more of his IT writing at technologywriter.com and here.

 

IBM Discounts z/OS Cloud Activity

August 12, 2016

The latest iteration of IBM’s z/OS workload pricing aims to lower the cost of running cloud workloads. In a recent announcement, z Systems Workload Pricing for Cloud (zWPC) for z/OS seeks to minimize the impact of new public cloud workload transaction growth on Sub-Capacity license charges. IBM did the same thing with mobile workloads when they started driving up the 4-hour workload averages on the z. As more z workloads interact with public clouds this should start to add up, if it hasn’t already.

Bluemix Garages in the Cloud

As IBM puts it: zWPC applies to any organization that has implemented Sub-Capacity pricing via the basic AWLC or AEWLC pricing mechanisms for the usual MLC software suspects. These include z/OS, CICS, DB2, IMS, MQ and WebSphere Application Server (WAS).  An eligible transaction is one classified as Public Cloud-originated, connecting to a z/OS hosted transactional service and/or data source via a REST or SOAP web service.  Public cloud workloads are defined as transactions processed by named Public cloud application transactions identified as originating from a recognized Public Cloud offering, including but not limited to, Amazon Web Services (AWS), Microsoft Azure, IBM Bluemix, and more.

IBM appears to have simplified how you identify eligible workloads. As the company notes: zWPC does not require you to isolate the public cloud work in separate partitions, but rather offers an enhanced way of reporting. The z/OS Workload Manager (WLM) allows clients to use WLM classification rules to distinguish cloud workloads, effectively easing the data collection requirements for public cloud workload transactions.

So how much will you save? It reportedly reduces eligible hourly values by 60 percent. The discount produces an adjusted Sub-Capacity value for each reporting hour. What that translates into on your monthly IBM software invoice once all the calculations and fine print are considered amounts to a guess at this point. But at least you’ll save something. The first billing eligible under this program starts Dec. 1, 2016.
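
As a rough illustration of how that 60 percent reduction plays out (hypothetical numbers; the actual invoice math involves the full AWLC/AEWLC machinery and the 4-hour rolling average), consider a reporting hour in which part of the measured MSU consumption qualifies as public-cloud-originated work:

    # Hypothetical zWPC reporting hour (illustrative only).
    total_msus = 1000          # measured MSU consumption for the hour
    cloud_eligible_msus = 300  # portion WLM classified as public-cloud-originated

    discount = 0.60            # zWPC reportedly cuts eligible hourly values by 60%
    adjusted_cloud_msus = cloud_eligible_msus * (1 - discount)       # 120

    adjusted_hourly_value = (total_msus - cloud_eligible_msus) + adjusted_cloud_msus
    print(adjusted_hourly_value)  # 820 MSUs feed the Sub-Capacity calculation instead of 1000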

DancingDinosaur expects IBM to eventually follow with discounted z/OS workload pricing for IoT and blockchain transactions and maybe even cognitive activity. Right now the volume of IoT and blockchain activity is probably too low to impact anybody’s monthly license charges. Expect those technologies to ramp up in coming years, with many industry pundits projecting huge numbers—think billions and trillions—that will eventually impact the mainframe data center and associated software licensing charges.

Overall, Workload License Charges (WLC) constitute a monthly software license pricing metric applicable to IBM System z servers running z/OS or z/TPF in z/Architecture (64-bit) mode. The driving principle of WLC amounts to pay-for-what-you-use, a laudable concept. In effect it lowers the cost of incremental growth while further reducing software costs by proactively managing associated peak workload utilization.

Generally, DancingDinosaur applauds anything IBM does to lower the cost of mainframe computing.  Playing with workload software pricing in this fashion, however, seems unnecessary. Am convinced there must be simpler ways to lower software costs without the rigmarole of metering and workload distribution tricks. In fact, a small mini-industry has cropped up among companies offering tools to reduce costs, primarily through various ways to redistribute workloads to avoid peaks.

A set of modifications to WLC, including variable WLC (VWLC), AWLC (Advanced), and EWLC (Entry), aligns with most of the z machines introduced over the past couple of years. The result, according to IBM, forms a granular cost structure based on MSU (CPU) capacity that applies to VWLC and associated pricing mechanisms.

From there you can further tweak the cost by deploying Sub-Capacity and Soft Capping techniques. Defined Capacity (DC), according to IBM, allows the sizing of an LPAR in MSU such that the LPAR will not exceed the designated MSU amount. Group Capacity Limit (GCL) extends the Defined Capacity principle for a single LPAR to a group of LPARs, allowing MSU resources to be shared accordingly. BTW, a potential downside of GCL is that one LPAR in the group can consume all available MSUs due to a rogue transaction. Again, an entire mini industry, or maybe not so mini, has emerged to help handle workload and capacity pricing on the z.
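
A simple sketch (conceptual only; this is not how WLM or soft capping is actually implemented) illustrates that GCL downside: the group cap bounds combined MSU consumption, but nothing in the cap itself stops a single runaway LPAR from grabbing most of the shared headroom.

    # Conceptual illustration of a Group Capacity Limit shared by three LPARs.
    group_capacity_limit = 400  # MSUs available to the whole group
    demand = {"LPAR1": 350, "LPAR2": 80, "LPAR3": 60}  # LPAR1 has a rogue workload

    remaining = group_capacity_limit
    granted = {}
    for lpar, want in demand.items():  # naive first-come, first-served allocation
        granted[lpar] = min(want, remaining)
        remaining -= granted[lpar]

    print(granted)  # {'LPAR1': 350, 'LPAR2': 50, 'LPAR3': 0}
    # The group never exceeds its cap, but LPAR1's rogue demand starves LPAR2 and LPAR3.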

At some point in most of the conference pricing sessions the eyes of many attendees glaze over.  By Q&A time the few remaining pop up holding a copy of a recent invoice and ask what the hell this or that means and what the f$#%@#$ they can do about it.

Have to admit that DancingDinosaur did not attend the most recent SHARE conference, where pricing workshops can get quite energetic, so cannot attest to the latest fallout. Still, the general trend with mobile and now with cloud pricing discounts should be lower costs.

DancingDinosaur is Alan Radding, a veteran information technology analyst and writer. Please follow DancingDinosaur on Twitter, @mainframeblog. See more of his IT writing at technologywriter.com and here.

 

IBM 2Q 2016 Report—Where’s z System and POWER?

July 22, 2016

“IBM continues to establish itself as the leading cognitive solutions and cloud platform company,” said Ginni Rometty, IBM chairman, president and chief executive officer, in a statement accompanying the latest IBM 2Q financial report. The strategic imperatives grew: second-quarter revenues from its cloud, analytics, and engagement units increased 12 percent year to year.

IBM Quantum Experience delivered via Cloud (Jon Simon/Feature Photo Service for IBM)

Where’s z and POWER? The z and POWER platforms continued to flounder: revenues of $2.0 billion, down 23.2 percent. Revenue reflects z Systems product cycle dynamics; gross profit margin improved in both z Systems and Power. “Product cycle dynamics” refers to the lack of a new z. In the past year IBM introduced the new LinuxONE and, more recently, a new z13s, essentially what used to be known as a Business Class mainframe.

There is no hint, however, of a new z, a z14 that will drive product dynamics upward. IBM showed a POWER roadmap going all the way out to the POWER10 in 2020 but nothing comparable for the z.

DancingDinosaur, a longtime big iron bigot, remains encouraged by IBM’s focus on its strategic initiatives and statements like this: “And we continue to invest for growth with recent breakthroughs in quantum computing, Internet of Things and blockchain solutions for the IBM Cloud.” IBM strategic initiatives in cloud, mobile, IoT, and blockchain will drive new use of the mainframe, especially as the projected volumes of things, transactions, users, and devices skyrocket.

Second-quarter revenues from the company’s strategic imperatives — cloud, analytics and engagement — increased 12 percent year to year.  Cloud revenues (public, private and hybrid) for the quarter increased 30 percent.  Cloud revenue over the trailing 12 months was $11.6 billion.  The annual run rate for cloud as-a-service revenue — a subset of total cloud revenue — increased to $6.7 billion from $4.5 billion in the second quarter of 2015.  Revenues from analytics increased 5 percent.  Revenues from mobile increased 43 percent and from security increased 18 percent.

IBM indirectly is trying to boost the z and the cloud. CSC and IBM announced an alliance in which IBM will provide CSC Cloud Managed Services for z Systems. CSC already includes IBM SoftLayer as part of its “Service-enabled Enterprise” strategy. “Cloud for z” extends that offering and will be of interest to current and potential mainframe customers in healthcare, insurance, and finance. CSC still sees life in the managed mainframe market, and IBM Global Technology Services, a competitor to CSC, apparently is happy to let them sell managed cloud services for mainframes. All this is taking place as IBM scrambles to secure a leadership share of cloud revenue, and any cloud billing CSC brings will help.

Microsoft, like IBM, claimed big cloud momentum on its fourth quarter conference call, according to a report in Fortune Magazine. It was enough to send Microsoft share price up 4% at one point in after hours trading.

As Fortune notes, for Microsoft as for IBM and other legacy IT providers like Oracle, putting up big cloud numbers is mandatory as more companies change the way they buy IT products. Instead of purchasing hundreds or thousands of new servers or storage boxes every few years, more companies are running their software and storing their data on shared public cloud infrastructure, like Microsoft Azure, Amazon Web Services, the Google Compute Platform, or the IBM Cloud.

For reporting purposes, Microsoft combines Azure with other products in its intelligent cloud product segment. Overall, that segment’s revenue grew about 7% year over year to $6.7 billion from about $6.3 billion.

Oracle, too, is facing the same scramble to establish an enterprise cloud presence. Cloud software as a service (SaaS) and platform as a service (PaaS) revenues were $690 million, up 66% in U.S. dollars. Total Cloud revenues, including infrastructure as a service (IaaS), were $859 million, up 49% in U.S. dollars. At the same time, Oracle’s hardware revenue fell by 7% to $1.3 billion, and its software license revenue fell by 2% to $7.6 billion.

“We added more than 1,600 new SaaS customers and more than 2,000 new PaaS customers in Q4” (which ended in June), said Oracle CEO, Mark Hurd. “In Fusion ERP alone, we added more than 800 new cloud customers. Today, Oracle has nearly 2,600 Fusion ERP customers in the Oracle Public Cloud — that’s ten-times more cloud ERP customers than Workday.”

Hewlett Packard Enterprise (HPE) is the last of the big public enterprise platform vendors, along with IBM and Oracle. (Dell is private and acquired EMC). HPE recently reported its best quarter in years. Second quarter net revenue of $12.7 billion, up 1% from the prior-year period. “Today’s results represent our best performance since I joined in 2011,” said Meg Whitman, president and chief executive officer, Hewlett Packard Enterprise. The businesses comprising HPE grew revenue over the prior-year period on an as-reported basis for the first time in five years.

IBM needs to put up some positive numbers. Seventeen consecutive quarters of declining revenue is boring. Wouldn’t it be exciting if a turnaround started with a new enterprise z14?

DancingDinosaur is Alan Radding, a veteran information technology analyst and writer. Please follow DancingDinosaur on Twitter, @mainframeblog. See more of his IT writing at technologywriter.com and here.

 

IBM Racks Up Blockchain Success

July 15, 2016

It hasn’t even been a year (Dec. 17, 2015) since IBM first publicly introduced its participation in the Linux Foundation’s newest collaborative project, Open Ledger Project, a broad-based Blockchain initiative.  And only this past April did IBM make serious noise publicly about Blockchain on the z, here. But since then IBM has been ramping up Blockchain initiatives fast.

Courtesy of IBM: LinuxONE Rockhopper

Just this week IBM made public its security framework for blockchain, first announced in April, by releasing the beta of IBM’s Blockchain Enterprise Test Network. This enables organizations to easily access a secure, partitioned blockchain network on the cloud to deploy, test, and run blockchain projects.

The IBM Blockchain Enterprise Test Network is a cloud platform built on a LinuxONE system.  Developers can now test four-node networks for transactions and validations with up to four parties.  The Network provides the next level of service for developers ready to go beyond the two-node blockchain service currently available in Bluemix for testing and simulating transactions between two parties. The Enterprise Test Network runs on LinuxONE, which IBM touts as the industry’s most secure Linux server due to the z mainframe’s Evaluation Assurance Level 5+ (EAL5+) security rating.

Also this week, Everledger, a fraud detection system for use with big data, announced it is building a business network using IBM Blockchain for its global certification system designed to track valuable items, such as diamonds, fine art, and luxury goods, through the supply chain.

Things continued to crank up around blockchain with IBM announcing a collaboration with the Singapore Economic Development Board (EDB) and the Monetary Authority of Singapore (MAS). Under this arrangement IBM researchers will work with government, industry, and academia to develop applications and solutions based on enterprise blockchain, cyber-security, and cognitive computing technologies. The effort will draw on the expertise of the Singapore talent pool as well as that of the IBM Research network. The resulting center also is expected to engage with small and medium-sized enterprises to create new applications and grow new markets in finance and trade.

Facilitating this is the cloud. IBM expects new cloud services around blockchain will make these technologies more accessible and enable leaders from all industries to address what is already being recognized as profound and disruptive implications in finance, banking, IoT, healthcare, supply chains, manufacturing, technology, government, the legal system, and more. The hope, according to IBM, is that collaboration with the private sector and multiple government agencies within the same country will advance the use of Blockchain and cognitive technologies to improve business transactions across several different industries.

That exactly is the goal of blockchain. In a white paper from the IBM Institute for Business Value on blockchain, here, the role of blockchain is as a distributed, shared, secure ledger. These shared ledgers write business transactions as an unbreakable chain that forms a permanent record viewable by the parties in a transaction. In effect, blockchain shifts the focus from information held by an individual party to the transaction as a whole, a cross-entity history of an asset or transaction. This alone promises to reduce or even eliminate friction in the transaction while removing the need for most middlemen.

In that way, the researchers report, an enterprise once constrained by complexity can scale without unnecessary friction. It can integrate vertically or laterally across a network or ecosystem, or both. It can be small and transact with super efficiency. Or it can be a coalition of individuals that come together briefly. Moreover, it can operate autonomously, as part of a self-governing, cognitive network. In effect, distributed ledgers can become the foundation of a secure distributed system of trust, a decentralized platform for massive collaboration. And through the Linux Foundation’s Open Ledger Project, blockchain remains open.
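
Continuing the hash-chain sketch from the Northern Trust post above (again a Python illustration, not the Open Ledger Project’s actual code), the property that makes a shared ledger a permanent record is that any party holding a copy can independently re-verify the whole chain, so a single altered entry is immediately detectable:

    import hashlib
    import json

    def entry_hash(entry):
        # Recompute the hash over everything except the stored hash itself.
        body = {k: v for k, v in entry.items() if k != "hash"}
        return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()

    def verify(chain):
        """Return True only if every entry's hash and back-link are intact."""
        for i, entry in enumerate(chain):
            if entry["hash"] != entry_hash(entry):
                return False  # the entry was altered after it was written
            if i > 0 and entry["prev_hash"] != chain[i - 1]["hash"]:
                return False  # the chain link to the previous entry is broken
        return True

Run against the entries produced by the earlier ledger sketch, verify() returns True; change any field of any earlier entry and it returns False, which is what makes the record effectively immutable for every party holding a copy.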

Even at this very early stage there is no shortage of takers ready to push the boundaries of this technology. For example, Crédit Mutuel Arkéa recently announced the completion of its first blockchain project to improve the bank’s ability to verify customer identity. The result is an operational permissioned blockchain network that provides a view of customer identity to enable compliance with Know Your Customer (KYC) requirements. The bank’s success demonstrated the disruptive capabilities of blockchain technology beyond common transaction-oriented use cases.

Similarly, Mizuho Financial Group and IBM announced in June a test of the potential of blockchain for use in settlements with virtual currency. Blockchain, by the way, first gained global attention with Bitcoin, an early virtual currency. By incorporating blockchain technology into settlements with virtual currency, Mizuho plans to explore how payments can be instantaneously swapped, potentially leading to new financial services based on this rapidly evolving technology. The pilot project uses the open source code IBM contributed to the Linux Foundation’s Hyperledger Project.

Cloud-based blockchain running on large LinuxONE clusters may turn out to play a big role in ensuring the success of IoT by monitoring and tracking the activity between millions of things participating in a wide range of activities. Don’t let your z data center get left out; at least make sure it can handle Linux at scale.

DancingDinosaur is Alan Radding, a veteran information technology analyst and writer. Please follow DancingDinosaur on Twitter, @mainframeblog. See more of his IT writing at technologywriter.com and here.

 

