Posts Tagged ‘BlueMix’

IBM Changes the Economics of Cloud Storage

March 31, 2017

Storage tiering used to be simple: active data went to your best high-performance storage, inactive data went to low-cost archival storage, and cloud storage filled in wherever else it was needed. Unfortunately, today’s emphasis on continuous data analytics, near real-time predictive analytics, and now cognitive computing has complicated this picture and the corresponding economics of storage.

In response, last week IBM unveiled new additions to the IBM Cloud Object Storage family. The company is offering clients new choices for archival data and a new pricing model to more easily apply intelligence to unpredictable data patterns using analytics and cognitive tools.

Analytics drive new IBM cloud storage pricing

By now, line of business (LOB) managers, having been exhorted to leverage big data and analytics for years, are listening. More recently, the analytics drumbeat has expanded to include not just big data but sexy IoT, predictive analytics, machine learning, and finally cognitive science. The idea of keeping data around for a few months, then parking it in a long-term archive never to be looked at again until it is finally deleted permanently, just isn’t happening as it was supposed to (if it ever did). The failure to permanently remove expired data can become costly from a storage standpoint as well as risky from an e-discovery standpoint.

IBM puts it this way: Businesses typically have to manage across three types of data workloads: “hot” for data that’s frequently accessed and used; “cool” for data that’s infrequently accessed and used; and “cold” for archival data. Cold storage is often defined as cheaper but slower. For example, if a business uses cold storage, it typically has to wait to retrieve and access that data, limiting the ability to rapidly derive analytical or cognitive insights. As a result, there is a tendency to store data in more expensive hot storage.

IBM’s new cloud storage offering, IBM Cloud Object Storage Flex (Flex), uses a “pay as you use” model of storage tiers, potentially lowering the price by 53 percent compared to AWS S3 IA [1] and 75 percent compared to Azure GRS Cool Tier [2]. (See the footnotes at the bottom of the IBM press release linked to above; IBM, however, is not publishing the actual Flex storage prices.) Flex promises simplified pricing for clients whose data usage patterns are difficult to predict, so organizations can benefit from the cost savings of cold storage for rarely accessed data while maintaining high accessibility to all data.

Of course, you could just lower the cost of storage by permanently removing unneeded data. Simply insist that the data owners specify an expiration date when the storage is set up initially; when that date arrives in 5, 10, or 15 years, automatically delete the data. At least that’s how I was taught eons ago. Of course, storage now costs orders of magnitude less, although storage volumes are orders of magnitude greater and near real-time analytics weren’t in the picture.
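
That discipline is easy to sketch. The following is a toy illustration of the expiration-date approach; the object names, retention periods, and the list standing in for an archive are all invented, with no real storage API behind them:

```python
from datetime import date, timedelta

# A minimal sketch of "specify an expiration date up front, delete on
# arrival." Names and retention periods below are hypothetical.
class ArchivedObject:
    def __init__(self, name, created, retention_years):
        self.name = name
        self.expires = created + timedelta(days=365 * retention_years)

    def expired(self, today):
        return today >= self.expires

archive = [
    ArchivedObject("q1-2007-ledger", date(2007, 3, 31), 10),
    ArchivedObject("q1-2016-ledger", date(2016, 3, 31), 10),
]

today = date(2017, 3, 31)
keep = [o for o in archive if not o.expired(today)]
purge = [o.name for o in archive if o.expired(today)]
print(purge)  # the 2007 ledger has hit its 10-year retention
```

Real object stores offer the same idea as declarative lifecycle rules, so the sweep runs automatically rather than in application code.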

Without the actual rates for the different storage tiers you cannot determine how much Storage Flex may save you.  What it will do, however, is make it more convenient to perform analytics on archived data you might otherwise not bother with.  Expect this issue to come up increasingly as IoT ramps up and you are handling more data that doesn’t need hot storage beyond the first few minutes of its arrival.
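
Without published rates, any comparison is illustrative at best. Here is a minimal sketch of how such a tier comparison works, using invented per-GB rates; the real Flex prices, again, are not public:

```python
# Hypothetical per-GB-month storage rates and per-GB retrieval fees.
# These numbers are placeholders for illustration, not IBM's pricing.
RATES = {
    "hot":  {"store": 0.030, "retrieve": 0.00},
    "cool": {"store": 0.012, "retrieve": 0.01},
    "cold": {"store": 0.004, "retrieve": 0.05},
}

def monthly_cost(tier, stored_gb, retrieved_gb):
    """Storage cost plus retrieval cost for one month, in dollars."""
    r = RATES[tier]
    return stored_gb * r["store"] + retrieved_gb * r["retrieve"]

# A 100 TB archive with 2 TB pulled back for analytics each month:
for tier in RATES:
    print(tier, round(monthly_cost(tier, 100_000, 2_000), 2))
```

The point of a "pay as you use" model is exactly this tradeoff: cheap tiers charge for retrieval, so the break-even depends on how often analytics actually touches the archived data.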

Finally, the IBM Cloud Object Storage Cold Vault (Cold Vault) service gives clients access to cold storage data on the IBM Cloud and is intended to lead its major competitors in cold data recovery times. Cold Vault joins the existing Standard and Vault tiers to complete a range of IBM cloud storage tiers that are available with expanded expertise and methods via Bluemix and through the IBM Bluemix Garages.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Please follow DancingDinosaur on Twitter, @mainframeblog. See more of his IT writing at technologywriter.com and here.

 

IBM Spotlights Blockchain and Hyperledger Fabric at IBM InterCONNECT

March 23, 2017

IBM announced earlier this week Hyperledger Fabric v1.0 beta, with security for regulated industries, governance tools, and over 1,000 transactions per second possible. This represents the first enterprise-ready blockchain service based on the Linux Foundation’s open source Hyperledger Fabric version 1.0. The service enables developers to quickly build and host security-rich production blockchain networks on the IBM Cloud, underpinned by IBM LinuxONE.

Maersk and IBM transform global trade with blockchain

LinuxONE, a dedicated z-based Linux system with as much security as any commercial platform is likely to have, should play a central role in blockchain networks. The machine also delivers all the “itys” the z is renowned for: scalability, availability, flexibility, manageability, and more.

The Linux Foundation’s open source Hyperledger Fabric v1.0 is being developed by members of the Hyperledger consortium alongside other open source blockchain technologies. The Hyperledger consortium’s Technical Steering Committee recently promoted Fabric from incubator to active state, and it is expected to be available in the coming weeks. It is designed to provide a framework for enterprise-grade blockchain networks that can transact at over 1,000 transactions per second.

Safety and security is everything with blockchain, which means blockchain networks are only as safe as the infrastructures on which they reside, hence the underpinning on LinuxONE. In addition, IBM’s High Security Business Network brings an extremely secure Linux infrastructure that, according to IBM, integrates security from the hardware up through the software stack, specifically designed for enterprise blockchains by providing:

  • Protection from insider attacks – helps safeguard entry points on the network and fight insider threats from anyone with system administrator credentials
  • The industry’s highest certified level of isolation for a commercial system – Evaluation Assurance Level certification of EAL5+ is critical in highly regulated industries such as government, financial services, and healthcare to prevent the leakage of information from one party’s environment to another
  • Secure Service Containers – help protect code throughout the blockchain application, effectively encapsulating the blockchain into a virtual appliance and denying access even to privileged users
  • Tamper-responsive hardware security modules – protect encrypted data and the storage of cryptographic keys. These modules are certified to FIPS 140-2 Level 4, the highest level of security certification available for cryptographic modules
  • A highly auditable operating environment – comprehensive, immutable log data supports forensics, audit, and compliance

IBM also announced today the first commercially available blockchain governance tools, and new open-source developer tools that automate the steps it takes to build with the Hyperledger Fabric, reportedly speeding the process from weeks to days.

The new blockchain governance tools also make it easy to set up a blockchain network and assign roles and levels of visibility from a single dashboard. They help network members set rules, manage membership, and enforce network compliance once the network is up and running.

This seems straightforward enough. Once setup is initiated, members can determine the rules of the blockchain and share consent when new members request to join the network. In addition, the deployment tool assigns each network a Network Trust Rating of 1 to 100. Prospective members can view this rating before joining and decide whether they trust the network enough to participate. Organizations can also take steps to improve their Trust Ratings before moving into production.
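
The join decision described here can be sketched in a few lines; the network names, rating values, and the acceptance threshold below are all invented for illustration:

```python
# Toy sketch of the Network Trust Rating gate: a prospective member
# compares a network's rating (1-100) against its own minimum bar.
def should_join(network_trust_rating, minimum_acceptable=70):
    return network_trust_rating >= minimum_acceptable

# Hypothetical candidate networks and their published ratings:
candidates = {"trade-finance-net": 85, "pilot-sandbox": 40}
joinable = [name for name, rating in candidates.items()
            if should_join(rating)]
print(joinable)  # only the network that clears the 70 threshold
```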

To make it easier for developers to translate business needs from concept to actual code, IBM Blockchain includes a new open-source developer tool for the Hyperledger Fabric called Fabric Composer. Fabric Composer promises to help users model business networks, create APIs that integrate with the blockchain network and existing systems of record, and quickly build a user interface. Fabric Composer also automates tasks that traditionally could take weeks, allowing developers to complete them in minutes instead.

IBM Blockchain for Hyperledger Fabric v1.0 is now available through a beta program on IBM Bluemix. Hyperledger Fabric also is available on Docker Hub as an IBM-certified image available for download at no cost.

At this point, IBM has over 25 publicly named blockchain projects underway. They address everything from carbon asset management to consumer digital ID, post-trade derivatives processing, last-mile shipping, supply chain food safety, provenance, and securities lending, with more seemingly added nearly every week.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Please follow DancingDinosaur on Twitter, @mainframeblog. See more of his IT writing at technologywriter.com and here.

IBM Introduces First Universal Commercial Quantum Computers

March 9, 2017

A few years ago DancingDinosaur first encountered the possibility of quantum computing. It was presented as a real but distant possibility. This is not something I need to consider I thought at the time.  By the time it is available commercially I will be long retired and probably six feet under. Well, I was wrong.

This week IBM unveiled its IBM Q quantum systems. IBM Q joins Watson and blockchain in delivering the most advanced set of services on the IBM Cloud platform. There are organizations using it now, and DancingDinosaur is still living and working.

IBM Quantum Computing scientists Hanhee Paik (left) and Sarah Sheldon (right) examine the hardware inside an open dilution fridge at the IBM Q Lab

As IBM explains: While technologies that currently run on classical (or conventional) computers, such as Watson, can help find patterns and insights buried in vast amounts of existing data, quantum computers will deliver solutions to multi-faceted problems where patterns cannot be seen because the data doesn’t exist and the possibilities that you need to explore are too enormous to ever be processed by conventional computers.

Just don’t retire your z or Power system in favor of an IBM Q yet. As IBM explained at a recent briefing on quantum computing, the IBM Q universal quantum computers will be able to handle any type of problem that conventional computers do today. However, many of today’s workloads, like online transaction processing, data storage, and web serving, will continue to run more efficiently on conventional systems. The most powerful quantum systems of the next decade will be a hybrid of quantum computers with conventional computers to control logic and operations on large amounts of data.

The most immediate use cases will involve molecular dynamics, drug design, and materials. The new quantum machine, for example, will allow the healthcare industry to design more effective drugs faster and at less cost and the chemical industry to develop new and improved materials.

Another familiar use case revolves around optimization in finance and manufacturing. The problem here comes down to computers struggling with optimization involving an exponential number of possibilities. Quantum systems, noted IBM, hold the promise of more accurately finding the most profitable investment portfolio in the financial industry, the most efficient use of resources in manufacturing, and optimal routes for logistics in the transportation and retail industries.

A quick refresher on the basics of quantum computing: the challenges invariably entail exponential scale. Start with two basic ideas: 1) the uncertainty principle, which states that attempting to observe a quantum state in general disturbs it while yielding only partial information about the state; and 2) entanglement, in which two systems exist in an entangled state that causes them to behave in ways that cannot be explained by supposing each has some state of its own. No more zero or one only.

The basic unit of quantum computing is the qubit. Today IBM is making available a 5-qubit system, which is pretty small in the overall scheme of things, yet large enough to experiment and test some hypotheses; things start getting interesting at 20 qubits. An inflection point, IBM researchers noted, occurs around 50 qubits. At 50-100 qubits people can begin to do some serious work.
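
The exponential scale is easy to demonstrate numerically: an n-qubit register needs 2^n complex amplitudes to describe, and a simple two-qubit entangled (Bell) state already behaves in ways no pair of independent bits can. A small sketch:

```python
import numpy as np

# Why qubit counts matter: describing an n-qubit state takes 2**n
# complex amplitudes, so classical simulation blows up quickly.
def n_amplitudes(qubits):
    return 2 ** qubits

for n in (5, 20, 50):
    print(n, "qubits ->", n_amplitudes(n), "amplitudes")

# Entanglement: the Bell state (|00> + |11>)/sqrt(2). Measuring one
# qubit as 0 forces the other to 0 -- neither has a state of its own.
bell = np.zeros(4)
bell[0] = bell[3] = 1 / np.sqrt(2)
probs = bell ** 2
print(probs)  # measurement probabilities: ~0.5 for |00>, ~0.5 for |11>
```

At 50 qubits the vector already holds about 10^15 amplitudes, which is why IBM researchers call that range an inflection point for classical simulation.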

This past week IBM announced three quantum computing advances. The first is a new API for the IBM Quantum Experience that enables developers and programmers to begin building interfaces between IBM’s existing 5-qubit cloud-based quantum computer and conventional computers, without needing a deep background in quantum physics. You can try the 5-qubit quantum system via IBM’s Quantum Experience on Bluemix here.

IBM also released an upgraded simulator on the IBM Quantum Experience that can model circuits with up to 20 qubits. In the first half of 2017, IBM plans to release a full SDK on the IBM Quantum Experience for users to build simple quantum applications and software programs. For now, only the publicly available 5-qubit quantum system with a web-based graphical user interface is offered, with an upgrade to more qubits coming soon.

The IBM Research Frontiers Institute allows participants to explore applications for quantum computing in a consortium dedicated to making IBM’s most ambitious research available to its members.

Finally, the IBM Q Early Access Systems program allows the purchase of access to a dedicated quantum system hosted and managed by IBM. The initial system offers 15+ qubits, with a fast roadmap promised to 50+ qubits.

“IBM has invested over decades to growing the field of quantum computing and we are committed to expanding access to quantum systems and their powerful capabilities for the science and business communities,” said Arvind Krishna, senior vice president of Hybrid Cloud and director for IBM Research. “We believe that quantum computing promises to be the next major technology that has the potential to drive a new era of innovation across industries.”

Are you ready for quantum computing? Try it today on IBM’s Quantum Experience through Bluemix. Let me know how it works for you.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Please follow DancingDinosaur on Twitter, @mainframeblog. See more of his IT writing at technologywriter.com and here.

IBM and Northern Trust Collaborate on Blockchain for Private Equity Markets

March 3, 2017

At a briefing for IT analysts, IBM laid out how it sees blockchain working in practice. Surprisingly, the platform for the Hyperledger effort was not x86 but LinuxONE due to its inherent security.  As the initiative grows the z-based LinuxONE can also deliver the performance, scalability, and reliability the effort eventually will need too.

IBM describes its collaboration with Northern Trust and other key stakeholders as the first commercial deployment of blockchain technology for the private equity market. As that market stands now, the infrastructure supporting private equity has seen little innovation in recent years, even as investors seek greater transparency, security, and efficiency. Enter the open LinuxONE platform, the Hyperledger fabric, and Unigestion, a Geneva, Switzerland-based asset manager with $20 billion in assets under management.

IBM Chairman and CEO Ginni Rometty discusses how cognitive technology and innovations such as Watson and blockchain have the potential to radically transform the financial services industry at Sibos 2016 in Geneva, Switzerland on Wednesday, September 28, 2016.


The new initiative, as IBM explains it, promises a new and comprehensive way to access and visualize data.  Blockchain captures and stores information about every transaction and investment as metadata. It also captures details about relevant documents and commitments. Hyperledger itself is a logging tool that creates an immutable record.

The Northern Trust effort connects business logic, legacy technology, and blockchain technology using a combination of Java/JavaScript and IBM’s blockchain product. It runs on IBM Bluemix (cloud) using IBM’s Blockchain High Security Business Network. It also relies on key management to ensure record/data isolation and enforce geographic jurisdiction. In the end it facilitates managing the fund lifecycle more efficiently than the previous, primarily paper-based process.

More interesting to DancingDinosaur is the selection of the z through LinuxONE and blockchain’s use of storage.  To begin with, blockchain is not really a database. It is more like a log file, but even that is not quite accurate because “it is a database you play as a team sport,” explained Arijit Das, Senior Vice President, FinTech Solutions, at the analyst briefing. That means you don’t perform any of the usual database functions: no deletes or updates, just appends.
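
That append-only, hash-chained behavior can be sketched in a few lines of Python. This is a toy illustration of the concept, not IBM’s implementation, and the ledger entries are invented:

```python
import hashlib
import json

# A minimal append-only, hash-chained log: no updates, no deletes,
# just appends, with each new entry sealing the one before it.
GENESIS = "0" * 64

def _digest(payload, prev):
    body = json.dumps({"payload": payload, "prev": prev}, sort_keys=True)
    return hashlib.sha256(body.encode()).hexdigest()

def append_block(chain, payload):
    prev = chain[-1]["hash"] if chain else GENESIS
    chain.append({"payload": payload, "prev": prev,
                  "hash": _digest(payload, prev)})

def verify(chain):
    prev = GENESIS
    for block in chain:
        if block["prev"] != prev or block["hash"] != _digest(block["payload"], prev):
            return False
        prev = block["hash"]
    return True

ledger = []
append_block(ledger, "fund commitment: $5M")
append_block(ledger, "capital call: $1M")
print(verify(ledger))                           # True
ledger[0]["payload"] = "fund commitment: $50M"  # tampering with history
print(verify(ledger))                           # False: the chain breaks
```

Because every entry embeds the hash of its predecessor, rewriting history invalidates everything downstream, which is what makes the log immutable in practice.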

Since blockchain is an open technology, you actually could do it on any x86 Linux machine, but DancingDinosaur readers probably wouldn’t want to do that. Blockchain essentially ends up being a distributed group activity, and LinuxONE is unusually well optimized for the necessary security. It also brings scalability, reliability, and high performance along with the rock-solid security of the latest mainframe. In general, LinuxONE can handle 8,000 virtual servers in a single system and tens of thousands of containers. Try doing that with an x86 machine or even dozens. You can read more about LinuxONE in what DancingDinosaur wrote when it was introduced here and here.

But you won’t need near that scalability with the private equity application, at least at first. Blockchain gets more interesting when you think about storage. Blockchain has the potential to generate massive numbers of files fast, but that will only happen when it is part of, say, a supply chain with hundreds, or more likely, thousands of participating nodes on the chain and those nodes are very active. More likely for private equity trading, certainly at the start, blockchain will handle gigabytes of data and maybe only megabytes at first. This is not going to generate much revenue for IBM storage. A little bit of flash could probably do the trick.

Today, the legal and administrative processes that support private equity are time consuming and expensive, according to Peter Cherecwich, president of Corporate & Institutional Services at Northern Trust. They lack transparency, while inefficient market practices lead to lengthy, duplicative, and fragmented investment and administration processes. Northern Trust’s solution based on blockchain and Hyperledger, however, promises to deliver a significantly enhanced and efficient approach to private equity administration.

Just don’t expect to see overnight results. In fact, you can expect more inefficiency since the new blockchain/Hyperledger-based system is running in parallel with the disjointed manual processes. Previous legacy systems remain; they are not yet being replaced. Still, IBM insists that blockchain is an ideal technology to bring innovation to the private equity market, allowing Northern Trust to improve traditional business processes at each stage to deliver greater transparency and efficiency. Guess we’ll just have to wait and watch.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Please follow DancingDinosaur on Twitter, @mainframeblog. See more of his IT writing at technologywriter.com and here.

 

IBM Launches New IoT Collaborative Initiative

February 23, 2017

Collaboration partners can pull hundreds of millions of dollars in new revenue from IoT, according to IBM’s recent IoT announcement. Having reached what it describes as a tipping point in IoT innovation, the company now boasts over 6,000 clients and partners around the world, many of whom now want to join its new global Watson IoT center to co-innovate. Already Avnet, BNP Paribas, Capgemini, and Tech Mahindra will collocate development teams at the IBM Munich center to work on IoT collaborations.


IBM Opens New Global Center for Watson IoT

The IBM center also will act as an innovation space for the European IoT standards organization EEBus.  The plan, according to Harriet Green, General Manager, IBM Watson IoT, Cognitive Engagement and Education, calls for building a new global IoT innovation ecosystem that will explore how cognitive and IoT technologies will transform industries and our daily lives.

IoT and more recently cognitive are naturals for the z System, and Power Systems have been the platform for natural language processing and cognitive computing since Watson won Jeopardy. With the latest enhancements IBM has brought to the z in the form of on-premises cognitive and machine learning, the z should assume an important role as it gathers, stores, and processes IoT data for cognitive analysis. DancingDinosaur first reported on this late in 2014 and again just last week. As IoT and cognitive workloads ramp up on z, don’t be surprised to see monthly workload charges rise.

Late last year IBM announced that car maker BMW will collocate part of its research and development operations at IBM’s new Watson IoT center to help reimagine the driving experience. Now, IBM is announcing four more companies that have signed up to join its special industry “collaboratories” where clients and partners work together with 1,000 Munich-based IBM IoT experts to tap into the latest design thinking and push the boundaries of the possible with IoT.

Let’s look at the four newest participants, starting with Avnet. According to IBM, Avnet, an IT distributor and global IBM partner, will open a new joint IoT Lab within IBM’s Watson IoT HQ to develop, build, demonstrate, and sell IoT solutions powered by IBM Watson. Working closely with IBM’s leading technologists and IoT experts, Avnet also plans to enhance its IoT technical expertise through hands-on training and on-the-job learning. Avnet’s team of IoT and analytics experts will also partner with IBM on joint business development opportunities across multiple industries including smart buildings, smart homes, industry, transportation, medical, and consumer.

As reported by BNP Paribas, Consorsbank, its retail digital bank in Germany, will partner with IBM’s new Watson IoT Center. The company will collocate a team of solution architects, developers and business development personnel at the Watson facility. Together with IBM’s experts, they will explore how IoT and cognitive technologies can drive transformation in the banking industry and help innovate new financial products and services, such as investment advice.

Similarly, global IT consulting and technology services provider Capgemini will collocate a team of cognitive IoT experts at the Watson center. Together they will help customers maximize the potential of Industry 4.0 and develop and take to market sector-specific cognitive IoT solutions. Capgemini plans a close link between its Munich Applied Innovation Exchange and IBM’s new Customer Experience zones to collaborate with clients in an interactive environment.

Finally, Tech Mahindra, the Indian multinational provider of enterprise and communications IT and networking technology, is one of IBM’s Global System Integrators, with over 3,000 specialists focused on IBM technology around the world. The company will locate a team of six developers and engineers within the Watson IoT HQ to help deliver on Tech Mahindra’s vision of generating substantial new revenue based on IBM’s Watson IoT platform. Tech Mahindra will use the center to co-create and showcase new solutions based on IBM’s Watson IoT platform for Industry 4.0 and manufacturing, precision farming, healthcare, insurance and banking, and automotive.

To facilitate connecting the z to IoT, IBM offers a simple recipe. It requires four basic ingredients and four steps: Texas Instruments’ SensorTag, a Bluemix account, IBM z/OS Connect Enterprise Edition, and a back-end service like CICS.  Start by exposing an existing z Systems application as a RESTful API; this is where z/OS Connect Enterprise Edition comes in.  Then enable your SensorTag device for Watson IoT Quick Start. From there, connect the cloud to your on-premises hybrid cloud.  Finally, enable the published IoT data to trigger a RESTful API. Sounds pretty straightforward, but, full disclosure, DancingDinosaur has not tried it, lacking the necessary pieces. If you try it, please tell DancingDinosaur how it works (info@radding.net). Good luck.
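
The last step, IoT data triggering a RESTful API, might look something like the following sketch. The endpoint URL, sensor name, and payload shape are hypothetical placeholders, not the actual z/OS Connect interface:

```python
import json
import urllib.request

# Sketch: a published sensor reading POSTed to a RESTful API that
# z/OS Connect exposes in front of a back-end service like CICS.
# Host, path, and field names below are invented for illustration.
def build_reading(temperature_c):
    return {"sensor": "sensortag-01", "tempC": temperature_c}

def post_reading(temperature_c,
                 url="https://zosconnect.example.com/iot/readings"):
    data = json.dumps(build_reading(temperature_c)).encode()
    req = urllib.request.Request(
        url, data=data,
        headers={"Content-Type": "application/json"}, method="POST")
    with urllib.request.urlopen(req) as resp:  # network call
        return resp.status

# post_reading(21.5)  # run only against a live endpoint
```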

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Please follow DancingDinosaur on Twitter, @mainframeblog. See more of his IT writing at technologywriter.com and here.

 

z System-Power-Storage Still Live at IBM

January 5, 2017

A mid-December briefing by Tom Rosamilia, SVP, IBM Systems, reassured some that IBM wasn’t putting its systems and platforms on the backburner after racking up quarterly financial losses for years. Expect new IBM systems in 2017. A few days later IBM announced that Japan-based APLUS Co., Ltd., which operates credit card and settlement service businesses, selected IBM LinuxONE as its mission-critical system for credit card payment processing. Hooray!


LinuxONE’s security and industry-leading performance will ensure APLUS achieves its operational objectives as online commerce heats up and companies rely on cloud applications to draw and retain customers. Especially in Japan, where online and mobile shopping has become increasingly popular, the use of credit cards has grown, with more than 66 percent of consumers choosing that method for conducting online transactions. And with 80 percent enterprise hybrid cloud adoption predicted by 2017, APLUS is well positioned to connect cloud transactions leveraging LinuxONE. Throw in IBM’s expansion of blockchain capabilities and the APLUS move looks even smarter.

The growth of spending by international visitors, IBM notes, and the emergence of FinTech firms in Japan have led to a diversification of payment methods to which the local financial industry struggles to respond. APLUS, which issues well-known credit cards such as T Card Plus, plans to offer leading-edge financial services by merging groups to achieve lean operations and improved productivity and efficiency. By choosing to update its credit card payment system with LinuxONE infrastructure, APLUS will benefit from an advanced IT environment that supports its business growth by helping provide near-constant uptime. In addition to updating its server architecture, APLUS has deployed IBM storage to manage mission-critical data: the IBM DS8880 mainframe-attached storage that delivers integration with IBM z Systems and LinuxONE environments.

LinuxONE, however, was only one part of the IBM Systems story Rosamilia set out to tell.  There also is the z13s for encrypted hybrid clouds; the z/OS platform for Apache Spark data analytics; and even more secure cloud services via blockchain on LinuxONE, by way of Bluemix or on premises.

z/OS will get attention in 2017 too. “z/OS is the best damn OLTP system in the world,” declared Rosamilia. He went on to imply that enhancements and upgrades to key z systems were coming in 2017, especially CICS, IMS, and a new release of DB2. Watch for new announcements coming soon as IBM tries to push z platform performance and capacity for z/OS and OLTP.

Rosamilia also talked up the POWER story. Specifically, Google and Rackspace have been developing OpenPOWER systems for the Open Compute Project. He also cited new POWER LC servers running POWER8 with the NVIDIA NVLink accelerator, more innovations through the OpenCAPI Consortium, and the teaming of IBM and Nvidia to deliver PowerAI, part of IBM’s cognitive efforts.

As much as Rosamilia may have wanted to talk about platforms and systems, IBM continues to avoid using terms like systems and platforms. So Rosamilia’s real intent was to discuss z and Power in conjunction with IBM’s strategic initiatives.  Remember these: cloud, big data, mobile, analytics. Lately, it seems, those initiatives have been culled down to cloud, hybrid cloud, and cognitive systems.

IBM’s current message is that IT innovation no longer comes from just the processor. Instead, it comes through scaling performance by workload and sustaining leadership through ecosystem partnerships.  We’ve already seen some of the fruits of that innovation through the Power community. Would be nice to see some of that coming to the z too, maybe through the Open Mainframe Project. But that isn’t about z/OS. Any boost in CICS, DB2, and IMS will have to come from the core z team. The Open Mainframe Project is about Linux on z.

The first glimpse we had of this came last spring in a system dubbed Minsky, which was described back then by commentator Timothy Prickett Morgan. With the Minsky machine, IBM is using NVLink ports on the updated Power8 CPU, which was shown in April at the OpenPower Summit and is making its debut in systems actually manufactured by ODM Wistron and rebadged, sold, and supported by IBM. The NVLink ports are bundled up in a quad to deliver 80 GB/sec bandwidth between a pair of GPUs and between each GPU and the updated Power8 CPU.

The IBM version, Morgan describes, aims to create a very brawny node with very tight coupling of GPUs and CPUs so they can better share memory, have fewer overall GPUs, and more bandwidth between the compute elements. IBM is aiming Minsky at HPC workloads, according to Morgan, but there is no reason it cannot be used for deep learning or even accelerated databases.

Is this where today’s z data center managers want to go?  No one is likely to spurn more performance, especially if it is accompanied with a price/performance improvement.  Whether rank-and-file z data centers are queueing up for AI or cognitive workloads will have to be seen. The sheer volume and scale of expected activity, however, will require some form of automated intelligent assist.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Please follow DancingDinosaur on Twitter, @mainframeblog. See more of his IT writing at technologywriter.com and here.

AI and IBM Watson Fuel Interest in App Dev among Mainframe Shops

December 1, 2016

BMC’s 2016 mainframe survey, covered by DancingDinosaur here, both directly and indirectly pointed to increased activity in regard to data center applications. Mainly this took the form of increased interest in Java on the z as a platform for new applications. Specifically, 72% of overall respondents reported using Java today, while 88% reported plans to increase their use of Java. At the same time, the use of Linux on the z has been growing steadily year over year: 41% in 2014, 48% in 2015, and 52% in 2016. Together these trends point to a heightened interest in application development, management, and change.


IBM’s Project DataWorks uses Watson Analytics to create complex visualizations with one line of code

IBM has been feeding this kind of AppDev interest with its continued enhancement of Bluemix and the rollout of the Bluemix Garage method.  More recently, it announced a partnership with Topcoder, a global software development community comprising more than one million designers, developers, data scientists, and competitive programmers, with the aim of stimulating developers looking to harness the power of Watson to create the next generation of AI apps, APIs, and solutions.

According to Forrester VP and Principal Analyst JP Gownder in the IBM announcement, by 2019 automation will change every job category by at least 25%. Additionally, IDC predicts that 75% of developer teams will include cognitive/AI functionality in one or more applications by 2018. The industry is driving toward a new level of computing potential not witnessed since the introduction of big data.

To further drive the cultivation of this new style of developer, IBM is encouraging participation in Topcoder-run hackathons and coding competitions. Here developers can easily access a range of Watson services – such as Conversation, Sentiment Analysis, or speech APIs – to build powerful new tools with the help of cognitive computing and artificial intelligence. Topcoder hosts 7,000 code challenges a year and has awarded $80 million to its community. In addition, now developers will have the opportunity to showcase and monetize their solutions on the IBM Marketplace, while businesses will be able to access a new pipeline of talent experienced with Watson and AI.
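For developers curious what tapping one of those Watson services looked like in practice, here is a minimal sketch of building a request to the Conversation service’s REST API of that era. The workspace ID and credentials are placeholders, not real values, and the endpoint and version string are assumptions based on the Watson Developer Cloud conventions of the time:

```python
# Sketch: constructing a Watson Conversation "message" API call.
# Workspace ID, credentials, and version string are placeholders.
import json

def build_watson_message(workspace_id, text, version="2016-09-20"):
    """Build the URL and JSON payload for a Conversation 'message' call."""
    url = ("https://gateway.watsonplatform.net/conversation/api/v1"
           f"/workspaces/{workspace_id}/message?version={version}")
    payload = {"input": {"text": text}}
    return url, payload

url, payload = build_watson_message("YOUR_WORKSPACE_ID", "What is my balance?")
print(url)
print(json.dumps(payload))
# Sending it would be an HTTP POST with the service's basic-auth
# credentials, e.g. requests.post(url, json=payload, auth=("user", "pass")).
```

The point is the low barrier to entry: a hackathon participant needed only service credentials and a few lines of glue code to reach a cognitive API.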

In addition to a variety of academic partnerships, IBM recently announced an AI Nanodegree program with Udacity to help developers establish a foundational understanding of artificial intelligence.  Plus, IBM offers the IBM Learning Lab, which features more than 100 curated online courses and cognitive use cases from providers like Codecademy, Coursera, Big Data University, and Udacity. And don’t forget IBM developerWorks, which offers how-to tutorials and courses on IBM tools and open standard technologies for all phases of the app dev lifecycle.

To keep the AI development push going, recently IBM unveiled the experimental release of Project Intu, a new system-agnostic platform designed to enable embodied cognition. The new platform allows developers to embed Watson functions into various end-user devices, offering a next generation architecture for building cognitive-enabled experiences.

Project Intu is accessible via the Watson Developer Cloud and also available on Intu Gateway and GitHub. The initiative simplifies the process for developers wanting to create cognitive experiences in various form factors such as spaces, avatars, robots, or IoT devices. In effect, it extends cognitive technology into the physical world. The platform enables devices to interact more naturally with users, triggering different emotions and behaviors and creating more meaningful and immersive experiences for users.

Developers can simplify and integrate Watson services, such as Conversation, Language, and Visual Recognition with the capabilities of the device to act out the interaction with the user. Instead of a developer needing to program each individual movement of a device or avatar, Project Intu makes it easy to combine movements that are appropriate for performing specific tasks like assisting a customer in a retail setting or greeting a visitor in a hotel in a way that is natural for the visitor.

Project Intu is changing how developers make architectural decisions about integrating different cognitive services into an end-user experience – such as what actions the systems will take and what will trigger a device’s particular functionality. Project Intu offers developers a ready-made environment on which to build cognitive experiences running on a wide variety of operating systems – from Raspberry Pi to MacOS, Windows to Linux machines.

With initiatives like these, the growth of cognitive-enabled applications will likely accelerate. As IBM reports, IDC estimates that “by 2018, 75% of developer teams will include Cognitive/AI functionality in one or more applications/services.”  This is a noticeable jump from last year’s prediction that 50% of developers would leverage cognitive/AI functionality by 2018.

For those z data centers surveyed by BMC that worried about keeping up with Java and big data, AI adds yet an entirely new level of complexity. Fortunately, the tools to work with it are rapidly falling into place.

IBM Power System S822LC for HPC Beats Sort Record by 3.3x

November 17, 2016

The new IBM Power System S822LC for High Performance Computing server set a new sorting benchmark, taking less than 99 seconds (98.8 seconds) to finish sorting 100 terabytes of data in the Indy GraySort category and improving on last year’s best result, 329 seconds, by a factor of 3.3. The win proved a victory not only for the S822LC but for the entire OpenPOWER community. The team of Tencent, IBM, and Mellanox has been named the Winner of the Sort Benchmark annual global computing competition for 2016.

Power System S822LC for HPC

Specifically, the machine, an IBM Power S822LC for High Performance Computing (HPC), features NVIDIA NVLink technology optimized for the Power architecture and NVIDIA’s latest GPU technology. The new system supports emerging computing methods of artificial intelligence, particularly deep learning. The combination, newly dubbed IBM PowerAI, provides a continued path for Watson, IBM’s cognitive solutions platform, to extend its artificial intelligence expertise in the enterprise by using several deep learning methods to train Watson.

Actually, Tencent Cloud Data Intelligence (the distributed computing platform of Tencent Cloud) won every category in both the GraySort and MinuteSort benchmarks, establishing four new world records and outperforming the 2015 best speeds by 2-5x. Said Zeus Jiang, Vice President of Tencent Cloud and General Manager of Tencent’s Data Platform Department: “In the future, the ability to manage big data will be the foundation of successful Internet businesses.”

To get this level of performance, Tencent ran 512 IBM OpenPOWER LC servers with Mellanox’s 100Gb interconnect technology, improving the performance of Tencent Cloud big data products with the infrastructure. Online prices for the S822LC start at about $9,600 for a 2-socket, 2U configuration with up to 20 cores (2.9-3.3GHz), 1 TB of memory (32 DIMMs), 230 GB/sec sustained memory bandwidth, 2x SFF (HDD/SSD), 2 TB of storage, 5 PCIe slots (4 CAPI-enabled), and up to 2 NVIDIA K80 GPUs. Be sure to shop for volume discounts.

The 2016 Sort Benchmark results:

Category | 2016 Records (Tencent Cloud) | 2015 World Records | 2016 Improvement
Daytona GraySort | 44.8 TB/min | 15.9 TB/min | 2.8X greater performance
Indy GraySort | 60.7 TB/min | 18.2 TB/min | 3.3X greater performance
Daytona MinuteSort | 37 TB/min | 7.7 TB/min | 4.8X greater performance
Indy MinuteSort | 55 TB/min | 11 TB/min | 5X greater performance
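The figures above are easy to sanity-check: the headline 3.3x follows from last year’s 329 seconds versus this year’s 98.8 seconds, and 100 TB sorted in 98.8 seconds works out to the 60.7 TB/min Indy GraySort entry in the table. A quick back-of-the-envelope check:

```python
# Sanity-check the cited sort-benchmark numbers.
TIME_2015_S, TIME_2016_S, DATA_TB = 329.0, 98.8, 100.0

headline_speedup = TIME_2015_S / TIME_2016_S    # ~3.33x, the headline figure
indy_tb_per_min = DATA_TB / TIME_2016_S * 60    # ~60.7 TB/min, the table entry

table = {  # category: (2016 TB/min, 2015 TB/min)
    "Daytona GraySort": (44.8, 15.9),
    "Indy GraySort": (60.7, 18.2),
    "Daytona MinuteSort": (37.0, 7.7),
    "Indy MinuteSort": (55.0, 11.0),
}
improvements = {k: v2016 / v2015 for k, (v2016, v2015) in table.items()}

print(f"headline: {headline_speedup:.1f}x, Indy GraySort: {indy_tb_per_min:.1f} TB/min")
print({k: f"{v:.1f}x" for k, v in improvements.items()})
```

The computed ratios reproduce the 2.8x, 3.3x, 4.8x, and 5x improvement factors in the table.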

Pretty impressive, huh? As IBM explains it, Tencent Cloud used 512 IBM OpenPOWER servers and Mellanox’s 100Gb interconnect technology, improving the performance of Tencent Cloud big data products with the infrastructure. Then Tom Rosamilia, IBM Senior VP, weighed in: “Industry leaders like Tencent are helping IBM and our OpenPOWER partners push performance boundaries for a cognitive era defined by big data and advanced analytics.” The computing record achieved by Tencent Cloud on OpenPOWER turned out to be an important milestone for the OpenPOWER Foundation too.

Added Amir Prescher, Sr. Vice President, Business Development, at Mellanox Technologies: “Real-time analytics and big data environments are extremely demanding, and the network is critical in linking together the extra high performance of IBM POWER-based servers and Tencent Cloud’s massive amounts of data.” In effect, Tencent Cloud developed an optimized hardware/software platform to achieve new computing records while demonstrating that Mellanox’s 100Gb/s Ethernet technology can deliver total infrastructure efficiency and improve application performance, which should make it a favorite for big data applications.

Behind all of this was the new IBM Power System S822LC for High Performance Computing server. Currently the servers feature a new IBM POWER8 chip designed for demanding workloads including artificial intelligence, deep learning, and advanced analytics.  However, a new POWER9 chip has already been previewed and is expected next year.  Whatever the S822LC can do running POWER8, just imagine how much more it will do running POWER9, which IBM describes as a premier acceleration platform. DancingDinosaur covered POWER9 in early September here.

To capitalize on the hardware, IBM is making a new deep learning software toolkit available, PowerAI, which runs on the recently announced IBM Power S822LC server built for artificial intelligence that features NVIDIA NVLink interconnect technology optimized for IBM’s Power architecture. The hardware-software combination provides more than 2X performance over comparable servers with 4 GPUs running AlexNet with Caffe. The same 4-GPU Power-based configuration running AlexNet with BVLC Caffe can also outperform 8 M40 GPU-based x86 configurations, making it the world’s fastest commercially available enterprise systems platform on two versions of a key deep learning framework.

Deep learning is a fast-growing machine learning method that extracts information by crunching through millions of pieces of data to detect and rank the most important aspects of the data. Widely adopted among leading consumer web and mobile application companies, deep learning is quickly being embraced by more traditional enterprises across a wide range of industry sectors: in banking to advance fraud detection through facial recognition; in automotive for self-driving automobiles; and in retail for fully automated call centers with computers that can better understand speech and answer questions. Is your data center ready for deep learning?
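The data-crunching loop described above can be reduced to a toy: a model repeatedly measures its error on example data and nudges its weights downhill by gradient descent. This is a single logistic “neuron” on synthetic data, not PowerAI or Caffe, but the loop is the same one that GPUs accelerate at scale:

```python
# Toy sketch of the learning loop behind deep learning:
# predict, measure error, adjust weights by gradient descent.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: points above the line y = x are labeled 1, below 0.
X = rng.normal(size=(200, 2))
y = (X[:, 1] > X[:, 0]).astype(float)

w, b, lr = np.zeros(2), 0.0, 0.5

for _ in range(500):                      # training iterations
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # sigmoid prediction
    grad_w = X.T @ (p - y) / len(y)         # gradient of the log-loss
    grad_b = np.mean(p - y)
    w -= lr * grad_w                        # gradient-descent update
    b -= lr * grad_b

accuracy = np.mean((p > 0.5) == y)
print(f"training accuracy: {accuracy:.2f}")
```

Deep networks stack many such layers and run this loop over millions of examples, which is exactly the workload the S822LC’s GPU coupling targets.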

Put your z System at the Center of Blockchain

October 6, 2016

The z System has been a leading platform for the world’s top banks for decades, and with blockchain the z could capture even more banking and financial services data centers. Two recent IBM Institute for Business Value (IBV) studies show commercial blockchain solutions are being adopted throughout banking and financial markets dramatically faster than initially expected, according to an IBM announcement late in September.  Of course, not every blockchain deployment runs on z, but more should.


Copyright William Mougayer

According to an IBV study, more than 70 percent of early adopters are prioritizing blockchain efforts in order to break down current barriers to creating new business models and reaching new markets. IBV analysts report these respondents are better positioned to defend themselves against competitors, including untraditional disruptors like non-bank startups. The majority of respondents are focusing their blockchain efforts on four areas: clearing and settlement, wholesale payments, equity and debt issuance, and reference data.

But blockchain isn’t just a financial services story. Mougayer identifies government services, healthcare, energy, supply chains, and world trade as blockchain candidates. IoT will also be an important area for blockchain, according to a new book on IoT by Maciej Kranz, an IoT pioneer.

As Kranz explains, blockchain has emerged as a technology that allows a secure exchange of value between entities in a distributed fashion. The technology first appeared on most IT radar screens a few years ago in the form of Bitcoin, a virtual currency that relies on blockchain technology to ensure its security and integrity. Although Bitcoin’s future is still uncertain, blockchain is a completely different story.

Blockchain is attracting considerable attention for its ability to ensure the integrity of transactions over the network between any entities. Automobile companies are considering the technology to authenticate connected vehicles in the vehicle-to-vehicle (V2V) environment, notes Kranz. Still others are looking at blockchain to trace the sources of goods, increase food safety, create smart contracts, perform audits, and do much more. Blockchain also provides a natural complement to IoT security in a wide variety of use cases.
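The integrity property driving all this interest comes from a simple idea: each block carries a hash of its contents plus the previous block’s hash, so altering any recorded transaction breaks the chain. A minimal sketch (illustrative only; production platforms like Hyperledger add consensus, digital signatures, and distribution across many nodes):

```python
# Minimal hash-chain sketch of blockchain's tamper-evidence.
import hashlib
import json

def block_hash(prev_hash, transactions):
    # Deterministically hash the block contents plus the previous hash.
    payload = json.dumps({"prev_hash": prev_hash,
                          "transactions": transactions},
                         sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain, transactions):
    prev = chain[-1]["hash"] if chain else "0" * 64
    chain.append({"prev_hash": prev,
                  "transactions": transactions,
                  "hash": block_hash(prev, transactions)})

def verify(chain):
    prev = "0" * 64
    for block in chain:
        # Recompute each hash and check the linkage to the prior block.
        if (block["hash"] != block_hash(block["prev_hash"],
                                        block["transactions"])
                or block["prev_hash"] != prev):
            return False
        prev = block["hash"]
    return True

chain = []
append_block(chain, [{"from": "bank_a", "to": "bank_b", "amount": 100}])
append_block(chain, [{"from": "bank_b", "to": "bank_c", "amount": 40}])
assert verify(chain)

# Altering any recorded transaction breaks the chain of hashes.
chain[0]["transactions"][0]["amount"] = 1_000_000
assert not verify(chain)
```

Because every block’s hash depends on everything before it, a tampered transaction is detectable at every stage, which is precisely the property banks, automakers, and supply chains are after.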

The z, especially the newest generation of z Systems, is ideal for blockchain work. Zero downtime, industry-leading security, massive I/O, flexibility, high performance at scale, and competitive price/performance, along with its current presence in the middle of most transactions, especially financial transactions, make the z a natural for blockchain.

A key driver for blockchain, especially in the banking and financial services segment is the Linux Foundation’s HyperLedger project. This entails a collaborative, open source effort to establish an open blockchain platform that will satisfy a variety of use cases across multiple industries to streamline business processes. Through a cross-industry, open standard for distributed ledgers, virtually any digital exchange of value, such as real estate contracts, energy trades, even marriage licenses can securely and cost-effectively be tracked and traded.

According to Linux Foundation documents, “the Hyperledger Project has ramped up incredibly fast, a testament to how much pent-up interest, potential, and enterprise demand there is for a cross-industry open standard for distributed ledgers.” Linux Foundation members of the Hyperledger Project are moving blockchain technology forward at remarkable speed. IBM has been an early and sizeable contributor of code to the project. It contributed 44,000 lines of code as a founding member.

That it is catching on so quickly in the banking and financial services sector shouldn’t be a surprise either.  What blockchain enables is highly secure and unalterable distributed transaction tracking at every stage of the transaction.  Said Likhit Wagle, Global Industry General Manager, IBM Banking and Financial Markets, when ticking off blockchain advantages: To start, first movers are setting business standards and creating new models that will be used by future adopters of blockchain technology. We’re also finding that these early adopters are better able to anticipate disruption and fight off new competitors along the way.

It is the larger banks leading the charge to embrace blockchain technology, with early adopters twice as likely to be large institutions with more than a hundred thousand employees. Additionally, 77 percent of these larger banks are retail banking organizations.

As the IBV surveys found, trailblazers expect the benefits from blockchain technology to impact several business areas, including reference data (83 percent), retail payments (80 percent) and consumer lending (79 percent). When asked which blockchain-based new business models could emerge, 80 percent of banks surveyed identified trade finance, corporate lending, and reference data as having the greatest potential.

IBM is making it easy to tap blockchain by offering it through Docker containers, as a signed and certified distribution of IBM’s code submission to Hyperledger, and through Bluemix services. As noted above, blockchain is a natural fit for the z and LinuxONE. To that end, Bluemix Blockchain Services and a fully integrated DevOps tool are System z- and IoT-enabled.


 

