Posts Tagged ‘artificial intelligence (AI)’

IBM AI Toolset Focuses on 9 Industries

October 4, 2018

Recently, IBM introduced new AI solutions and services pre-trained for nine industries and professions: agriculture, customer service, human resources, supply chain, manufacturing, building management, automotive, marketing, and advertising. In each area, the volume, velocity, and complexity of the data make it increasingly difficult for managers to keep up. The solutions generally utilize IBM’s Watson Data Platform.

For example, supply chain companies should now incorporate weather data, traffic reports, and even regulatory reports to provide a fuller picture of global supply issues. Similarly, industrial organizations are seeking to significantly reduce product inspection resource requirements through the use of visual and acoustic inspection capabilities, notes IBM.

Recent IBM research from its Institute for Business Value revealed that 82% of businesses are now considering AI deployments. Why? David Kenny, Senior Vice President, IBM Cognitive Solutions, explains: “As data flows continue to increase, people are overwhelmed by the amount of information [forcing them] to act on it every day, but luckily the information explosion coincides with another key technological advance: artificial intelligence (AI).” In the nine industries targeted by IBM, the company provides the industry-specific algorithms and system training required to make AI effective in each segment.

Let’s look at a selection of these industry segments, starting with customer service, where 77% of top-performing organizations see customer satisfaction as a key value driver for AI, which gives customer service agents an increased ability to respond quickly to questions and complex inquiries. The customer service solution was first piloted at Deluxe Corporation, which saw improved response times and increased client satisfaction.

Human resources could also benefit from a ready-made AI solution. The average hiring manager flips through hundreds of applications daily, notes IBM, spending approximately six seconds on each resume. That isn’t nearly enough time to make well-considered decisions. The new AI tool for HR analyzes the backgrounds of current top-performing employees from diverse backgrounds and uses that data to help flag promising applicants.
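The underlying idea, profiling your best people and then scoring incoming applicants against that profile, can be sketched in a few lines. All skill sets, thresholds, and function names below are invented for illustration; IBM has not published the actual model behind the tool.

```python
# Invented sketch: score applicants by how well their skills cover the
# skills common among current top performers. Real HR tools use far
# richer signals than a bag of skill keywords.

from collections import Counter

def top_performer_skills(top_performers, min_share=0.5):
    """Skills held by at least min_share of the top performers."""
    counts = Counter(skill for person in top_performers for skill in person)
    cutoff = min_share * len(top_performers)
    return {skill for skill, n in counts.items() if n >= cutoff}

def score_applicant(applicant_skills, target_skills):
    """Fraction of the target skills this applicant covers, 0.0 to 1.0."""
    if not target_skills:
        return 0.0
    return len(target_skills & set(applicant_skills)) / len(target_skills)

top = [{"python", "sql", "ml"}, {"sql", "ml", "spark"}, {"python", "sql", "ml"}]
target = top_performer_skills(top)  # skills shared by most top performers
flagged_score = score_applicant({"sql", "ml"}, target)
```

A ranked shortlist would simply sort applicants by this score; the six-seconds-per-resume problem becomes a triage problem instead.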

In the area of industrial equipment, AI can be used to reduce product inspection resource requirements significantly by using AI-driven visual and acoustic inspection capabilities. At a time of intense global competition, manufacturers face a variety of issues that impact productivity, including workforce attrition, skills gaps, and rising raw material costs—all exacerbated by downstream defects and equipment downtime. By combining the Internet of Things (IoT) and AI, IBM contends, manufacturers can stabilize production costs by pinpointing and predicting areas of loss, such as energy waste, equipment failures, and product quality issues.
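The “pinpointing and predicting areas of loss” piece often starts with something as simple as flagging sensor readings that break sharply from the recent norm. The sketch below is a deliberately minimal, invented example of that idea (the vibration values, window size, and threshold are all made up); production systems use far richer models.

```python
# Minimal sketch of IoT anomaly detection: flag a reading whose z-score
# against the trailing window exceeds a threshold. Data and parameters
# are invented for illustration only.

import statistics

def flag_anomalies(readings, window=5, z_threshold=3.0):
    """Return indices of readings far outside the trailing window's norm."""
    flagged = []
    for i in range(window, len(readings)):
        recent = readings[i - window:i]
        mean = statistics.mean(recent)
        stdev = statistics.stdev(recent) or 1e-9  # guard against zero spread
        if abs(readings[i] - mean) / stdev > z_threshold:
            flagged.append(i)
    return flagged

# Hypothetical vibration readings from a motor; the spike at index 6
# would be flagged for inspection before it becomes downtime.
vibration = [0.50, 0.52, 0.49, 0.51, 0.50, 0.51, 2.40, 0.50]
alerts = flag_anomalies(vibration)
```

The same trailing-window logic applies whether the signal is vibration, acoustic energy, or power draw, which is why the energy-waste and equipment-failure use cases tend to share infrastructure.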

In agriculture, farmers can use AI to gather data from multiple sources—weather, IoT-enabled tractors and irrigators, satellite imagery, and more—and see a single, overarching, predictive view of data as it relates to a farm. For the individual grower, IBM notes, this means support for making more informed decisions that help improve yield. That matters especially for water, an increasingly scarce resource in large swaths of the world, including parts of the U.S. that have experienced persistent droughts; just remember the recent wildfires.

Subway hopes AI can increase restaurant visits by leveraging the connection between weather and quick-service restaurant (QSR) foot traffic, using The Weather Channel mobile app to drive awareness of its $4.99 Footlong promotion. Subway reported a 31% lift in store traffic and a 53% reduction in campaign waste due to AI.

DancingDinosaur had no opportunity to verify any of the results reported above, so always be skeptical of such results until you can verify them yourself.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at

IBM AI Reference Architecture Promises a Fast Start

August 10, 2018

Maybe somebody in your organization has already fooled around with a PoC for an AI project. Maybe you already want to build it out and even put it into production. Great! According to IBM, by 2020 organizations across a wide array of industries that don’t deploy AI will be in trouble. So those folks already fooling around with an AI PoC will probably be just in time.

To help organizations pull the complicated pieces of AI together, IBM, with the help of IDC, put together its AI Infrastructure Reference Architecture. This AI reference architecture, as IBM explains, is intended to be used by data scientists and IT professionals who are defining, deploying, and integrating AI solutions into an organization. It describes an architecture that will support a promising proof of concept (PoC) or experimental application and sustain growth into production as a multitenant system that can continue to scale to serve a larger organization while integrating into the organization’s existing IT infrastructure. If this sounds like you, check it out. The document is short, under 30 pages, and free.

In truth, AI, for all the wonderful things you’d like to do with it, is more a system vendor’s dream than yours.  AI applications, and especially deep learning systems, which parse exponentially greater amounts of data, are extremely demanding and require powerful parallel processing capabilities. Standard CPUs, like those populating racks of servers in your data center, cannot sufficiently execute AI tasks. At some point, AI users will have to overhaul their infrastructure to deliver the required performance if they want to achieve their AI dreams and expectations.

Therefore, IDC recommends that businesses developing AI capabilities, or scaling existing ones, plan to hit this wall deliberately and in a controlled fashion: knowingly and in full possession of the details needed to make the next infrastructure move. IDC also recommends doing it in close collaboration with a server vendor—guess who wants to be that vendor—who can guide you from early stage to advanced production to full exploitation of AI capabilities throughout the business.

IBM assumes everything is moving to AI as quickly as it can, but that may not be the case for you. AI workloads include applications based on machine learning and deep learning, using unstructured data and information as the fuel that drives results. Some businesses are well on their way with deploying AI workloads, others are experimenting, and a third group is still evaluating what AI applications could mean for their organization. At all three stages, the variables that, if addressed properly, together make up a well-working and business-advancing solution are numerous.

To get a handle on these variables, executives from IT and LOB managers often form a special committee to actively consider their organization’s approach to AI. Nobody wants to invest in AI for the sake of AI; the vendors will get rich enough as it is. Also, there is no need to reinvent the wheel; many well-defined use cases exist that are applicable across industries. Many already are noted in the AI reference guide.

Here is a sampling:

  • Fraud analysis and investigation (banking, other industries)
  • Regulatory intelligence (multiple industries)
  • Automated threat intelligence and prevention systems (many industries)
  • IT automation, a sure winner (most industries)
  • Sales process recommendation and automation
  • Diagnosis and treatment (healthcare)
  • Quality management investigation and recommendation (manufacturing)
  • Supply and logistics (manufacturing)
  • Asset/fleet management, another sure winner (multiple industries)
  • Freight management (transportation)
  • Expert shopping/buying advisory or guide

Notes IDC: many of these can be developed in-house, bought as commercial software, or consumed via SaaS in the cloud.

Whatever you think of AI, you can’t avoid it. AI will penetrate your company embedded in the new products and services you buy.

So where does IBM hope your AI effort end up? Power9 System, hundreds of GPUs, and PowerAI. Are you surprised?


IBM Boosts AI at Think

March 23, 2018

Enterprise system vendors are racing to AI along with all the others. Writes Jeffrey Burt, an analyst at The Next Platform, “There continues to be an ongoing push among tech vendors to bring artificial intelligence (AI) and its various components – including deep learning and machine learning – to the enterprise. The technologies are being rapidly adopted by hyperscalers and in the HPC space, and enterprises stand to reap significant benefits by also embracing them.” Exactly what those benefits are still needs to be specifically articulated and, if possible, quantified.

IBM Think Conference this week

For enterprise data centers running the Z or Power Systems, the most obvious quick payoff will be fast, deeper, more insightful data analytics along with more targeted guidance on actions to take in response. After that there still remains the possibility of more automation of operations but the Z already is pretty thoroughly automated and optimized. Just give it your operational and performance parameters and it will handle the rest.  In addition, vendors like Compuware and Syncsort have been making the mainframe more graphical and intuitive. The days of needing deep mainframe experience or expertise have passed. Even x86 admins can quickly pick up a modern mainframe today.

A late 2016 study by Accenture modeled the impact of AI on 12 developed economies. The research compared the size of each country’s economy in 2035 under a baseline scenario, which shows expected economic growth under current assumptions, and an AI scenario reflecting expected growth once the impact of AI has been absorbed into the economy. AI was found to yield the highest economic benefits for the United States, increasing its annual growth rate from 2.6 percent to 4.6 percent by 2035, translating to an additional $8.3 trillion in gross value added (GVA). In the United Kingdom, AI could add $814 billion to the economy by 2035, increasing the annual growth rate of GVA from 2.5 to 3.9 percent. Japan has the potential to more than triple its annual rate of GVA growth by 2035, and Finland, Sweden, the Netherlands, Germany, and Austria could see their growth rates double. You can still find the study here.

Also coming out of Think this week was the announcement of an expanded Apple-IBM partnership around AI and machine learning (ML). The resulting AI service is intended for corporate developers to build apps themselves. The new service, Watson Services for Core ML, links Apple’s Core ML tools for developers that it unveiled last year with IBM’s Watson data crunching service. Core ML helps coders build machine learning-powered apps that more efficiently perform calculations on smartphones instead of processing those calculations in external data centers. It’s similar to other smartphone-based machine learning tools like Google’s TensorFlow Lite.

The goal is to help enterprises reimagine the way they work through a combination of Core ML and Watson Services to stimulate the next generation of intelligent mobile enterprise apps. Take the example of field technicians who inspect power lines or machinery. The new AI field app could feed images of electrical equipment to Watson to train it to recognize the machinery. The result would enable field technicians to scan the electrical equipment they are inspecting on their iPhones or iPads and automatically detect any anomalies. The app would eliminate the need to send that data to IBM’s cloud computing data centers for processing, thus reducing the amount of time it takes to detect equipment issues to near real-time.
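The on-device payoff is easy to see in miniature. In the toy sketch below, the “model” is just a pair of per-class feature centroids that have already been shipped to the phone; classifying a new reading is then a purely local computation with no round trip to a data center. The labels, feature vectors, and nearest-centroid approach are invented stand-ins for a real trained Core ML network.

```python
# Invented illustration of on-device inference: once model parameters
# (here, per-class centroids) are downloaded, classification needs no
# network call. All values below are made up for the sketch.

import math

CENTROIDS = {
    "normal":   [0.9, 0.1, 0.2],
    "corroded": [0.3, 0.8, 0.4],
}

def classify_locally(features):
    """Label a reading with its nearest centroid; runs entirely on-device."""
    return min(CENTROIDS, key=lambda label: math.dist(features, CENTROIDS[label]))

state = classify_locally([0.85, 0.15, 0.25])  # near the "normal" centroid
```

A real model is vastly larger, but the shape of the win is the same: the latency of detecting an equipment issue drops from a cloud round trip to a local function call.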

Apple’s Core ML toolkit could already connect with competing cloud-based machine learning services from Google, Amazon, and Microsoft; the new service creates developer tools that more easily link Core ML with Watson. For example, Coca-Cola already is testing Watson Services for Core ML to see if it helps its field technicians better inspect vending machines. If you want to try it in your shop, the service is free for developers to use now. Eventually, developers will have to pay.

Such new roll-your-own AI services represent a shift for IBM. Previously you had to work with IBM consulting teams. Now the new Watson developer services are intended to be bought in an “accessible and bite size” way, according to IBM, and sold in a “pay as you go” model without consultants.  In a related announcement at Think, IBM announced it is contributing the core of Watson Studio’s Deep Learning Service as an open source project called Fabric for Deep Learning. This will enable developers and data scientists to work together on furthering the democratization of deep learning.

Ultimately, the democratization of AI is the only way to go. When intelligent systems speak together and share insights everyone’s work will be faster, smarter. Yes, there will need to be ways to compensate distinctively valuable contributions but with over two decades of open source experience, the industry should be able to pretty easily figure that out.

