Posts Tagged ‘cognitive computing’

5G Joins Edge Technology and Hybrid Multicloud

May 11, 2020

At IBM’s virtual Think conference the first week in May, the company made a big play for edge computing and 5G together.

From connected vehicles to intelligent manufacturing equipment, the explosion of connected devices has produced unprecedented volumes of data at the edge. IBM is convinced those volumes will compound as 5G networks increase the number of connected mobile devices.

z15 T02 and the LinuxONE III LT2

Edge computing and 5G networks promise to reduce latency while improving speed, reliability, and processing. This will deliver faster and more comprehensive data analysis, deeper insights, faster response times, and improved experiences for employees, customers, and their customers.

First gaining prominence with the Internet of Things (IoT) a few years back, edge computing is defined by IBM as a distributed computing framework that brings enterprise applications closer to where data is created and often remains, and where it can be processed. This is where decisions are made and actions taken.

5G stands for the Fifth Generation of cellular wireless technology. Beyond higher speed and reduced latency, 5G standards will have a much higher connection density, allowing networks to handle greater numbers of connected devices combined with network slicing to isolate and protect designated applications.

Today, 10% of data is processed at the edge, an amount IBM expects to grow to 75% by 2025. Specifically, edge computing enables:

  • Better data control and lower costs, by minimizing data transport to central hubs and reducing vulnerabilities
  • Faster insights and actions by tapping into more sources of data and processing that data there, at the edge
  • Continuous operations by enabling systems that run autonomously, reduce disruption, and lower costs because data can be processed by the devices themselves on the spot and where decisions can be made
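The data-control benefit above can be illustrated with a hypothetical sketch (the device names, readings, and threshold here are all invented for the example): an edge device processes sensor readings locally and forwards only the anomalies, minimizing data transport to the central hub.

```python
# Hypothetical sketch of edge-side filtering: process readings on the
# device and transmit only the anomalies upstream to the central hub.
def filter_at_edge(readings, threshold=90.0):
    """Return only the readings worth sending upstream."""
    return [r for r in readings if r > threshold]

readings = [72.1, 95.3, 68.4, 91.0, 70.2, 69.8]  # e.g., equipment temperatures
to_hub = filter_at_edge(readings)
print(to_hub)  # [95.3, 91.0] -- only 2 of 6 readings leave the device
```

In this toy case two-thirds of the data never crosses the network, which is the cost and vulnerability reduction the bullet points describe.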

In short: the growing number of increasingly capable devices, faster 5G processing, and increased market pressure are driving edge computing beyond what the initial IoT proponents, who didn’t yet have 5G, envisioned. They also weren’t in a position to imagine the growth in the processing capabilities of edge devices in just the past year or two.

But that is starting to happen now, according to IDC: By 2023, half of the newly deployed on-premises infrastructure will be in critical edge locations rather than corporate datacenters, up from less than 10% today.

Also unimagined was the emergence of the hybrid multicloud, which IBM has only recently started to tout. The convergence of 5G, edge computing, and hybrid multicloud, according to the company, is redefining how businesses operate. As more businesses embrace 5G and edge, modernizing networks to take advantage of the edge opportunity is only now becoming feasible.

And all of this could play very well with the new z machines, the z15 T02 and LinuxONE III LT2. These appear sufficiently capable to handle the scale of business edge strategies and hybrid cloud requirements for now. Or step up to the enterprise-class z15 if you need more horsepower.

By moving to a hybrid multicloud model, telcos can process data at both the core and the network edge across multiple clouds, perform cognitive operations, and make it easier to introduce and manage differentiated digital services. As 5G matures, it will become the network technology that underpins the delivery of these services.

For enterprises, adopting a hybrid multicloud model that extends from corporate data centers (or public and private clouds) to the edge is critical to unlocking new connected experiences. By extending cloud computing to the edge, enterprises can perform AI/analytics faster, run enterprise apps at the edge to reduce the impact of intermittent connectivity, and minimize data transport to central hubs for cost efficiency.

It’s time to start thinking about making edge part of your computing strategy.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at http://technologywriter.com/ 

IBM Introduces New Flash Storage Family

February 14, 2020

IBM describes this mainly as a simplification move. The company is eliminating two current storage lines, Storwize and FlashSystem A9000, and replacing them with a series of flash storage systems that will scale from entry level to enterprise.

Well, uh, not quite enterprise as DancingDinosaur readers might think of it. No changes are planned for the DS8000 storage systems, which are focused on the mainframe market. “All our existing product lines, not including our mainframe storage, will be replaced by the new FlashSystem family,” said Eric Herzog, IBM’s chief marketing officer and vice president of worldwide storage channels, in a published report earlier this week.

The move will retire two incompatible storage lines from the IBM product lineup and replace them with a line that provides compatible storage software and services from entry level to the highest enterprise tier, mainframe excluded, Herzog explained. The new FlashSystem family promises more functions, more features, and lower prices, he continued.

Central to the new FlashSystem family is NVMe. NVM Express (NVMe), or the Non-Volatile Memory Host Controller Interface Specification (NVMHCIS), is an open logical device interface specification for accessing non-volatile storage media attached via a PCI Express (PCIe) bus.

At the top of the new family line is the NVMe and multicloud ultra-high throughput storage system, a validated system with IBM implementation. IBM promises unmatched NVMe performance, storage-class memory (SCM), and IBM FlashCore technology. In addition, it brings the features of IBM Spectrum Virtualize to support the most demanding workloads.

IBM multi-cloud flash storage family system

Next up are the IBM FlashSystem 9200 and IBM FlashSystem 9200R, IBM-tested and validated rack solutions designed for the most demanding environments, with the extreme performance of end-to-end NVMe, IBM FlashCore technology, and the ultra-low latency of Storage Class Memory (SCM). They also bring IBM Spectrum Virtualize and AI predictive storage management with proactive support from Storage Insights. The FlashSystem 9200R is delivered assembled, with installation and configuration completed by IBM to ensure a working multicloud solution.

In the middle of the family are the IBM FlashSystem 7200 and FlashSystem 7200H. As IBM puts it, these offer end-to-end NVMe, the innovation of IBM FlashCore technology, the ultra-low latency of Storage Class Memory (SCM), the flexibility of IBM Spectrum Virtualize, and the AI predictive storage management and proactive support of Storage Insights, all in a powerful 2U all-flash or hybrid flash array. The FlashSystem 7200 brings mid-range storage while allowing the organization to add the multicloud technology that best supports the business.

At the bottom of the line is the NVMe entry enterprise all-flash storage solution, which brings end-to-end NVMe capabilities and flash performance to the affordable FlashSystem 5100. As IBM describes it, the FlashSystem 5010 and FlashSystem 5030 (formerly the Storwize V5010E and V5030E; the same systems, just renamed) are all-flash or hybrid flash solutions intended to provide enterprise-grade functionality without compromising affordability or performance. Built with the flexibility of IBM Spectrum Virtualize and the AI-powered predictive storage management and proactive support of Storage Insights, the FlashSystem 5000 family helps make modern technologies such as artificial intelligence accessible to enterprises of all sizes.

IBM likes the words affordable and affordability in discussing this new storage family. But, as is typical with IBM, nowhere will you see a price or a reference to cost/TB or cost/IOPS or cost of anything although these are crucial metrics for evaluating any flash storage system. DancingDinosaur expects this after 20 years of writing about the z. Also, as I wrote at the outset, the z is not even included in this new flash storage family so we don’t even have to chuckle if they describe z storage as affordable.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at http://technologywriter.com/

Meet IBM’s New CEO

February 6, 2020

Have to admire Ginni Rometty. She survived 19 consecutive losing quarters (one quarter shy of 5 years), which DancingDinosaur and the rest of the world covered with monotonous regularity, and she was not bounced out until this January. Memo to readers: keep that in mind if you start feeling performance heat from top management. Can’t imagine another company that would tolerate it, but what do I know.

Arvind Krishna becomes Chief Executive Officer and a member of the IBM Board of Directors effective April 6, 2020. Krishna is currently IBM Senior Vice President for Cloud and Cognitive Software, and he was a principal architect of the company’s acquisition of Red Hat. The cloud/Red Hat strategy has only just started to show signs of payback.

As IBM writes: Under Rometty’s leadership, IBM acquired 65 companies; built out key capabilities in hybrid cloud, security, industry, data, and AI, both organically and inorganically; and successfully completed one of the largest technology acquisitions in history (Red Hat). She reinvented more than 50% of IBM’s portfolio, built a $21 billion hybrid cloud business, and established IBM’s leadership in AI, quantum computing, and blockchain, while divesting nearly $9 billion in annual revenue to focus the portfolio on IBM’s high-value, integrated offerings. Part of that was the approximately $34 billion Red Hat acquisition, IBM’s, and possibly the IT industry’s, biggest to date. Rometty isn’t going away all that soon; she continues as executive chairman of the board.

It is way too early for IBM’s 1Q2020 results, which will cover the last quarter of Rometty’s reign. The fourth quarter of 2019, at least, was positive, especially after all those quarters of revenue loss. The company reported $21.8 billion in revenue, up 0.1 percent. Red Hat revenue was up 24 percent. Cloud and cognitive software was up 9 percent, while systems, which includes the z, was up 16 percent.

Total cloud revenue, the new CEO Arvind Krishna’s baby, was up 21 percent. Even with z revenue up more than cloud and cognitive software, it is unlikely IBM will easily find a buyer for the z anytime soon. If IBM dumps it, it will probably have to pay somebody to take it, despite the z’s faithful, profitable blue-chip customer base.

Although the losing streak has come to an end, Krishna still faces some serious challenges. For example, although DancingDinosaur has been enthusiastically cheerleading quantum computing as the future, there is no proven business model there yet. Except for limited adoption by a few early adopters, there is no widespread groundswell of demand for quantum computing, and the technology has not yet proven itself useful. There also is no ready pool of skilled quantum talent. If you wanted to try quantum computing, would you even know what to try or where to find skilled people?

Even in the area of cloud computing, where IBM finally is starting to show some progress, the company has yet to penetrate the top tier of players. These players (Amazon, Google, Microsoft Azure) are not likely to concede market share.

So here is DancingDinosaur’s advice to Krishna: be prepared to scrap for every point of cloud share, and be prepared to spin a compelling case around quantum computing. Finally, don’t give up the z until the accountants and lawyers force you to, which they will undoubtedly insist on. To the contrary, slash z prices and make it an irresistible bargain.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at http://technologywriter.com/ 

2020 IBM Quantum Gains

January 13, 2020

IBM returned from the holidays announcing a flurry of activity around quantum computing. Specifically, it has expanded its set of Q Network partners, including a range of commercial, academic, startup, government, and research entities.  

IBM Qiskit screen

The Q Network now includes over 100 organizations across multiple industries, including airlines, automotive, banking and finance, energy, insurance, materials, and electronics. Anthem, Delta Air Lines, Goldman Sachs, Wells Fargo, and Woodside Energy are among the latest organizations to begin exploring practical applications of quantum computing.

In addition to these industry leaders, a number of academic, government research labs and startups have also joined the IBM Q Network, including the Georgia Institute of Technology (Georgia Tech), Stanford University, Los Alamos National Laboratory, AIQTech, Beit, Quantum Machines, Tradeteq, and Zurich Instruments.

These organizations join over 200,000 users, who have run hundreds of billions of executions on IBM’s quantum systems and simulators through the IBM Cloud. This has led to the publication of more than 200 third-party research papers on practical quantum applications.

More quantum: IBM also recently announced the planned installation of the first two IBM Q System One commercial universal quantum computers outside the US, one with Europe’s leading organization for applied research, Fraunhofer-Gesellschaft, in Germany, and the other with the University of Tokyo. Both are designed to advance country-wide research and provide an education framework program to engage universities, industry, and government to grow a quantum computing community and foster new economic opportunities.

Growing a quantum computing community should quickly become a critical need and, more likely, a major headache. My own cursory search of employment sites revealed no quantum computing openings listed. Just a few casual inquiries suggest curiosity about quantum computing but not much insight, readiness, actual skills, or openings to generate action.

Still, even at this early stage things already are happening.

Anthem, Inc., a leading health benefits company, is expanding its research and development efforts to explore how quantum computing may further enhance the consumer healthcare experience. For Anthem, quantum computing offers the potential to analyze vast amounts of data inaccessible to classical computing while also enhancing privacy and security. It also brings the potential to help individuals through the development of more accurate and personalized treatment options while improving the prediction of health conditions.

Delta Air Lines joined the IBM Q Hub at North Carolina State University to embark on a multi-year collaborative effort with IBM to explore the potential capabilities of quantum computing in transforming experiences for customers and employees as they encounter challenges throughout the travel day.

Quantum Machines (QM), a provider of control and operating systems for quantum computers, counts among its customers leading players in the field, including multinational corporations, academic institutions, startups, and national research labs. As part of the IBM and QM collaboration, a compiler between IBM’s quantum programming languages, like Qiskit (see graphic above), and those of QM is being developed for use by QM’s customers. Such development should increase adoption of IBM’s open-sourced programming languages across the industry.

The Los Alamos National Laboratory also has joined as an IBM Q Hub to support the lab’s research efforts, including developing and testing near-term quantum algorithms and formulating strategies for mitigating errors on quantum computers. A 53-qubit system will also allow Los Alamos to benchmark its ability to perform quantum simulations on real quantum hardware and perhaps finally push beyond the limits of classical computing.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at http://technologywriter.com/ 

IBM Suggests Astounding Productivity with Cloud Pak for Automation

November 25, 2019

DancingDinosaur thought IBM would not introduce another Cloud Pak until after the holidays, but I was wrong. Last week IBM launched Cloud Pak for Security. According to IBM, it helps an organization uncover threats, make more informed risk-based decisions, and prioritize its security team’s time.

More specifically, it connects the organization’s existing data sources to generate deeper insights. In the process you can access IBM and third-party tools to search for threats across any cloud or on-premises location, and quickly orchestrate actions and responses to those threats while leaving your data where it is.

DancingDinosaur’s only disappointment with IBM’s new security Cloud Pak, as with other IBM Cloud Paks, is that it runs only on Linux. That means it doesn’t support RACF, the legendary IBM access control tool for z/OS. IBM’s Cloud Paks reportedly run on z systems, but only those running Linux. Not sure how IBM can finesse this particular issue.

Of the five original IBM Cloud Paks (application, data, integration, multicloud management, and automation), only one offers the kind of payback that will wow top C-level execs: automation. Find Cloud Pak for Automation here.

To date, IBM reports over 5,000 customers have used IBM Digital Business Automation to run their digital business. At the same time, IBM claims successful digitization has increased organizational scale and fueled growth of knowledge work.

McKinsey & Company notes that such workers spend up to 28 hours each week on low value work. IBM’s goal with digital business automation is to bring digital scale to knowledge work and free these workers to work on high value tasks.

Such high-value tasks include collaborating and using creativity to come up with new ideas, meeting and building relationships with clients, and resolving issues and exceptions. By automating the low-value work instead, says IBM, the payoff from intelligent automation can be staggering.

“We can reclaim 120 billion hours a year spent by knowledge workers on low value work by using intelligent automation,” declares IBM. So what value could you reclaim over the course of a year for your operation with, say, 100 knowledge workers earning $22 per hour, or 1,000 workers earning $35 per hour? You can do the math.
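The math is easy enough to sketch. This back-of-the-envelope calculation assumes, optimistically, that all 28 low-value hours per week cited by McKinsey could be reclaimed; the headcounts and rates are just the hypothetical figures above.

```python
# Back-of-the-envelope value of reclaiming knowledge workers' low-value hours.
def reclaimable_value(workers, hourly_rate, low_value_hours=28, weeks=52):
    """Annual dollar value of the low-value hours, if all were reclaimed."""
    return workers * hourly_rate * low_value_hours * weeks

print(reclaimable_value(100, 22))   # 3203200  -> about $3.2 million a year
print(reclaimable_value(1000, 35))  # 50960000 -> about $51 million a year
```

Even if automation reclaims only a fraction of those hours, the numbers get a C-level exec’s attention, which is presumably IBM’s point.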

As you would expect, automation is the critical component of this particular Cloud Pak. The main targets for enhancement or assistance among the rather broad category of knowledge workers are administrative/departmental work and expert work, which includes cross-enterprise work. IBM offers vendor management as one example.

The goal is to digitize core services by automating at scale and building low-code/no-code apps for your knowledge workers. IBM also wants to free what it calls digital workers, who are key to this plan, for higher-value work. IBM’s example of such an expert worker would be a loan officer.

Central to IBM’s Cloud Pak for Automation is what IBM calls its Intelligent Automation Platform. Some of this is here now, according to the company, with more coming in the future. Here now is the ability to create apps using low code tooling, reuse assets from business automation workflow, and create new UI assets.

Coming up in some unspecified timeframe is the ability to enable digital workers to automate job roles, define and create content services to enable intelligent capture and extraction, and finally to envision and create decision services to offload and automate routine decisions.

Are your current and would-be knowledge workers ready to contribute or participate in this scheme? Maybe for some; it depends for others. To capture those billions of hours of increased productivity, however, they will have to step up to it. But you can be pretty sure IBM will do it for you if you ask.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at http://technologywriter.com/ 

 

IBM teams with Cloudera and Hortonworks 

July 11, 2019

DancingDinosaur has a friend on the West Coast who finally left IBM after years of complaining, swearing never to return, and he has been happily working at Cloudera ever since. IBM and Cloudera this week announced a strategic partnership to develop joint go-to-market programs designed to bring advanced data and AI solutions to more organizations across the expansive Apache Hadoop ecosystem.

Deploy a single solution for big data

The agreement builds on the long-standing relationship between IBM and Hortonworks, which merged with Cloudera this past January to create integrated solutions for data science and data management. The new agreement builds on the integrated solutions and extends them to include the Cloudera platform. “This should stop the big-data-is-dead thinking that has been cropping up,” he says, putting his best positive spin on the situation.

Unfortunately, my West Coast buddy may be back at IBM sooner than he thinks. With IBM finalizing its $34 billion Red Hat acquisition yesterday, it would be small additional money to just buy the merged Cloudera/Hortonworks outright and own it all as a solid big data and cloud capabilities block.

As IBM sees it, the companies have partnered to offer an industry-leading, enterprise-grade Hadoop distribution plus an ecosystem of integrated products and services – all designed to help organizations achieve faster analytic results at scale. As a part of this partnership, IBM promises to:

  • Resell and support Cloudera products
  • Sell and support Hortonworks products under a multi-year contract
  • Provide migration assistance to future Cloudera/Hortonworks unity products
  • Deliver the benefits of the combined IBM and Cloudera collaboration and investment in the open source community, along with commitment to better support analytics initiatives from the edge to AI.

IBM also will resell the Cloudera Enterprise Data Hub, Cloudera DataFlow, and Cloudera Data Science Workbench. In return, Cloudera will begin to resell IBM’s Watson Studio and BigSQL.

“By teaming more strategically with IBM we can accelerate data-driven decision making for our joint enterprise customers who want a hybrid and multi-cloud data management solution with common security and governance,” said Scott Andress, Cloudera’s Vice President of Global Channels and Alliances in the announcement. 

Cloudera enables organizations to transform complex data into clear and actionable insights. It delivers an enterprise data cloud for any data, anywhere, from the edge to AI. One obvious question: how long until IBM wants to include Cloudera as part of its own hybrid cloud? 

But IBM isn’t stopping here. It also just announced new storage solutions across AI and big data, modern data protection, hybrid multicloud, and more. These innovations will allow organizations to leverage more heterogeneous data sources and data types for deeper insights from AI and analytics, expand their ability to consolidate rapidly expanding data on IBM’s object storage, and extend modern data protection to support more workloads in hybrid cloud environments.

The key is IBM Spectrum Discover, metadata management software that provides data insight for petabyte-scale unstructured storage. The software connects to IBM Cloud Object Storage and IBM Spectrum Scale, enabling it to rapidly ingest, consolidate, and index metadata for billions of files and objects. It provides a rich metadata layer that enables storage administrators, data stewards, and data scientists to efficiently manage, classify, and gain insights from massive amounts of unstructured data. Combining that with Cloudera and Horton on the IBM’s hybrid cloud should give you a powerful data analytics solution. 
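The idea of a metadata layer can be sketched in a few lines. This toy illustration is not IBM Spectrum Discover’s actual API (all the names and objects here are invented): it just shows how indexing metadata by tag lets administrators classify and query large object stores without reading the objects themselves.

```python
# Toy metadata index: classification queries hit the index, not the objects.
from collections import defaultdict

def build_index(objects):
    """Index object names by metadata tag for fast classification queries."""
    index = defaultdict(list)
    for obj in objects:
        for tag in obj["tags"]:
            index[tag].append(obj["name"])
    return index

catalog = [
    {"name": "scan-001.dcm", "size": 52_428_800, "tags": ["medical", "imaging"]},
    {"name": "logs-2019.txt", "size": 1_048_576, "tags": ["logs"]},
    {"name": "scan-002.dcm", "size": 62_914_560, "tags": ["medical", "imaging"]},
]
index = build_index(catalog)
print(index["medical"])  # ['scan-001.dcm', 'scan-002.dcm']
```

At petabyte scale the index itself becomes the product, which is essentially what Spectrum Discover sells: a queryable metadata layer over billions of files and objects.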

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at technologywriter.com. 

 

IBM Pushes Quantum for Business

June 20, 2019

Other major system providers are pursuing quantum computing initiatives, but none as methodically or persistently as IBM. In a recent announcement, IBM’s Institute for Business Value introduced a five-step roadmap to bring quantum computing to your organization.

Inside an IBM Q computation center: dilution refrigerators with microwave electronics (middle) that provide Q Network cloud access to a 20-qubit processor. (Credit: Connie Zhou)

Start by familiarizing yourself with superposition and entanglement, which enable quantum computers to solve problems intractable for today’s conventional computers:

Superposition. A conventional computer uses binary bits that can depict only a 1 or a 0. Quantum computers instead use qubits, which can represent a 1, a 0, or any superposition of both states. This gives quantum computers an exponentially larger set of states they can explore to solve certain types of problems better than conventional computers.

Entanglement. In the quantum world, two qubits located even light-years apart can still act in ways that are strongly correlated. Quantum computing takes advantage of this entanglement to encode problems that exploit this correlation between qubits.

The quantum properties of superposition and entanglement enable quantum computers to rapidly explore an enormous set of possibilities to identify an optimal answer that could maximize business value. As future quantum computers can calculate certain answers exponentially faster than today’s conventional machines, they will enable tackling business problems that are exponentially more complex.
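The two properties above can be sketched numerically with a few lines of state-vector math. This is plain numpy for illustration, not IBM Q hardware or the Qiskit API.

```python
# Minimal state-vector sketch of superposition and entanglement.
import numpy as np

# Superposition: a qubit a|0> + b|1> with |a|^2 + |b|^2 = 1.
plus = np.array([1.0, 1.0]) / np.sqrt(2)  # equal superposition of |0> and |1>
probs = np.abs(plus) ** 2                 # measurement probabilities
print(probs)  # [0.5 0.5]

# Entanglement: the Bell state (|00> + |11>) / sqrt(2).
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)  # amplitudes for |00>,|01>,|10>,|11>
# Either qubit measures 0 or 1 at random, but the two outcomes always agree:
print(np.abs(bell) ** 2)  # [0.5 0.  0.  0.5]
```

Note the zero amplitudes on |01> and |10>: measuring one qubit of the Bell state instantly tells you the other, which is the correlation quantum algorithms exploit.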

Despite conventional computers’ limitations, quantum computers are not expected to replace them in the foreseeable future. Instead, hybrid quantum-conventional architectures are expected to emerge that, in effect, outsource portions of difficult problems to a quantum computer.

Quantum computing already appears ripe to transform certain industries. For instance, current computational chemistry methods rely heavily on approximation because the exact equations cannot be solved by conventional computers. By contrast, quantum algorithms are expected to deliver accurate simulations of molecules over longer timescales, which are currently impossible to model precisely. This could enable life-saving drug discoveries and significantly shorten the years required to develop complex pharmaceuticals.

Additionally, quantum computing’s anticipated ability to solve today’s impossibly complex logistics problems could produce considerable cost savings and carbon footprint reduction. For example, consider improving the global routes of the trillion-dollar shipping industry (see Dancing Dinosaur’s recent piece on blockchain gaining traction). If quantum computing could improve container utilization and shipping volumes by even a small fraction, this could save shippers hundreds of millions of dollars. To profit from quantum computing’s advantages ahead of competitors, notes IBM, some businesses are developing expertise now to explore which use cases may benefit their own industries as soon as the technology matures.

To stimulate this type of thinking, IBM’s Institute for Business Value has come up with five steps to get you started:

  1. Identify your quantum champions. Designate some of your leading professionals as quantum champions and charge them with understanding quantum computing, its potential impact on your industry, your competitors’ response, and how your business might benefit. Have these champions report periodically to senior management to educate the organization and align progress to strategic objectives.
  2. Begin identifying quantum computing use cases and associated value propositions. Have your champions identify specific areas where quantum computing could propel your organization ahead of competitors. Have these champions monitor progress in quantum application development to track which use cases may be commercialized sooner. Finally, ensure your quantum exploration links to business results. Then select the most promising quantum computing applications, such as creating breakthrough products and services or new ways to optimize the supply chain.
  3. Experiment with real quantum systems. Demystify quantum computing by trying out a real quantum computer (IBM’s Q Experience). Have your champions get a sense for how quantum computing may solve your business problems and interface with your existing tools. A quantum solution may not be a fit for every business issue. Your champions will need to focus on solutions to address your highest priority use cases, ones that conventional computers can’t practically solve.
  4. Chart your quantum course. This entails constructing a quantum computing roadmap with viable next steps for the purpose of pursuing problems that could create formidable competitive barriers or enable sustainable business advantage. To accelerate your organization’s quantum readiness, consider joining an emerging quantum community. This can help you gain better access to technical infrastructure, evolving industry applications, and expertise that can enhance your development of specific quantum applications.
  5. Lastly, be flexible about your quantum future. Quantum computing is rapidly evolving. Seek out technologies and development toolkits that are becoming the industry standard, those around which ecosystems are coalescing. Realize that new breakthroughs may cause you to adjust your approach to your quantum development process, including changing your ecosystem partners. Similarly, your own quantum computing needs may evolve over time, particularly as you improve your understanding of which business issues can benefit most from quantum solutions.

Finally, actually have people in your organization try a quantum computer, such as through IBM’s Q Experience and Qiskit, a free development tool. Q Experience provides free access to a 16-qubit quantum computer you don’t have to configure or keep cold and stable. That’s IBM’s headache.
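
For readers who want a feel for what such a program does before signing up, here is a toy, plain-Python sketch (no Qiskit or real hardware required) of the canonical two-qubit “hello world” circuit, a Hadamard followed by a CNOT, which produces a Bell state:

```python
import math

# Minimal 2-qubit statevector simulation of the standard Bell-state circuit.
# Amplitude order: |00>, |01>, |10>, |11>, with the left bit as qubit 0.
def bell_state():
    state = [1.0, 0.0, 0.0, 0.0]          # start in |00>
    h = 1.0 / math.sqrt(2.0)
    # Hadamard on qubit 0 mixes the |0x> and |1x> amplitudes
    state = [h * (state[0] + state[2]),
             h * (state[1] + state[3]),
             h * (state[0] - state[2]),
             h * (state[1] - state[3])]
    # CNOT with qubit 0 as control flips qubit 1, swapping |10> and |11>
    state[2], state[3] = state[3], state[2]
    return state

probs = [a * a for a in bell_state()]
print(probs)  # measurement probabilities: 0.5 for |00>, 0.5 for |11>
```

On real hardware, of course, you get noisy samples rather than exact probabilities, which is exactly the gap the error-rate and Quantum Volume discussion below is about.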

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at technologywriter.com.

Syncsort Drives IBMi Security with AI

May 2, 2019

The technology security landscape looks increasingly dangerous. Much of the uncertainty revolves around AI, whose impact on security is not yet fully clear. The hope, of course, is that AI will make security more efficient and effective. However, bad actors can also jump on AI to advance their own schemes. Like a cyber version of the nuclear arms race, this battle has been going on for decades. The industry has to cooperate, share information, and hope the good guys can stay a step ahead.

In the meantime, vendors like IBM and, most recently, Syncsort have been stepping up to the latest challenges. Syncsort, for example, earlier this month launched its Assure Security to address the increasing sophistication of cyber attacks and expanding data privacy regulations. In surprising ways, it turns out, data privacy and AI are closely related in the security battle.

Syncsort, a leader in Big Iron-to-Big Data software, announced Assure Security, which combines access control, data privacy, compliance monitoring, and risk assessment into a single product. Together, these capabilities help security officers, IBMi administrators, and Db2 administrators address critical security challenges and comply with new regulations meant to safeguard and protect the privacy of data.

And it clearly is coming at the right time. According to Privacy Rights Clearinghouse, a non-profit corporation with a mission to advocate for data privacy, there were 828 reported security incidents in 2018, resulting in the exposure of over 1.37 billion records of sensitive data. As regulations to protect consumer and business data become stricter and more numerous, organizations must build more robust data governance and security programs to keep that data from being exploited by bad actors. The industry has already scrambled to comply with GDPR and the New York Department of Financial Services cybersecurity regulations, and it must now prepare for the GDPR-like California Consumer Privacy Act, which takes effect January 1, 2020.

In its own survey, Syncsort found security is the number one priority among IT pros with IBMi systems. “Given the increasing sophistication of cyber attacks, it’s not surprising 41 percent of respondents reported their company experienced a security breach and 20 percent more were unsure whether they had even been breached,” said David Hodgson, CPO, Syncsort. The company’s new Assure Security product leverages the wealth of IBMi security technology and expertise to help organizations address their highest-priority challenges. These include protecting against vulnerabilities introduced by new, open-source methods of connecting to IBMi systems, adopting new cloud services, and complying with expanded government regulations.

Of course, IBM hasn’t been sleeping through this. The company continues to push various permutations of Watson to tackle the AI security challenge. For example, IBM leverages AI to gather insights and use reasoning to identify relationships between threats, such as malicious files, suspicious IP addresses, or even insiders. This analysis takes seconds or minutes, allowing security analysts to respond to threats up to 60 times faster.

It also relies on AI to eliminate time-consuming research tasks and provides curated analysis of risks, which reduces the amount of time security analysts require to make the critical decisions and launch an orchestrated response to counter each threat. The result, which IBM refers to as cognitive security, combines the strengths of artificial intelligence and human intelligence.

Cognitive AI, in effect, learns with each interaction to proactively detect and analyze threats, providing actionable insights that help security analysts make informed decisions. Such cognitive security, let’s hope, pairs the strengths of artificial intelligence with human judgment.

Syncsort’s Assure Security specifically brings together best-in-class IBMi security capabilities acquired by Syncsort into an all-in-one solution, with the flexibility for customers to license individual modules. The resulting product includes:

  • Assure Compliance Monitoring quickly identifies security and compliance issues with real-time alerts and reports on IBMi system activity and database changes.
  • Assure Access Control provides control of access to IBMi systems and their data through a varied bundle of capabilities.
  • Assure Data Privacy protects IBMi data at-rest and in-motion from unauthorized access and theft through a combination of NIST-certified encryption, tokenization, masking, and secure file transfer capabilities.
  • Assure Security Risk Assessment examines over a dozen categories of system values, open ports, power users, and more to identify vulnerabilities.
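
The masking idea in Assure Data Privacy can be illustrated generically (this sketch is not Syncsort’s implementation, just the common pattern): replace all but the last few characters of a sensitive field so reports and test data stay useful without exposing the real value.

```python
# Generic field masking: hide everything except the trailing characters.
# The function name and defaults here are illustrative, not a Syncsort API.
def mask_field(value: str, keep_last: int = 4, fill: str = "*") -> str:
    if len(value) <= keep_last:
        return fill * len(value)          # too short: mask the whole thing
    return fill * (len(value) - keep_last) + value[-keep_last:]

print(mask_field("4111111111111111"))  # -> ************1111
```

Production tools typically pair masking like this with NIST-certified encryption or tokenization for data that must be recoverable, which is what distinguishes a compliance-grade product from a one-liner.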

It probably won’t surprise anyone, but the AI security situation is not going to be cleared up soon. Expect a steady stream of headlines about security hits and misses over the next few years. Just hope it gets easier to separate the good guys from the bad actors and that the lessons become clear.


IBM Rides Quantum Volume to Quantum Advantage

March 19, 2019

Recently IBM announced achieving its highest Quantum Volume to date. Of course, nobody else knows what Quantum Volume is. Quantum Volume is both a measurement and a procedure developed, no surprise here, by IBM to determine how powerful a quantum computer is. Read the May 4 announcement here.

Quantum volume is not just about the number of qubits, although that is one part of it. It also includes both gate and measurement errors, device cross talk, as well as device connectivity and circuit compiler efficiency. According to IBM, the company has doubled the power of its quantum computers annually since 2017.

The upgraded processor will be available for use by developers, researchers, and programmers to explore quantum computing using a real quantum processor at no cost via the IBM Cloud. This offer has been out in various forms since May 2016 as IBM’s Q Experience.

Also announced was a new prototype of a commercial processor, which will be the core for the first IBM Q early-access commercial systems.  Dates have only been hinted at.

IBM’s recently unveiled IBM Q System One quantum computer, with a fourth-generation 20-qubit processor, has produced a Quantum Volume of 16, roughly double that of the current IBM Q 20-qubit devices, which have a Quantum Volume of 8.

The Q volume math goes something like this: a variety of factors determine Quantum Volume, including the number of qubits, connectivity, and coherence time, plus accounting for gate and measurement errors, device cross talk, and circuit software compiler efficiency.
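
A common shorthand for that math, which captures the intuition but not IBM’s full heavy-output-sampling protocol, is QV = 2^min(n, d): the number of qubits n only helps if the machine can also run circuits of comparable depth d faithfully. A sketch, with the simplification clearly assumed:

```python
# Simplified Quantum Volume intuition: QV = 2 ** min(qubits, faithful depth).
# IBM's actual benchmark runs random model circuits and checks heavy-output
# frequencies; this toy formula only shows why qubit count alone isn't enough.
def quantum_volume(num_qubits: int, achievable_depth: int) -> int:
    return 2 ** min(num_qubits, achievable_depth)

print(quantum_volume(20, 4))  # 20 qubits but only depth 4 -> QV 16
print(quantum_volume(20, 3))  # same qubits, shallower circuits -> QV 8
```

Note how the two sample values line up with the Q System One (QV 16) and the earlier 20-qubit devices (QV 8): same qubit count, different usable depth.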

In addition to producing the highest Quantum Volume to date, IBM Q System One’s performance reflects some of the lowest error rates IBM has ever measured, with an average 2-qubit gate error less than 2 percent, and its best gate achieving less than a 1 percent error rate. To build a fully-functional, large-scale, universal, fault-tolerant quantum computer, long coherence times and low error rates are required. Otherwise how could you ever be sure of the results?
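
Why error rates dominate the discussion is easy to see with back-of-the-envelope arithmetic: with a per-gate error rate e, the chance an n-gate circuit runs without any error at all is roughly (1 − e)^n, so small improvements compound dramatically.

```python
# Rough model of error accumulation: assume each gate independently succeeds
# with probability (1 - error_rate). Real devices are messier (crosstalk,
# measurement error), but the compounding effect is the point.
def circuit_success(error_rate: float, num_gates: int) -> float:
    return (1.0 - error_rate) ** num_gates

print(round(circuit_success(0.02, 50), 3))  # 2% gate error, 50 gates
print(round(circuit_success(0.01, 50), 3))  # halving the error rate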

Quantum Volume is a fundamental performance metric that measures progress in the pursuit of Quantum Advantage, the quantum Holy Grail—the point at which quantum applications deliver a significant, practical benefit beyond what classical computers alone are capable of. To achieve Quantum Advantage in the next decade, IBM believes the industry will need to continue doubling Quantum Volume every year.

Sounds like Moore’s Law all over again. IBM doesn’t deny the comparison. It writes: in 1965, Gordon Moore postulated that the number of components per integrated function would grow exponentially for classical computers. Jump to the new quantum era and IBM notes its Q system progress since 2017 presents a similar early growth pattern, supporting the premise that Quantum Volume will need to double every year and presenting a clear roadmap toward achieving Quantum Advantage.

Potential use cases, such as precisely simulating battery-cell chemistry for electric vehicles, speeding quadratic derivative models, and many others, are already being investigated by IBM Q Network partners.

In time AI should play a role expediting quantum computing.  For that, researchers will need to develop more effective AI that can identify patterns in data otherwise invisible to classical computers.

Until then how should most data centers proceed? IBM researchers suggest 3 initial steps:

  1. Develop quantum algorithms that demonstrate how quantum computers can improve AI classification accuracy.
  2. Improve feature mapping to a scale beyond the reach of the most powerful classical computers.
  3. Classify data through the use of short-depth circuits, allowing AI applications in the NISQ (noisy intermediate-scale quantum) regime and a path forward to achieving quantum advantage for machine learning.
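
The "feature mapping" step can be made concrete with a toy, classically simulable version: encode a classical value x as a single-qubit state via a rotation, then use state overlap as a similarity kernel for classification. (Real quantum feature maps use multi-qubit circuits that are hard to simulate; the names and encoding below are illustrative only.)

```python
import math

# Toy quantum feature map: embed x as the single-qubit state
# RY(x)|0> = cos(x/2)|0> + sin(x/2)|1>, then use the squared state
# overlap |<phi(x)|phi(y)>|^2 as a kernel for a classical SVM-style learner.
def embed(x: float):
    return (math.cos(x / 2.0), math.sin(x / 2.0))

def kernel(x: float, y: float) -> float:
    ax, bx = embed(x)
    ay, by = embed(y)
    return (ax * ay + bx * by) ** 2

print(round(kernel(0.3, 0.3), 3))      # identical points overlap fully: 1.0
print(round(kernel(0.0, math.pi), 3))  # orthogonal states: 0.0
```

The hoped-for advantage comes when the embedding circuit is too complex to simulate classically yet still cheap to run on hardware, which is exactly the short-depth NISQ regime the third step describes.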

Sounds simple, right? Let DancingDinosaur know how you are progressing.


