Posts Tagged ‘Big Data’

Syncsort Now Precisely After Pitney Bowes Acquisition

May 29, 2020

After announcing its acquisition of Pitney Bowes’ software and data business last August and completing the deal in December, Syncsort earlier this month rebranded itself as Precisely. The company, a long-established mainframe ISV, is trying to position Precisely as a major player among enterprises seeking to manage large quantities of data in various ways.

Precisely combined and updated the Syncsort and Pitney Bowes product lines to span what the rebranded operation now describes as “the breadth of the data integrity spectrum,” offering data integration, data quality, and location intelligence tools.

The rebranded company’s solution portfolio spans five areas, based on use case:

  • Integrate, its data integration line, features Precisely Connect, Ironstream, Assure, and Syncsort.
  • Verify, its data quality unit, includes Precisely Spectrum Quality, Spectrum Context, and Trillium.
  • Locate, its location intelligence line, includes Precisely Spectrum Spatial, Spectrum Geocoding, MapInfo, and Confirm.
  • Enrich features Precisely Streets, Boundaries, Points of Interest, Addresses, and Demographics.
  • Engage aims to create seamless, personalized, omnichannel communications on any medium, anytime.

Josh Rogers, CEO of Syncsort, now Precisely, adds: “With the combination of Syncsort and Pitney Bowes software and data, we are creating in Precisely a new company that is focused on helping enterprises advance their use of data through expertise across data domains, disciplines and platforms.”

Rogers continued: “Advancements in storage, compute, analytics, and machine learning have opened up a world of possibilities for enhanced decision-making, but inaccuracies and inconsistencies in data have held back innovation and stifled value creation. Achieving data integrity is the next business imperative. Put simply, better data means better decisions, and Precisely offers the industry’s most complete portfolio of data integrity products, providing the link between data sources and analytics that helps companies realize the value of their data and investments.”

Precisely may be onto something by emphasizing the quality of data for decision making, which is an amplification of the old GIGO (Garbage In, Garbage Out) principle, especially now as the volume, variety, and availability of data skyrocket. When edge devices begin generating new and different data, these challenges will only compound. Making data-driven decisions has already become increasingly complex for even the largest enterprises.

Despite the proliferation of cloud-based analytics tools, studies published in Forbes, Harvard Business Review, and elsewhere found that 84 percent of CEOs do not trust the data they are basing decisions on, and with good reason: another study found almost half of newly created data records have at least one critical error. Meanwhile, the cost of noncompliance with new government regulations, including GDPR and CCPA, has created an even greater urgency for trusted data.

Out of the gate, Precisely has more than 2,000 employees and 12,000 customers in more than 100 countries, including 90 of the Fortune 100. The company boasts annual revenue of over $600 million.

Prior to the acquisition, Pitney Bowes delivered solutions, analytics, and APIs in the areas of ecommerce fulfillment, shipping and returns; cross-border ecommerce; office mailing and shipping; presort services; and financing.

Syncsort provides data integration and optimization software alongside location intelligence, data enrichment, customer information management, and engagement solutions. Together, the two operations serve more than 11,000 enterprises and hundreds of channel partners worldwide.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at http://technologywriter.com/

Power9 Summit Fights COVID-19

March 16, 2020

IBM has unleashed its currently top-rated supercomputer, Summit, to simulate 8,000 chemical compounds in a matter of days, hunting for compounds that could disrupt the COVID-19 infection process by binding to the virus’s spike protein, a key early step in coming up with an effective vaccine or cure. In the first few days Summit already identified 77 small-molecule compounds, such as medications and natural compounds, that have shown the potential to impair COVID-19’s ability to dock with and infect host cells.

POWER9 Summit Supercomputer battles COVID-19

 

The US Department of Energy turned to the IBM Summit supercomputer to help in the fight against COVID-19, which appears almost unstoppable, having swept through 84 countries on every continent except Antarctica, according to IBM. The hope is that by quickly culling the most likely initial chemical candidates, lab researchers can get an early jump on the search for an effective cure.

As IBM explains it, viruses infect cells by binding to them and using a ‘spike’ to inject their genetic material into the host cell. When trying to understand new biological compounds, like viruses, researchers in wet labs grow the micro-organism and see how it reacts in real life to the introduction of new compounds. But this can be a slow process without computers that can perform fast digital simulations to narrow down the range of potential variables. And even then there are challenges.

Computer simulations can examine how different variables react with different viruses, but when each of these individual variables can comprise millions or even billions of unique pieces of data, compounded by the need to run multiple simulations, this isn’t trivial. Very quickly it can become a very time-intensive process, especially on commodity hardware.

But, IBM continued, by using Summit, researchers were able to simulate 8,000 compounds in a matter of days to model which ones might impact the infection process by binding to the virus’s spike. As of last week, they had identified dozens of small-molecule compounds, such as medications and natural compounds, that have shown the potential to impair COVID-19’s ability to dock with and infect host cells.
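
To make that culling concrete, here is a toy sketch in Python of a screening filter of the kind described above. The compound names, scores, and threshold are hypothetical, and this is emphatically not ORNL’s actual docking pipeline.

    # Toy compound screen: keep candidates whose simulated binding score
    # clears a threshold (lower = tighter predicted binding to the spike).
    candidates = {
        "compound_0001": -9.4,   # hypothetical binding energies, kcal/mol
        "compound_0002": -4.1,
        "compound_0003": -8.7,
        "compound_0004": -2.5,
    }

    THRESHOLD = -7.0  # hypothetical cutoff for "promising"

    hits = sorted(
        (name for name, score in candidates.items() if score <= THRESHOLD),
        key=lambda name: candidates[name],
    )
    print(f"{len(hits)} of {len(candidates)} candidates pass: {hits}")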

“Summit was needed to rapidly get the simulation results we needed. It took us a day or two whereas it would have taken months on a normal computer,” said Jeremy Smith, Governor’s Chair at the University of Tennessee, director of the UT/ORNL Center for Molecular Biophysics, and principal researcher in the study. “Our results don’t mean that we have found a cure or treatment for COVID-19. But we are very hopeful that our computational findings will both inform future studies and provide a framework that the subsequent researchers can use to further investigate these compounds. Only then will we know whether any of them exhibit the characteristics needed to mitigate this virus.”

After the researchers turn over the most likely possibilities to the medical scientists, they are still a long way from finding a cure. The medical folks will take the candidates into the physical wet lab to determine whether a compound might work or not.

Eventually, if they are lucky, they will end up with something promising, which then has to be tested against the coronavirus and COVID-19. Published experts suggest this can take a year or two or more.

Summit gave the researchers a jump start with its massive data processing capability, enabled through its 4,608 IBM Power Systems AC922 server nodes, each equipped with two IBM POWER9 CPUs and six NVIDIA V100 Tensor Core GPUs, giving it a peak performance of 200 petaflops; in effect, more powerful than one million high-end laptops.
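
A back-of-the-envelope check of that peak figure; the per-GPU number below is our assumption (roughly 7.5 double-precision teraflops per V100), not IBM’s published breakdown, and the CPUs’ contribution is ignored.

    # Rough sanity check of Summit's ~200 petaflop peak.
    nodes = 4608                  # IBM Power Systems AC922 server nodes
    gpus_per_node = 6             # NVIDIA V100 GPUs per node
    tflops_per_gpu = 7.5          # assumed FP64 peak per V100

    peak_petaflops = nodes * gpus_per_node * tflops_per_gpu / 1000
    print(f"~{peak_petaflops:.0f} petaflops")   # ~207, in line with the quoted 200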

Might quantum computing have sped up the process even more? IBM didn’t report throwing one of its quantum machines at the problem, relying instead on Summit, which has already been acclaimed as the world’s fastest supercomputer.

Nothing stays the same in the high performance computing world. HEXUS reports that when time is of the essence and lives are at stake, the value of supercomputers is highly evident. Now a new one, touted as the world’s first 2+ exaflops supercomputer, is set to begin operations in 2023. This AMD-powered giant, HEXUS notes, is claimed to be about 10x faster than Summit. That’s good to know, but let’s hope the medical researchers have already beaten the coronavirus and COVID-19 by then.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at http://technologywriter.com/

2020 IBM Quantum Gains

January 13, 2020

IBM returned from the holidays announcing a flurry of activity around quantum computing. Specifically, it has expanded its set of Q Network partners, including a range of commercial, academic, startup, government, and research entities.  

IBM Qiskit screen

The Q Network now includes over 100 organizations across multiple industries, including airlines, automotive, banking and finance, energy, insurance, materials, and electronics. Specifically, Anthem, Delta Air Lines, Goldman Sachs, Wells Fargo, and Woodside Energy are among the latest organizations to begin to explore practical applications of quantum computing.

In addition to these industry leaders, a number of academic, government research labs and startups have also joined the IBM Q Network, including the Georgia Institute of Technology (Georgia Tech), Stanford University, Los Alamos National Laboratory, AIQTech, Beit, Quantum Machines, Tradeteq, and Zurich Instruments.

These organizations join over 200,000 users, who have run hundreds of billions of executions on IBM’s quantum systems and simulators through the IBM Cloud. This has led to the publication of more than 200 third-party research papers on practical quantum applications.

More quantum: IBM also recently announced the planned installation of the first two IBM Q System One commercial universal quantum computers outside the US – one with Europe’s leading organization for applied research, Fraunhofer-Gesellschaft, in Germany; another with The University of Tokyo. Both are designed to advance country-wide research and provide an education framework program to engage universities, industry, and government to grow a quantum computing community and foster new economic opportunities.

Growing a quantum computing community should quickly become a critical need and, more likely, a major headache. My own cursory search of employment sites revealed no quantum computing openings listed. A few casual inquiries suggest curiosity about quantum computing, but not much insight, readiness, or actual skills, and few openings to generate action.

Still, even at this early stage things already are happening.

Anthem, Inc., a leading health benefits company, is expanding its research and development efforts to explore how quantum computing may further enhance the consumer healthcare experience. For Anthem, quantum computing offers the potential to analyze vast amounts of data inaccessible to classical computing while also enhancing privacy and security. It also brings the potential to help individuals through the development of more accurate and personalized treatment options while improving the prediction of health conditions.

Delta Air Lines joined the IBM Q Hub at North Carolina State University to embark on a multi-year collaborative effort with IBM to explore the potential capabilities of quantum computing in transforming experiences for customers and employees as they encounter challenges throughout the travel day.

Quantum Machines (QM), a provider of control and operating systems for quantum computers, brings customers from among the leading players in the field, including multinational corporations, academic institutions, startups, and national research labs. As part of the IBM and QM collaboration, a compiler between IBM’s quantum computing programming languages, like Qiskit (see graphic above), and those of QM is being developed for use by QM’s customers. Such development should lead to increased adoption of IBM’s open-sourced programming languages across the industry.
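
For readers who have not seen Qiskit, here is a minimal sketch: the standard two-qubit Bell-state circuit. It is generic, textbook Qiskit, not QM compiler output.

    # Minimal Qiskit program: build and display a Bell-state circuit.
    from qiskit import QuantumCircuit

    qc = QuantumCircuit(2, 2)
    qc.h(0)             # put qubit 0 into an equal superposition
    qc.cx(0, 1)         # entangle qubit 1 with qubit 0
    qc.measure([0, 1], [0, 1])
    print(qc.draw())    # ASCII rendering of the circuit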

The Los Alamos National Laboratory has also joined as an IBM Q Hub to help the lab’s research efforts, including developing and testing near-term quantum algorithms and formulating strategies for mitigating errors on quantum computers. A 53-qubit system will also allow Los Alamos to benchmark its ability to perform quantum simulations on real quantum hardware and perhaps finally push beyond the limits of classical computing.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at http://technologywriter.com/ 

Syncsort Acquires Pitney Bowes Software & Data

December 10, 2019

It is easy to forget that there are other ISVs who work with the z. A recent list of z ISVs ran to over a dozen, including Rocket Software, Compuware, GT Software, and Syncsort, among others.

Syncsort has grabbed some attention of late by announcing the completion of an agreement with Pitney Bowes, the postal metering company, to take over its software and data operations. As a result, Syncsort claims a position as one of the leading data management software companies in the world, serving more than 11,000 customers, primarily z shops.

The combined portfolio brings together capabilities in location intelligence, data enrichment, customer information management, and engagement solutions with powerful data integration and optimization software. About the only thing they haven’t listed is AI.

Over the coming months, teams will be working to combine the Syncsort-Pitney Bowes organizations and portfolios. While there may be some changes within the Syncsort organization, not much will change for its customers immediately. They can still expect to receive the same level of service they have received to support their everyday needs.

Syncsort’s acquisition of the Pitney Bowes software and data business creates a data management software company with more than 11,000 enterprise customers, $600 million in revenue, and 2,000 employees worldwide. Although modest in comparison with today’s Internet tech giants and even IBM, the resulting company brings sufficient scale, agility, and breadth of portfolio to enable leading enterprises to gain a competitive advantage from their data, Syncsort noted in its announcement.

“Enterprises everywhere are striving to increase their competitiveness through the strategic use of data…”  As a result, “organizations must invest in next-generation technologies like cloud, streaming, and machine learning, while simultaneously leveraging and modernizing decades of investment in traditional data infrastructure,” said Josh Rogers, CEO, Syncsort. Now “our increased scale allows us to expand the scope of partnerships with customers so that they can maximize the value of all their data,” he added.

According to Paige Bartley of 451 Research, quoted in Syncsort’s announcement: “The ability to derive actionable human intelligence from data requires ensuring that it has been integrated from all relevant sources, is representative and high quality, and has been enriched with additional context and information. Syncsort, as a longtime player in the data management space, is further addressing these issues with the acquisition of Pitney Bowes Software Solutions’ assets – technology that complements existing data-quality capabilities to provide additional context and enrichment for data, as well as leverage customer data and preferences to drive business outcomes.”

These end-to-end capabilities, Syncsort adds, will empower organizations to overcome ever-increasing challenges around the integrity of their data so their IT and business operations can easily integrate, enrich, and improve data assets to maximize insights.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at http://technologywriter.com/ 

IBM teams with Cloudera and Hortonworks 

July 11, 2019

DancingDinosaur has a friend on the West coast who finally left IBM after years of complaining, swearing never to return, and he has been happily working at Cloudera ever since. IBM and Cloudera this week announced a strategic partnership to develop joint go-to-market programs designed to bring advanced data and AI solutions to more organizations across the expansive Apache Hadoop ecosystem.

Deploy a single solution for big data

The agreement builds on the long-standing relationship between IBM and Hortonworks, which merged with Cloudera this past January to create integrated solutions for data science and data management. The new agreement builds on the integrated solutions and extends them to include the Cloudera platform. “This should stop the big-data-is-dead thinking that has been cropping up,” he says, putting his best positive spin on the situation.

Unfortunately, my West coast buddy may be back at IBM sooner than he thinks. With IBM having finalized its $34 billion Red Hat acquisition yesterday, it would be small additional money to just buy Cloudera, Hortonworks included, and own the whole thing as a solid big data-cloud capabilities block.

As IBM sees it, the companies have partnered to offer an industry-leading, enterprise-grade Hadoop distribution plus an ecosystem of integrated products and services – all designed to help organizations achieve faster analytic results at scale. As a part of this partnership, IBM promises to:

  • Resell and support Cloudera products
  • Sell and support Hortonworks products under a multi-year contract
  • Provide migration assistance to future Cloudera/Hortonworks unity products
  • Deliver the benefits of the combined IBM and Cloudera collaboration and investment in the open source community, along with a commitment to better support analytics initiatives from the edge to AI

IBM also will resell the Cloudera Enterprise Data Hub, Cloudera DataFlow, and Cloudera Data Science Workbench. In return, Cloudera will begin to resell IBM’s Watson Studio and Big SQL.

“By teaming more strategically with IBM we can accelerate data-driven decision making for our joint enterprise customers who want a hybrid and multi-cloud data management solution with common security and governance,” said Scott Andress, Cloudera’s Vice President of Global Channels and Alliances in the announcement. 

Cloudera enables organizations to transform complex data into clear and actionable insights. It delivers an enterprise data cloud for any data, anywhere, from the edge to AI. One obvious question: how long until IBM wants to include Cloudera as part of its own hybrid cloud? 

But IBM isn’t stopping here. It also just announced new storage solutions across AI and big data, modern data protection, hybrid multicloud, and more. These innovations will allow organizations to leverage more heterogeneous data sources and data types for deeper insights from AI and analytics, expand their ability to consolidate rapidly expanding data on IBM’s object storage, and extend modern data protection to support more workloads in hybrid cloud environments.

The key is IBM Spectrum Discover, metadata management software that provides data insight for petabyte-scale unstructured storage. The software connects to IBM Cloud Object Storage and IBM Spectrum Scale, enabling it to rapidly ingest, consolidate, and index metadata for billions of files and objects. It provides a rich metadata layer that enables storage administrators, data stewards, and data scientists to efficiently manage, classify, and gain insights from massive amounts of unstructured data. Combining that with Cloudera and Hortonworks on IBM’s hybrid cloud should give you a powerful data analytics solution.
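
To make the metadata-layer idea concrete, here is a toy sketch of cataloging file metadata and then querying the catalog. It illustrates the concept only; it is not the IBM Spectrum Discover API.

    # Toy metadata catalog: index basic file attributes, then query the index.
    # Conceptual illustration only -- IBM Spectrum Discover's real API differs.
    import os

    def build_index(root):
        index = []
        for dirpath, _dirs, files in os.walk(root):
            for name in files:
                path = os.path.join(dirpath, name)
                st = os.stat(path)
                index.append({
                    "path": path,
                    "bytes": st.st_size,
                    "mtime": st.st_mtime,
                    "ext": os.path.splitext(name)[1].lower(),
                })
        return index

    index = build_index(".")
    big_csvs = [f["path"] for f in index if f["ext"] == ".csv" and f["bytes"] > 1_000_000]
    print(f"{len(index)} files indexed; {len(big_csvs)} large CSVs found")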

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at technologywriter.com. 

 

Syncsort Drives IBMi Security with AI

May 2, 2019

The technology security landscape looks increasingly dangerous. The problem revolves around the possible impact of AI, which is not yet fully clear. The hope, of course, is that AI will make security more efficient and effective. However, the security bad actors can also jump on AI to advance their own schemes. Like a cyber version of the nuclear arms race, this has been an ongoing battle for decades. The industry has to cooperate and, specifically, share information and hope the good guys can stay a step ahead.

In the meantime, vendors like IBM and, most recently, Syncsort have been stepping up to the latest challenges. Syncsort, for example, earlier this month launched its Assure Security to address the increasing sophistication of cyber attacks and expanding data privacy regulations. In surprising ways, it turns out, data privacy and AI are closely related in the AI security battle.

Syncsort, a leader in Big Iron-to-Big Data software, announced Assure Security, which combines access control, data privacy, compliance monitoring, and risk assessment into a single product. Together, these capabilities help security officers, IBMi administrators, and Db2 administrators address critical security challenges and comply with new regulations meant to safeguard and protect the privacy of data.

And it clearly is coming at the right time. According to Privacy Rights Clearinghouse, a non-profit corporation with a mission to advocate for data privacy, there were 828 reported security incidents in 2018, resulting in the exposure of over 1.37 billion records of sensitive data. As regulations to help protect consumer and business data become stricter and more numerous, organizations must build more robust data governance and security programs to keep data from being exploited by bad actors for nefarious purposes. The industry already has scrambled to comply with GDPR and the New York Department of Financial Services cybersecurity regulations, and it now must prepare for the GDPR-like California Consumer Privacy Act, which takes effect January 1, 2020.

In its own survey, Syncsort found security is the number one priority among IT pros with IBMi systems. “Given the increasing sophistication of cyber attacks, it’s not surprising 41 percent of respondents reported their company experienced a security breach and 20 percent more were unsure if they even had been breached,” said David Hodgson, CPO, Syncsort. The company’s new Assure Security product leverages the wealth of IBMi security technology and expertise to help organizations address their highest-priority challenges. This includes protecting against vulnerabilities introduced by new, open-source methods of connecting to IBMi systems, adopting new cloud services, and complying with expanded government regulations.

Of course, IBM hasn’t been sleeping through this. The company continues to push various permutations of Watson to tackle the AI security challenge. For example, IBM leverages AI to gather insights and use reasoning to identify relationships between threats, such as malicious files, suspicious IP addresses, or even insiders. This analysis takes seconds or minutes, allowing security analysts to respond to threats up to 60 times faster.

It also relies on AI to eliminate time-consuming research tasks and provides curated analysis of risks, which reduces the amount of time security analysts require to make the critical decisions and launch an orchestrated response to counter each threat. The result, which IBM refers to as cognitive security, combines the strengths of artificial intelligence and human intelligence.

Cognitive AI, in effect, learns with each interaction to proactively detect and analyze threats, providing actionable insights that help security analysts make informed decisions. Such cognitive security, let’s hope, combines the strengths of artificial intelligence with human judgment.

Syncsort’s Assure Security specifically brings together best-in-class IBMi security capabilities acquired by Syncsort into an all-in-one solution, with the flexibility for customers to license individual modules. The resulting product includes:

  • Assure Compliance Monitoring quickly identifies security and compliance issues with real-time alerts and reports on IBMi system activity and database changes.
  • Assure Access Control provides control of access to IBMi systems and their data through a varied bundle of capabilities.
  • Assure Data Privacy protects IBMi data at rest and in motion from unauthorized access and theft through a combination of NIST-certified encryption, tokenization, masking, and secure file transfer capabilities (a toy masking sketch follows this list).
  • Assure Security Risk Assessment examines over a dozen categories of security values, open ports, power users, and more to address vulnerabilities.
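
To illustrate just the masking piece of the data privacy module, a minimal sketch follows. It is a toy, not Syncsort’s NIST-certified machinery: masking preserves a field’s shape while hiding most of the value.

    # Toy data masking: hide all but the last four digits of a card number.
    # Illustrative only -- not Assure Data Privacy's implementation.
    def mask_pan(pan: str, keep: int = 4) -> str:
        digits = [c for c in pan if c.isdigit()]
        visible = "".join(digits[-keep:])
        return "*" * (len(digits) - keep) + visible

    print(mask_pan("4111-1111-1111-1234"))  # ************1234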

It probably won’t surprise anyone, but the AI security situation is not going to be cleared up soon. Expect to see a steady stream of headlines around security hits and misses over the next few years. Just hope it will get easier to separate the good guys from the bad actors and that the lessons will be clear.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at technologywriter.com.

IBM Joins with Harley-Davidson for LiveWire

March 1, 2019

I should have written this piece 40 years ago as a young man fresh out of grad school. Then I was dying for a 1200cc Harley Davidson motorcycle. My mother was dead set against it—she wouldn’t even allow me to play tackle football and has since been vindicated (You win on that, mom.). My father, too, was opposed and wouldn’t help pay for it. I had to settle for a puny little motor scooter that offered zero excitement.

In the decades since I graduated, Harley’s fortunes have plummeted as the HOG (Harley Owners Group) community aged out and few youngsters have picked up the slack. The 1200cc bike I once lusted after probably is now too heavy for me to handle. So, what is Harley to do? Redefine its classic American brand with an electric model, LiveWire.

Courtesy: Harley Davidson, IBM

With LiveWire, Harley expects to remake the motorcycle as a cloud-connected machine and promises to deliver new products for fresh motorcycle segments, broaden engagement with the brand, and strengthen the H-D dealer network. It also boldly proclaimed that Harley-Davidson will lead the electrification of motorcycling.

According to the company, Harley’s LiveWire will leverage H-D Connect, a service (available in select markets) built on the IBM Cloud. This will enable it to deliver new mobility and concierge services today and leverage an expanding use of IBM AI, analytics, and IoT to enhance and evolve the rider’s experience. In order to capture this next generation of bikers, Harley is working with IBM to transform the everyday experience of riding through the latest technologies and features IBM can deliver via the cloud.

Would DancingDinosaur, an aging Harley enthusiast, plunk down the thousands it would take to buy one of these? Since I rarely use my smartphone to do anything more than check email and news, I am probably not a likely prospect for LiveWire.

Will LiveWire save Harley? Maybe; it depends on what the promised services will actually deliver. Already, I can access a wide variety of services through my car but, other than Waze, I rarely use any of those.

According to the joint IBM-Harley announcement, a fully cellular-connected electric motorcycle needed a partner that could deliver mobility solutions that would meet riders’ changing expectations, as well as enhance security. With IBM, Harley hopes to strike a balance between using data to create both intelligent and personal experiences while maintaining privacy and security, said Marc McAllister, Harley-Davidson VP Product Planning and Portfolio in the announcement.

So, based on this description, are you ready to jump to LiveWire? You probably need more details. So far, IBM and Harley have identified only three:

  1. Powering The Ride: LiveWire riders will be able to check bike vitals at any time and from any location. Information available includes features such as range, battery health, and charge level. Motorcycle status features will also support the needs of the electric bike, such as the location of charging stations. Riders can also see their motorcycle’s current map location. Identifying charging stations could be useful.
  2. Powering Security: An alert will be sent to the owner’s phone if the motorcycle has been bumped, tampered with, or moved. GPS-enabled stolen-vehicle assistance will provide peace of mind that the motorcycle’s location can be tracked. (Requires law enforcement assistance. Available in select markets.)
  3. Powering Convenience: Reminders about upcoming motorcycle service requirements and other care notifications will be provided. In addition, riders will receive automated service reminders as well as safety or recall notifications.

“The next generation of Harley-Davidson riders will demand a more engaged and personalized customer experience,” said Venkatesh Iyer, Vice President, North America IoT and Connected Solutions, Global Business Services, IBM. Introducing enhanced capabilities, he continues, via the IBM Cloud will not only enable new services immediately, but will also provide a roadmap for the journey ahead. (Huh?)

As much as DancingDinosaur aches for Harley to come roaring back with a story that will win the hearts of the HOG members who haven’t already drifted away, Harley will need more than the usual buzzwords, trivial apps, and cloud hype.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at technologywriter.com.

Are Quantum Computers Even Feasible?

November 29, 2018

IBM has toned down its enthusiasm for quantum computing. Even last spring it was already backing off a bit at Think 2018. Now the company believes that quantum computing will augment classical computing to potentially open doors that it once thought would remain locked indefinitely.

First IBM Q computation center

With its Bristlecone announcement, Google trumped IBM with 72 qubits. But debating a few dozen qubits more or less may prove irrelevant. A number of quantum physics researchers have recently been publishing papers suggesting useful quantum computing may be decades away.

Mikhail Dyakonov makes that case in a piece titled “The Case Against Quantum Computing,” which appeared last month in IEEE Spectrum. Dyakonov does research in theoretical physics at the Charles Coulomb Laboratory at the University of Montpellier, in France.

As Dyakonov explains: In quantum computing, the classical two-state circuit element (the transistor) is replaced by a quantum element called a quantum bit, or qubit. Like the conventional bit, it also has two basic states. But you already know this because DancingDinosaur covered it here and several times since.

But this is what you might not know: With the quantum bit, those two states aren’t the only ones possible. That’s because the spin state of an electron is described as a quantum-mechanical wave function. And that function involves two complex numbers, α and β (called quantum amplitudes), which, being complex numbers, have real parts and imaginary parts. Those complex numbers, α and β, each have a certain magnitude, and, according to the rules of quantum mechanics, their squared magnitudes must add up to 1.

Dyakonov continues: In contrast to a classical bit, a qubit can be in any of a continuum of possible states, as defined by the values of the quantum amplitudes α and β. This property is often described by the statement that a qubit can exist simultaneously in both of its ↑ and ↓ states. Yes, quantum mechanics often defies intuition.

So while IBM, Google, and other classical computer providers quibble about 50 qubits or 72 or even 500 qubits, to Dyakonov this is ridiculous. The real number of qubits needed will be astronomical, as he explains: Experts estimate that the number of qubits needed for a useful quantum computer, one that could compete with your laptop in solving certain kinds of interesting problems, is between 1,000 and 100,000. So the number of continuous parameters describing the state of such a useful quantum computer at any given moment must be at least 2^1,000, which is to say about 10^300. That’s a very big number indeed; much greater than the number of subatomic particles in the observable universe.
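
Dyakonov’s arithmetic is easy to verify, since an N-qubit state carries on the order of 2^N continuous amplitudes:

    # Continuous parameters for ~1,000 qubits: on the order of 2**1000.
    import math
    print(f"2**1000 ~ 10**{1000 * math.log10(2):.0f}")  # 10**301, Dyakonov's "about 10^300"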

Just in case you missed the math, he repeats: A useful quantum computer [will] need to process a set of continuous parameters that is larger than the number of subatomic particles in the observable universe.

Before you run out to invest in a quantum computer with the most qubits you can buy you would be better served joining IBM’s Q Experience and experimenting with it on IBM’s nickel. Let them wrestle with the issues Dyakonov brings up.

Then, Dyakonov concludes: I believe that such experimental research is beneficial and may lead to a better understanding of complicated quantum systems, but I’m skeptical that these efforts will ever result in a practical quantum computer. Such a computer would have to be able to manipulate—on a microscopic level and with enormous precision—a physical system characterized by an unimaginably huge set of parameters, each of which can take on a continuous range of values. Could we ever learn to control the more than 10^300 continuously variable parameters defining the quantum state of such a system? My answer is simple. No, never.

I hope my high school science teacher who enthusiastically introduced me to quantum physics has long since retired or, more likely, passed on. Meanwhile, DancingDinosaur expects to revisit quantum regularly in the coming months or even years.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at technologywriter.com.

GAO Blames Z for Government Inefficiency

October 19, 2018

Check out the GAO report from May 2016 here. The Feds spent more than 75 percent of the total amount budgeted for information technology (IT) for fiscal year 2015 on operations and maintenance (O&M). In a related report, the IRS reported it used assembly language code and COBOL, both developed in the 1950s, for IMF (the Individual Master File) and IDRS (the Integrated Data Retrieval System). Unfortunately, the GAO uses the word “mainframe” to conflate outdated UNISYS mainframes with the modern, supported, and actively developed IBM Z mainframes, notes Ross Mauri, IBM general manager, Z systems.

Mainframes-mobile in the cloud courtesy of Compuware

COBOL, too, maintains active skills and training programs at many institutions and receives investment across many industries. In addition to COBOL, the IBM z14 also runs Java, Swift, Go, Python, and other open languages to enable modern application enhancement and development. Does the GAO know that?

In a recent report, the GAO recommends moving to supported modern hardware. IBM agrees. The Z, however, does not expose mainframe investments to a rise in procurement and operating costs, nor to skilled staff issues, Mauri continued.

Three investments the GAO reviewed in operations and maintenance clearly appear to be legacy investments facing significant risks due to their reliance on obsolete programming languages, outdated hardware, and a shortage of staff with critical skills. For example, the IRS reported that it used assembly language code and COBOL (both developed in the 1950s) for IMF and IDRS. What are these bureaucrats smoking?

The GAO also seems confused about the Z and the cloud. IBM Cloud Private is designed to run on Linux-based Z systems to take full advantage of the cloud through open containers while retaining the inherent benefits of Z hardware—security, availability, scalability, reliability; all the ities enterprises have long relied on the z for. The GAO seems unaware that the Z’s automatic pervasive encryption immediately encrypts everything at rest or in transit. Furthermore, the GAO routinely treats COBOL as a deficiency while ISVs and other signatories of the Open Letter consider it a modern, optimized, and actively supported programming language.

The GAO apparently isn’t even aware of IBM Cloud Private, which is compatible with leading IT systems manufacturers and has been optimized for IBM Z. All you need to get started with the cloud is the starter kit available for IBM OpenPOWER LC (Linux) servers, enterprise Power Systems, and Hyperconverged Systems powered by Nutanix. You don’t even need a Z; just buy a low-cost OpenPOWER LC (Linux) server online and configure it as desired.

Here is part of the letter that Compuware sent to the GAO, Federal CIOs, and members of Congress. It’s endorsed by several dozen members of the IT industry. The full letter is here:

In light of a June 2018 GAO report to the Internal Revenue Service suggesting the agency’s mainframe- and COBOL-based systems present significant risks to tax processing, we the mainframe IT community—developers, scholars, influencers and inventors—urge the IRS and other federal agencies to:

  • Reinvest in and modernize the mainframe platform and the mission-critical applications which many have long relied upon.
  • Prudently consider the financial risks and opportunity costs associated with rewriting and replacing proven, highly dependable mainframe applications, for which no “off-the-shelf” replacement exists.
  • Understand the security and performance requirements of these mainframe applications and data and the risk of migrating to platforms that were never designed to meet such requirements.

The Compuware letter goes on to state: In 2018, the mainframe is still the world’s most reliable, performant, and securable platform, providing the lowest-cost high-transaction system of record. Regarding COBOL, it notes that since 2017 the IBM z14 has supported COBOL V6.2, which is optimized bi-monthly.

Finally, about attracting new COBOL workers: COBOL is as easy to work with as any other language. In fact, open source Zowe has demonstrated appeal to young techies, providing solutions for development and operations teams to securely manage, control, script, and develop on the mainframe like any other cloud platform. What don’t they get?

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at technologywriter.com.

Can IBM find a place for Watson?

September 7, 2018

After beating two human Jeopardy champions three times in a row in 2011, IBM’s Watson has been hard pressed to come up with a comparable winning streak. Initially IBM appeared to expect its largest customers to buy richly configured Power servers to run Watson on prem. When it didn’t get enough takers, the company moved Watson to the cloud, where companies could lease it for major knowledge-driven projects. When that didn’t catch on, IBM started to lease Watson’s capabilities by the drink, promising to solve problems in onesies and twosies.

Jeopardy champs lose to Watson

Today Watson is promising to speed AI success through IBM’s Watson Knowledge Catalog. As IBM puts it: IBM Watson Knowledge Catalog powers intelligent, self-service discovery of data, models, and more, activating them for artificial intelligence, machine learning, and deep learning. Access, curate, categorize, and share data, knowledge assets, and their relationships, wherever they reside.

DancingDinosaur has no doubt that Watson is stunning technology and has been rooting for its success since that first Jeopardy round seven years ago. Over that time, Watson and IBM have become a case study in how not to price, package, and market powerful yet expensive technology. The Watson Knowledge Catalog is yet another pricing and packaging experiment.

Based on the latest information online, Watson Knowledge Catalog is priced according to the number of provisioned catalogs and discovery connections. There are two plans available: Lite and Professional. The Lite plan allows one catalog and five free discovery connections, while the Professional plan provides unlimited amounts of both. Huh? This statement begs for clarification, and there probably is a lot of information and fine print required to answer the numerous questions the above description raises, but life is too short for DancingDinosaur to rummage around the Watson Knowledge Catalog site looking for answers. Doesn’t this seem like something Watson itself should be able to clarify with a single click?

But no, that is too easy. Instead IBM takes the high road, which DancingDinosaur calls the education track. Notes Jay Limburn, Senior Technical Staff Member and IBM Offering Manager: there are two main challenges that might impede you from realizing the true value of your data and slow your journey to adopting artificial intelligence (AI). They are 1) inefficient data management and 2) finding the right tools for all data users.

Actually, the issues start even earlier. In attempting AI, most established organizations start at a disadvantage, notes IBM. For example:

  • Most enterprises do not know what and where their data is.
  • Data science and compliance teams are handicapped by the lack of data accessibility.
  • Enterprises with legacy data are even more handicapped than digitally savvy startups.
  • AI projects will expose problems with limited data and poor quality; many will fail for that reason alone.
  • The need to differentiate through monetization increases in importance with AI.

These are not new. People have been whining about this since the most rudimentary data mining attempts were made decades ago. If there is a surprise it is that they have not been resolved by now.

Or maybe they finally have with the IBM Watson Knowledge Catalog. As IBM puts it, the company will deliver what promises to be the ultimate data catalog, one that actually illuminates data:

  • Knows what data your enterprise has
  • Knows where it resides
  • Knows where it came from
  • Knows what it means
  • Provides quick access to it
  • Ensures protection of use
  • Exploits machine learning for intelligence and automation
  • Enables data scientists, data engineers, stewards, and business analysts
  • Is embeddable everywhere for free, with premium features available in paid editions

OK, after seven years Watson may be poised to deliver, and it has little to do with Jeopardy and everything to do with a rapidly growing data catalog market. According to a Research and Markets report, the data catalog market is expected to grow from $210 million in 2017 to $620 million by 2022. How many sales of the Professional version would get IBM a leading share?
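
That forecast implies a brisk compound annual growth rate, easy enough to check:

    # Implied CAGR of the data catalog market, 2017-2022, from the figures above.
    start, end, years = 210, 620, 5               # $ millions
    cagr = (end / start) ** (1 / years) - 1
    print(f"{cagr:.1%}")                          # about 24.2% per year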

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog. See more of his work at technologywriter.com and here.

