Posts Tagged ‘technology’

IBM Advances Commercial Quantum Computing

August 7, 2019

The reason IBM and others are so eager for quantum computing is simple: money. The world's financial system handles massive volumes of transactions, nearly $70 trillion last year according to the World Bank, and quantum analytics promises to process them quickly and accurately.

“These are enormous amounts of money,” says mathematician Cornelis Oosterlee of Centrum Wiskunde & Informatica, a national research institute in the Netherlands, in a piece for Wired magazine. “Some single trades involve numbers that are scary to imagine,” he continues: part of a company’s pension fund, say, or a university endowment.

Of course, this isn’t exactly new. Large organizations with access to huge amounts of resources devote inordinate quantities of those resources to predicting how much their assets will be worth in the future. If they could do this modeling faster, more accurately, or more efficiently, even just shaving off a few seconds here and there, well, you can do the arithmetic.

Today these calculations are expensive to run, requiring either an in-house supercomputer or two or a big chunk of cloud computing processors and time. But if or when quantum computing delivers on some of its theoretical promise to drive these analyses faster, more accurately, more efficiently, and more cheaply, that’s something IBM could build into the next generation of its systems.

And it is not just IBM. From Google on down to startups, developers are working on machines that could one day beat conventional computers at various tasks, such as classifying data through machine learning or inventing new drugs—and running complex financial calculations. In a step toward delivering on that promise, researchers affiliated with IBM and J.P. Morgan recently figured out how to run a simplified risk calculation on an actual quantum computer.

Using one of IBM’s machines, located in Yorktown Heights, New York, the researchers demonstrated they could simulate the future value of a financial product called an option. Currently, many banks use what’s called  the Monte Carlo method to simulate prices of all sorts of financial instruments. In essence, the Monte Carlo method models the future as a series of forks in the road. A company might go under; it might not. President Trump might start a trade war; he might not. Analysts estimate the likelihood of such scenarios, then generate millions of alternate futures at random. To predict the value of a financial asset, they produce a weighted average of these millions of possible outcomes.
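
The Monte Carlo recipe described above can be sketched in a few lines. This is a generic, illustrative implementation for a European call option under geometric Brownian motion; the parameters are invented and have nothing to do with the IBM/J.P. Morgan experiment:

```python
import numpy as np

def monte_carlo_call_price(s0, strike, rate, vol, maturity, n_paths, seed=0):
    """Average the discounted payoff over many randomly generated futures."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n_paths)          # one random draw per "future"
    # Terminal stock price under risk-neutral geometric Brownian motion
    st = s0 * np.exp((rate - 0.5 * vol**2) * maturity
                     + vol * np.sqrt(maturity) * z)
    payoff = np.maximum(st - strike, 0.0)     # call option payoff at maturity
    return np.exp(-rate * maturity) * payoff.mean()

# Illustrative numbers only
price = monte_carlo_call_price(s0=100, strike=105, rate=0.02, vol=0.2,
                               maturity=1.0, n_paths=200_000)
```

Each of the 200,000 paths is one of the "alternate futures" described above; the weighted average is simply the mean of the discounted payoffs.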

Quantum computers are particularly well suited to this sort of probabilistic calculation, says Stefan Woerner, who led the IBM team. Classical (or conventional) computers—the kind most of us use—are designed to manipulate bits. Bits are binary, having a value of either 0 or 1. Quantum computers, on the other hand, manipulate qubits, which represent an in-between state. A qubit is like a coin flipping in the air—neither heads nor tails, neither 0 nor 1 but some probability of being one or the other. And because a qubit has unpredictability built in, it promises to  be a natural tool for simulating uncertain outcomes.
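
That coin-flip behavior can be mimicked with a toy state-vector simulation. The snippet below is plain classical code for illustration, not IBM's quantum software stack:

```python
import numpy as np

rng = np.random.default_rng(1)

# A single-qubit state is a pair of complex amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1; measuring yields 0 or 1 with those probabilities.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
state = H @ np.array([1.0, 0.0])               # put |0> into equal superposition

probs = np.abs(state) ** 2                     # 50/50: the coin in the air
samples = rng.choice([0, 1], size=10_000, p=probs)
fraction_ones = samples.mean()                 # hovers around 0.5
```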

Woerner and his colleagues ran their Monte Carlo calculations using three of the 20 qubits available on their quantum machine. The experiment was too simplistic to be useful commercially, but it’s a promising proof of concept; once bigger and smoother-running quantum computers are available, the researchers hope to execute the algorithm faster than conventional machines.

But this theoretical advantage is just that: theoretical. Existing machines remain too error-ridden to compute consistently. In addition, financial institutions already have ample computing power available, onsite or in the cloud, and they will have even more as graphics processing units (GPUs), which can execute many calculations in parallel, come online. A quantum computer might well be faster than an individual chip, but it’s unclear whether it could beat a fleet of high-performance GPUs in a supercomputer.

Still, it’s noteworthy that the IBM team was able to implement the algorithm on actual hardware, says mathematician Ashley Montanaro of the University of Bristol in the UK, who was not involved with the work. Academics first developed the mathematical proofs behind this quantum computing algorithm in 2000, but it remained a theoretical exercise for years. Woerner’s group took a 19-year-old recipe and figured out how to run it on real quantum hardware.

Now they’re looking to improve their algorithm by using more qubits. The most powerful quantum computers today have fewer than 200 qubits; practitioners suggest it may take thousands to consistently beat conventional methods.

But demonstrations like Woerner’s, even with their limited scope, are useful in that they apply quantum computers to problems organizations actually want to solve. And that is what it will take if IBM expects to build quantum computing into a viable commercial business.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at technologywriter.com. 

IBM teams with Cloudera and Hortonworks 

July 11, 2019

Dancing Dinosaur has a friend on the West coast who finally left IBM after years of complaining, swearing never to return, and has been happily working at Cloudera ever since. IBM and Cloudera this week announced a strategic partnership to develop joint go-to-market programs designed to bring advanced data and AI solutions to more organizations across the expansive Apache Hadoop ecosystem.

Graphic: deploy a single solution for big data analytics

The agreement builds on the long-standing relationship between IBM and Hortonworks, which merged with Cloudera this past January to create integrated solutions for data science and data management. The new agreement builds on the integrated solutions and extends them to include the Cloudera platform. “This should stop the big-data-is-dead thinking that has been cropping up,” he says, putting his best positive spin on the situation.

Unfortunately, my West coast buddy may be back at IBM sooner than he thinks. With IBM finalizing its $34 billion Red Hat acquisition yesterday, it would take comparatively little additional money to buy the combined Cloudera-Hortonworks outright and own it all as a solid block of big data and cloud capabilities.

As IBM sees it, the companies have partnered to offer an industry-leading, enterprise-grade Hadoop distribution plus an ecosystem of integrated products and services – all designed to help organizations achieve faster analytic results at scale. As a part of this partnership, IBM promises to:

  • Resell and support Cloudera products
  • Sell and support Hortonworks products under a multi-year contract
  • Provide migration assistance to future Cloudera/Hortonworks unity products
  • Deliver the benefits of the combined IBM and Cloudera collaboration and investment in the open source community, along with commitment to better support analytics initiatives from the edge to AI.

IBM also will resell the Cloudera Enterprise Data Hub, Cloudera DataFlow, and Cloudera Data Science Workbench. In response, Cloudera will begin to resell IBM’s Watson Studio and BigSQL.

“By teaming more strategically with IBM we can accelerate data-driven decision making for our joint enterprise customers who want a hybrid and multi-cloud data management solution with common security and governance,” said Scott Andress, Cloudera’s Vice President of Global Channels and Alliances in the announcement. 

Cloudera enables organizations to transform complex data into clear and actionable insights. It delivers an enterprise data cloud for any data, anywhere, from the edge to AI. One obvious question: how long until IBM wants to include Cloudera as part of its own hybrid cloud? 

But IBM isn’t stopping here. It also just announced new storage solutions across AI and big data, modern data protection, hybrid multicloud, and more. These innovations will allow organizations to leverage more heterogeneous data sources and data types for deeper insights from AI and analytics, expand their ability to consolidate rapidly expanding data on IBM’s object storage, and extend modern data protection to support more workloads in hybrid cloud environments.

The key is IBM Spectrum Discover, metadata management software that provides data insight for petabyte-scale unstructured storage. The software connects to IBM Cloud Object Storage and IBM Spectrum Scale, enabling it to rapidly ingest, consolidate, and index metadata for billions of files and objects. It provides a rich metadata layer that enables storage administrators, data stewards, and data scientists to efficiently manage, classify, and gain insights from massive amounts of unstructured data. Combining that with Cloudera and Hortonworks on IBM’s hybrid cloud should give you a powerful data analytics solution.
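
For a feel of what such a metadata layer does, here is a toy catalog builder that walks a directory tree and indexes files by extension. The function and schema are hypothetical stand-ins and bear no relation to Spectrum Discover's actual interfaces:

```python
import os
from collections import defaultdict

def build_metadata_index(root):
    """Toy metadata catalog: index every file under `root` by extension,
    recording its path, size, and modification time."""
    index = defaultdict(list)
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            info = os.stat(path)
            ext = os.path.splitext(name)[1] or "<none>"
            index[ext].append({"path": path, "bytes": info.st_size,
                               "mtime": info.st_mtime})
    return index

# Example query: total bytes held in .log files under some root
# index = build_metadata_index("/data")
# log_bytes = sum(entry["bytes"] for entry in index.get(".log", []))
```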

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at technologywriter.com. 

 

IBM Pushes Quantum for Business

June 20, 2019

Other major system providers are pursuing quantum computing initiatives, but none as methodically or persistently as IBM. In a recent announcement, IBM’s Institute for Business Value introduced a five-step roadmap to bring quantum computing to your organization.

Inside an IBM Q computation center: dilution refrigerators with microwave electronics (middle) provide Q Network cloud access to a 20-qubit processor. (Credit: Connie Zhou)

Start by familiarizing yourself with superposition and entanglement, which enable quantum computers to solve problems intractable for today’s conventional computers:

Superposition. A conventional computer uses binary bits that can depict only a 1 or a 0. Quantum computers instead use qubits, which can depict a 1, a 0, or any combination of the two through superposition of the qubits’ possible states. This gives quantum computers an exponentially larger set of states they can explore to solve certain types of problems better than conventional computers.

Entanglement. In the quantum world, two qubits located even light-years apart can still act in ways that are strongly correlated. Quantum computing takes advantage of this entanglement to encode problems that exploit this correlation between qubits.
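
A toy state-vector simulation (classical code, purely illustrative) shows what that correlation looks like: a Hadamard followed by a CNOT prepares the textbook Bell state, and joint measurements of the two qubits then always agree:

```python
import numpy as np

rng = np.random.default_rng(7)

# Two-qubit state vector: amplitudes for |00>, |01>, |10>, |11>
state = np.zeros(4)
state[0] = 1.0                                  # start in |00>

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)    # Hadamard gate
CNOT = np.array([[1, 0, 0, 0],                  # control = first qubit
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.kron(H, np.eye(2)) @ state           # superpose the first qubit
state = CNOT @ state                            # entangle: (|00> + |11>)/sqrt(2)

probs = np.abs(state) ** 2                      # [0.5, 0, 0, 0.5]
outcomes = rng.choice(4, size=1_000, p=probs)   # sample joint measurements
# Every outcome is 0 (|00>) or 3 (|11>): the two qubits always agree.
always_agree = bool(np.all((outcomes == 0) | (outcomes == 3)))
```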

The quantum properties of superposition and entanglement enable quantum computers to rapidly explore an enormous set of possibilities to identify an optimal answer that could maximize business value. As future quantum computers can calculate certain answers exponentially faster than today’s conventional machines, they will enable tackling business problems that are exponentially more complex.

Despite conventional computers’ limitations, quantum computers are not expected to replace them in the foreseeable future. Instead, hybrid quantum-conventional architectures are expected to emerge that, in effect, outsource portions of difficult problems to a quantum computer.

Already, quantum computing appears ripe to transform certain industries. For instance, current computational chemistry methods rely heavily on approximation because the exact equations cannot be solved by conventional computers. Quantum algorithms, by contrast, are expected to deliver accurate simulations of molecules over longer timescales, which are currently impossible to model precisely. This could enable life-saving drug discoveries and significantly shorten the years required to develop complex pharmaceuticals.

Additionally, quantum computing’s anticipated ability to solve today’s impossibly complex logistics problems could produce considerable cost savings and carbon footprint reduction. For example, consider improving the global routes of the trillion-dollar shipping industry (see Dancing Dinosaur’s recent piece on blockchain gaining traction). If quantum computing could improve container utilization and shipping volumes by even a small fraction, this could save shippers hundreds of millions of dollars. To profit from quantum computing’s advantages ahead of competitors, notes IBM, some businesses are developing expertise now to explore which use cases may benefit their own industries as soon as the technology matures.

To stimulate this type of thinking, IBM’s Institute of Business Value has come up with 5 steps to get you started:

  1. Identify your quantum champions. Designate some of your leading professionals as quantum champions and charge them with understanding quantum computing, its potential impact on your industry, your competitors’ responses, and how your business might benefit. Have these champions report periodically to senior management to educate the organization and align progress with strategic objectives.
  2. Begin identifying quantum computing use cases and associated value propositions. Have your champions identify specific areas where quantum computing could propel your organization ahead of competitors. Have these champions monitor progress in quantum application development to track which use cases may be commercialized sooner. Finally, ensure your quantum exploration links to business results. Then select the most promising quantum computing applications, such as creating breakthrough products and services or new ways to optimize the supply chain.
  3. Experiment with real quantum systems. Demystify quantum computing by trying out a real quantum computer (IBM’s Q Experience). Have your champions get a sense for how quantum computing may solve your business problems and interface with your existing tools. A quantum solution may not be a fit for every business issue. Your champions will need to focus on solutions to address your highest priority use cases, ones that conventional computers can’t practically solve.
  4. Chart your quantum course. This entails constructing a quantum computing roadmap with viable next steps for the purpose of pursuing problems that could create formidable competitive barriers or enable sustainable business advantage. To accelerate your organization’s quantum readiness, consider joining an emerging quantum community. This can help you gain better access to technical infrastructure, evolving industry applications, and expertise that can enhance your development of specific quantum applications.
  5. Lastly, be flexible about your quantum future. Quantum computing is rapidly evolving. Seek out technologies and development toolkits that are becoming the industry standard, those around which ecosystems are coalescing. Realize that new breakthroughs may cause you to adjust your approach to your quantum development process, including changing your ecosystem partners. Similarly, your own quantum computing needs may evolve over time, particularly as you improve your understanding of which business issues can benefit most from quantum solutions.

Finally, actually have people in your organization try a quantum computer, such as through IBM’s Q program and Qiskit, a free development tool. Q provides a free 16-qubit quantum computer you don’t have to configure or keep cool and stable. That’s IBM’s headache.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at technologywriter.com.

Syncsort Drives IBMi Security with AI

May 2, 2019

The technology security landscape looks increasingly dangerous. The problem revolves around the possible impact of AI, which is not yet fully clear. The hope, of course, is that AI will make security more efficient and effective. However, the security bad actors can also jump on AI to advance their own schemes. Like a cyber version of the nuclear arms race, this has been an ongoing battle for decades. The industry has to cooperate, and specifically share information, and hope the good guys can stay a step ahead.

In the meantime, vendors like IBM and, most recently, Syncsort have been stepping up to the latest challenges. Syncsort, for example, earlier this month launched Assure Security to address the increasing sophistication of cyber attacks and expanding data privacy regulations. In surprising ways, it turns out, data privacy and AI are closely related in the AI security battle.

Syncsort, a leader in Big Iron-to-Big Data software, announced Assure Security, which combines access control, data privacy, compliance monitoring, and risk assessment into a single product. Together, these capabilities help security officers, IBMi administrators, and Db2 administrators address critical security challenges and comply with new regulations meant to safeguard and protect the privacy of data.

And it clearly is coming at the right time. According to Privacy Rights Clearinghouse, a non-profit corporation with a mission to advocate for data privacy, there were 828 reported security incidents in 2018, resulting in the exposure of over 1.37 billion records of sensitive data. As regulations to help protect consumer and business data become stricter and more numerous, organizations must build more robust data governance and security programs to keep data from being exploited by bad security actors for nefarious purposes. The industry has already scrambled to comply with GDPR and the New York Department of Financial Services cybersecurity regulations, and it now must prepare for the GDPR-like California Consumer Privacy Act, which takes effect January 1, 2020.

In its own survey Syncsort found security is the number one priority among IT pros with IBMi systems. “Given the increasing sophistication of cyber attacks, it’s not surprising 41 percent of respondents reported their company experienced a security breach and 20 percent more were unsure if they even had been breached,” said David Hodgson, CPO, Syncsort. The company’s new Assure Security product leverages the wealth of IBMi security technology and the expertise to help organizations address their highest-priority challenges. This includes protecting against vulnerabilities introduced by new, open-source methods of connecting to IBMi systems, adopting new cloud services, and complying with expanded government regulations.

Of course, IBM hasn’t been sleeping through this. The company continues to push various permutations of Watson to tackle the AI security challenge. For example, IBM leverages AI to gather insights and use reasoning to identify relationships between threats, such as malicious files, suspicious IP addresses,  or even insiders. This analysis takes seconds or minutes, allowing security analysts to respond to threats up to 60 times faster.

It also relies on AI to eliminate time-consuming research tasks and provides curated analysis of risks, which reduces the amount of time security analysts require to make the critical decisions and launch an orchestrated response to counter each threat. The result, which IBM refers to as cognitive security, combines the strengths of artificial intelligence and human intelligence.

Cognitive AI, in effect, learns with each interaction to proactively detect and analyze threats, providing actionable insights that help security analysts make informed decisions.

Syncsort’s Assure Security specifically brings together best-in-class IBMi security capabilities acquired by Syncsort into an all-in-one solution, with the flexibility for customers to license individual modules. The resulting product includes:

  • Assure  Compliance Monitoring quickly identifies security and compliance issues with real-time alerts and reports on IBMi system activity and database changes.
  • Assure Access Control provides control of access to IBMi systems and their data through a varied bundle of capabilities.
  • Assure Data Privacy protects IBMi data at-rest and in-motion from unauthorized access and theft through a combination of NIST-certified encryption, tokenization, masking, and secure file transfer capabilities.
  • Assure Security Risk Assessment examines over a dozen categories of security values, open ports, power users, and more to address vulnerabilities.
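
To make two of those ideas concrete, here is a toy sketch of data masking and tokenization. This is not Assure Data Privacy's implementation; real products use vaulted token stores or NIST-approved schemes rather than a bare hash:

```python
import hashlib

def mask_value(value, visible=4):
    """Masking: hide all but the last few characters of a sensitive value."""
    return "*" * (len(value) - visible) + value[-visible:]

def tokenize_value(value, secret="demo-secret"):
    """Toy tokenization: replace a value with a deterministic surrogate so
    systems can still join records without seeing the real data."""
    return hashlib.sha256((secret + value).encode()).hexdigest()[:16]

card = "4111111111111111"
masked = mask_value(card)      # '************1111'
token = tokenize_value(card)   # stable surrogate with no card digits exposed
```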

It probably won’t surprise anyone, but the AI security situation is not going to be cleared up soon. Expect a steady stream of headlines around security hits and misses over the next few years. Just hope it gets easier to separate the good guys from the bad actors and that the lessons will be clear.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at technologywriter.com.

IBM Revamps V5000

April 5, 2019

On April 2nd IBM announced several key enhancements across the Storwize V5000 portfolio, along with new models: the V5010E, V5030E, and V5100. (The E stands for Express.) To further complicate the story, the new line utilizes Broadwell, Intel’s 14 nanometer die shrink of its Haswell microarchitecture. Broadwell did not completely replace the full range of CPUs from Intel’s previous Haswell microarchitecture, but IBM is using it widely in the new V5000 models.

IBM NVMe Flash Core Module

And the results can be impressive. From a scale-out perspective the V5010E supports a single controller configuration, while the V5030E and V5100 both support up to two controller clusters. This provides for a maximum of 392 drives in the V5010E and a massive 1520 drives in either the V5030E or V5100 dual controller clusters. The V5030E includes the Broadwell DE 1.9GHz, 6 core processor in its two canisters. Each canister supports a maximum of 32GB of RAM. Better still, the V5100 boasts a single Skylake 1.7Ghz processor with 8 cores in each canister. RAM is increased to a total of 576GB for the entire controller, or 288GB maximum per canister.

For the next generation of Storwize V5000 platforms, IBM is encouraging the name Gen3. Gen3 encompasses 8 new MTMs (Machine Type Models) based on 3 hardware models: the V5010E, V5030E, and V5100. The V5100 comes in two variants, a hybrid (HDD and flash) model and the all-flash V5100F. Each of these 4 types is available with a 1-year or 3-year warranty.

The V5000E models are based on the Gen2 hardware with various enhancements, including more memory options on the V5010E. The V5100 models are all new hardware and bring the same NVMe Flash Core Modules (FCM) that are available on the V7000 and FlashSystem 9100 products, completing the Storwize family’s transition to all-NVMe arrays. If you haven’t seen or heard about IBM’s FCM technology, introduced last year: FCMs are a family of high-performance flash drives that use the NVMe protocol, a PCIe Gen3 interface, and high-speed NAND memory to provide high throughput and IOPS with very low latency. FCMs are available in 4.8 TB, 9.6 TB, and 19.2 TB capacities. Hardware-based data compression and self-encryption are built in.

The all-flash (F) variants of the V5000 can also attach SAS expansions using SAS-based flash drives, extending capacity up to 1520 drives. The drives, however, are not interchangeable with the new FCM drives. The E variants allow attachment of 2.5” and 3.5” SAS HDDs, with the V5010E expandable to 392 drives and the others up to 1520.
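
Some back-of-the-envelope arithmetic shows why those drive counts matter. The figures below naively assume every one of the 1520 slots could hold the largest module, which overstates real configurations (the SAS expansions take SAS flash, not FCMs):

```python
# Raw-capacity upper bounds from the figures quoted above (illustrative only)
MAX_DRIVES = 1520                      # dual-controller V5030E/V5100 cluster
MODULE_SIZES_TB = [4.8, 9.6, 19.2]     # FCM capacity points

raw_pb = {tb: MAX_DRIVES * tb / 1000 for tb in MODULE_SIZES_TB}  # TB -> PB
for tb, pb in sorted(raw_pb.items()):
    print(f"{tb:>5} TB modules x {MAX_DRIVES} drives = {pb:,.1f} PB raw")
# The 19.2 TB module tops out around 29.2 PB raw, before compression.
```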

Inbuilt host attachments come in the form of 10GbE ports for iSCSI workloads, with optional 16Gbit Fibre Channel (SCSI or FC-NVMe) as well as additional 10GbE or 25GbE iSCSI. The V5100 models can also use the iSER protocol (an iSCSI translation layer for operation over RDMA transports, such as InfiniBand) over the 25GbE ports for clustering capability, with plans to support NVMe over Fabrics on Ethernet. In terms of cache memory, the V5000E products are expandable up to 64GB per controller (I/O group) and the V5100 can support up to 576GB per controller. IBM also issued a statement of direction covering all 25GbE port types across the entire Spectrum Virtualize family of products.

As Lloyd Dean, IBM Senior Certified Executive IT Architect, noted, the new V5000 lineup is impressive; the quantity of drives and the storage available per model will “blow your mind.” How mind-blowing will depend, of course, on your configuration and IBM’s pricing. As usual, IBM talks about affordability and comparative cost and storage efficiency, but it almost never states a price. It did once, 3 years ago: list price for the V5010 was then $9,250 including hardware, software, and a one-year warranty, according to a published report. Today IBM will likely steer you to cloud pricing, which may or may not be a bargain depending on how the deal is structured and priced. With the cloud, everything is in the details.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at technologywriter.com.

IBM Rides Quantum Volume to Quantum Advantage

March 19, 2019

Recently IBM announced achieving its highest Quantum Volume to date. Of course, nobody else knows what Quantum Volume is. Quantum Volume is both a measurement and a procedure developed, no surprise here, by IBM to determine how powerful a quantum computer is. Read the May 4 announcement here.

Quantum volume is not just about the number of qubits, although that is one part of it. It also includes both gate and measurement errors, device cross talk, as well as device connectivity and circuit compiler efficiency. According to IBM, the company has doubled the power of its quantum computers annually since 2017.

The upgraded processor will be available for use by developers, researchers, and programmers to explore quantum computing using a real quantum processor at no cost via the IBM Cloud. This offer has been out in various forms since May 2016 as IBM’s Q Experience.

Also announced was a new prototype of a commercial processor, which will be the core for the first IBM Q early-access commercial systems.  Dates have only been hinted at.

IBM recently unveiled its IBM Q System One quantum computer with a fourth-generation 20-qubit processor, which has produced a Quantum Volume of 16, roughly double that of the current IBM Q 20-qubit device, which has a Quantum Volume of 8.

The Q volume math goes something like this: a variety of factors determine Quantum Volume, including the number of qubits, connectivity, and coherence time, plus accounting for gate and measurement errors, device cross talk, and circuit software compiler efficiency.
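
Under IBM's published definition, Quantum Volume works out to roughly 2^n, where n is the largest circuit width at which the machine can still successfully run "square" model circuits of equal depth. A toy calculation under that reading, with made-up benchmark results:

```python
def quantum_volume(max_passing_depth):
    """Toy Quantum Volume: 2**n for the largest width n at which the device
    still passes model circuits at least n layers deep (square circuits).
    `max_passing_depth` maps circuit width -> deepest passing depth."""
    n = 0
    for width, depth in max_passing_depth.items():
        if depth >= width:
            n = max(n, width)
    return 2 ** n

# Hypothetical benchmark results: deepest passing circuit per width
results = {2: 6, 3: 4, 4: 4, 5: 3}
qv = quantum_volume(results)   # width 4 still passes depth 4, so QV = 2**4 = 16
```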

In addition to producing the highest Quantum Volume to date, IBM Q System One’s performance reflects some of the lowest error rates IBM has ever measured, with an average 2-qubit gate error less than 2 percent, and its best gate achieving less than a 1 percent error rate. To build a fully-functional, large-scale, universal, fault-tolerant quantum computer, long coherence times and low error rates are required. Otherwise how could you ever be sure of the results?

Quantum Volume is a fundamental performance metric that measures progress in the pursuit of Quantum Advantage, the quantum holy grail: the point at which quantum applications deliver a significant, practical benefit beyond what classical computers alone are capable of. To achieve Quantum Advantage in the next decade, IBM believes that the industry will need to continue to double Quantum Volume every year.

Sounds like Moore’s Law all over again. IBM doesn’t deny the comparison. It writes: in 1965, Gordon Moore postulated that the number of components per integrated function would grow exponentially for classical computers. Jump to the new quantum era and IBM notes its Q system progress since 2017 presents a similar early growth pattern, supporting the premise that Quantum Volume will need to double every year and presenting a clear roadmap toward achieving Quantum Advantage.

Potential use cases, such as precisely simulating battery-cell chemistry for electric vehicles, speeding quadratic derivative models, and many others are already being investigated by IBM Q Network partners. To achieve Quantum Advantage in the 2020s, IBM believes the industry will need to continue doubling Quantum Volume every year.

In time AI should play a role expediting quantum computing.  For that, researchers will need to develop more effective AI that can identify patterns in data otherwise invisible to classical computers.

Until then, how should most data centers proceed? IBM researchers suggest 3 initial steps:

  1. Develop quantum algorithms that demonstrate how quantum computers can improve AI classification accuracy.
  2. Improve feature mapping to a scale beyond the reach of the most powerful classical computers.
  3. Classify data through the use of short-depth circuits, allowing AI applications in the NISQ (noisy intermediate-scale quantum) regime and a path forward to achieve quantum advantage for machine learning.

Sounds simple, right? Let DancingDinosaur know how you are progressing.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at technologywriter.com.

IBM Joins with Harley-Davidson for LiveWire

March 1, 2019

I should have written this piece 40 years ago as a young man fresh out of grad school. Then I was dying for a 1200cc Harley Davidson motorcycle. My mother was dead set against it—she wouldn’t even allow me to play tackle football and has since been vindicated (You win on that, mom.). My father, too, was opposed and wouldn’t help pay for it. I had to settle for a puny little motor scooter that offered zero excitement.

In the decades since I graduated, Harley’s fortunes have plummeted as the HOG (Harley Owners Group) community aged out and few youngsters have picked up the slack. The 1200cc bike I once lusted after probably is now too heavy for me to handle. So, what is Harley to do? Redefine its classic American brand with an electric model, LiveWire.

Courtesy: Harley Davidson, IBM

With LiveWire, Harley expects to remake the motorcycle as a cloud-connected machine and promises to deliver new products for fresh motorcycle segments, broaden engagement with the brand, and strengthen the H-D dealer network. It also boldly proclaimed that Harley-Davidson will lead the electrification of motorcycling.

According to the company, Harley’s LiveWire will leverage H-D Connect, a service (available in select markets) built on the IBM Cloud. This will enable it to deliver new mobility and concierge services today and to leverage an expanding use of IBM AI, analytics, and IoT to enhance and evolve the rider’s experience. To capture this next generation of bikers, Harley is working with IBM to transform the everyday experience of riding through the latest technologies and features IBM can deliver via the cloud.

Would DancingDinosaur, an aging Harley enthusiast, plunk down the thousands it would take to buy one of these? Since I rarely use my smartphone to do anything more than check email and news, I am probably not a likely prospect for LiveWire.

Will LiveWire save Harley? Maybe; it depends on what the promised services will actually deliver. Already, I can access a wide variety of services through my car but, other than Waze, I rarely use any of those.

According to the joint IBM-Harley announcement, a fully cellular-connected electric motorcycle needed a partner that could deliver mobility solutions to meet riders’ changing expectations as well as enhance security. With IBM, Harley hopes to balance using data to create intelligent, personal experiences against maintaining privacy and security, said Marc McAllister, Harley-Davidson VP of Product Planning and Portfolio, in the announcement.

So, based on this description, are you ready to jump to LiveWire? You probably need more details. So far, IBM and Harley have identified only three:

  1. Powering The Ride: LiveWire riders will be able to check bike vitals at any time and from any location. Available information includes range, battery health, and charge level. Motorcycle status features will also support the needs of an electric bike, such as the location of charging stations, which could prove genuinely useful. Riders can also see their motorcycle’s current map location.
  2. Powering Security: An alert will be sent to the owner’s phone if the motorcycle has been bumped, tampered with, or moved. GPS-enabled stolen-vehicle assistance will provide peace of mind that the motorcycle’s location can be tracked. (Requires law enforcement assistance; available in select markets.)
  3. Powering Convenience: Riders will receive reminders about upcoming motorcycle service requirements and other care notifications, as well as automated service reminders and safety or recall notifications.
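Neither H-D nor IBM has published an API for any of this, so purely as illustration, here is a hypothetical sketch in Go of what the client side of the “Powering The Ride” vitals check above might look like. Every type, field, and number here is invented to mirror the announced features; none of it comes from an actual H-D Connect interface.

```go
package main

import "fmt"

// BikeVitals is a hypothetical model of the telemetry H-D Connect
// might report; the fields mirror the announced features, not a
// published API.
type BikeVitals struct {
	ChargePercent float64 // current battery charge, 0-100
	BatteryHealth float64 // capacity remaining vs. new, 0-100
	Tampered      bool    // true if the bike was bumped or moved
}

// estimateRangeKM projects remaining range from the charge level,
// given an assumed full-charge range for the bike.
func estimateRangeKM(v BikeVitals, fullRangeKM float64) float64 {
	return v.ChargePercent / 100 * fullRangeKM
}

func main() {
	v := BikeVitals{ChargePercent: 50, BatteryHealth: 95}
	fmt.Printf("estimated range: %.0f km\n", estimateRangeKM(v, 150)) // prints "estimated range: 75 km"
	if v.Tampered {
		fmt.Println("ALERT: motorcycle bumped, tampered with, or moved")
	}
}
```

The real service would presumably pull these readings from the bike over the cellular link via the IBM Cloud; the point here is only how simple the rider-facing logic could be once the telemetry arrives.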

“The next generation of Harley-Davidson riders will demand a more engaged and personalized customer experience,” said Venkatesh Iyer, Vice President, North America IoT and Connected Solutions, Global Business Services, IBM. Introducing enhanced capabilities via the IBM Cloud, he continues, will not only enable new services immediately but also provide a roadmap for the journey ahead. (Huh?)

As much as DancingDinosaur aches for Harley to come roaring back with a story that will win the hearts of the HOG users who haven’t already drifted away, Harley will need more than the usual buzzwords, trivial apps, and cloud hype.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at technologywriter.com.

Meet SUSE Enterprise Linux Server 12

February 25, 2019

A surprising amount of competition has emerged lately for Linux on the mainframe, but SUSE remains at the top of the heap. Its newest release, SUSE Linux Enterprise 12, shipped last fall and should secure that position for some time to come.

SUSE touts SLE 12 as the latest version of its reliable, scalable, and secure platform for efficiently deploying and managing highly available enterprise-class IT services in physical, virtual, or cloud environments. New products based on SLE 12 feature enhancements that should allow for better system uptime, improved operational efficiency, and accelerated innovation. As the foundation for all SUSE data center operating systems and extensions, according to the company, SUSE Linux Enterprise meets the performance requirements of data centers with mixed IT environments while reducing the risk of technological obsolescence and vendor lock-in.

With SLE 12 the company also introduces an updated customer portal, SUSE Customer Center, to make it easier for customers to manage their subscriptions, access patches and updates, and communicate with SUSE customer support. It promises a new way to manage a SUSE account and subscriptions via one interface, anytime, anywhere.

Al Gillen, program vice president for servers and system software at IDC, said, “The industry is seeing growing movement of mission-critical workloads to Linux, with that trend expected to continue well into the future.” For Gillen, the modular design of SLE 12, as well as other mission-critical features like full system rollback and live kernel patching, helps address some of the key reservations customers express, and should help accelerate the adoption of Linux on z.

It’s about time. Linux has been available on the z for 20 years, yet only with the introduction of IBM LinuxONE a couple of years ago did IBM get serious about Linux on z.

And it didn’t stop there. IBM ported the Go programming language to LinuxONE. Go was developed by Google and is designed for building simple, reliable, and efficient software, making it easier for developers to combine the software tools they know with the speed, security, and scale offered by LinuxONE. As expected, IBM has contributed code to the Go community.
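The port itself isn’t documented here, but for readers who haven’t seen the language, this toy example gives a fair flavor of Go’s “simple, reliable, efficient” pitch: it splits a sum across two goroutines and combines the partial results over a channel. Nothing in it is IBM- or z-specific.

```go
package main

import "fmt"

// sum adds the numbers in nums and sends the total down out.
func sum(nums []int, out chan<- int) {
	total := 0
	for _, n := range nums {
		total += n
	}
	out <- total
}

// concurrentSum splits nums in half, sums each half in its own
// goroutine, and combines the two partial totals.
func concurrentSum(nums []int) int {
	out := make(chan int)
	mid := len(nums) / 2
	go sum(nums[:mid], out)
	go sum(nums[mid:], out)
	return <-out + <-out
}

func main() {
	fmt.Println(concurrentSum([]int{1, 2, 3, 4, 5, 6})) // prints 21
}
```

The channel does the synchronization that would take locks and condition variables in many other languages, which is a good sample of why Go appeals for reliable systems software.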

Then IBM brought Apple’s Swift programming to the party, first through the IBM Watson iOS SDK, which gives developers a Swift API to simplify integration with many of the Watson Developer Cloud services, all of which are available today and can be integrated with just a few lines of code. Soon after Apple introduced Swift as the new language for OS X and iOS application development, IBM began partnering with Apple to bring the power of Swift open source programming to the z, an effort closely tied to Canonical’s Ubuntu port to the z, which has already been released.

With SUSE Linux Enterprise Server 12 for x86_64, IBM Power Systems, and IBM System z, SUSE has boosted the platform’s versatility; it can deliver business-critical IT services in a variety of physical, virtual, and cloud environments. New features like full system rollback, live kernel patching, and software modules increase data center uptime, improve operational efficiency, and accelerate the adoption of open source innovation. SLES 12 further builds on SUSE’s leadership in Linux Containers technology and adds the Docker framework, now included as an integral part of the operating system.


IBM Enhances Storage for 2019

February 14, 2019

It has been a while since DancingDinosaur last looked closely at IBM’s storage efforts. The latest 4Q18 storage briefing actually was held on Feb. 5, 2019, and was followed by more storage announcements on 2/11 and 2/12. For your sake, this blog will not delve into each of these many announcements. You can, however, find them at the previous link.

Sacramento-San Joaquin River Delta–IBM RESEARCH

As IBM likes to say whenever it is trying to convey the value of data: “data is more valuable than oil.”  Maybe it is time to update this to say data is more valuable than fresh, clean water, which is quickly heading toward becoming the most precious commodity on earth.

IBM CEO Ginni Rometty says it yet another way: “80% of the world’s data, whether it’s decades of underwriting, pricing, customer experience, risk in loans… That is all with our clients. You don’t want to share it. That is gold,” maybe even more valuable than, say, fresh water. Whatever metaphor you choose, gold, clean water, oil, or something else you perceive as priceless, this represents to IBM the value of data. To preserve that value, this data must be economically stored, protected, made accessible, analyzed, and selectively shared. That’s where IBM storage comes in.

And IBM storage has been on a modest multi-year growth trend. Since 2016, IBM reports shipping 700 new NVMe systems, 850 VersaStack systems, 3,000 DS8880 systems, and 5,500 PB of capacity; attracting 6,800 new IBM Spectrum (virtualized) storage customers; and selling 3,000 Storwize all-flash systems, along with shipping 12,000 all-flash arrays.

The bulk of the 2/5 storage announcements fell into 4 areas:

  1. IBM storage for containers and cloud
  2. AI storage
  3. Modern data protection
  4. Cyber resiliency

Except for modern data protection, much of this may be new to Z and Power data centers. However, some of the new announcements will interest Z shops. In particular, 219-135, a statement of direction: IBM intends to deliver Managed-from-Z, a new feature of IBM Cloud Private for Linux on IBM Z. This will enable organizations to run and manage IBM Cloud Private applications from IBM Linux on Z or LinuxONE platforms. The new capability furthers IBM’s commitment to deliver multi-cloud and multi-architecture cloud-native technologies on the platform of the customer’s choice. Watson, too, will now be available on more platforms through the newly announced Watson Anywhere, a version of IBM’s cognitive platform that can run Watson on-premises, in IBM’s cloud, or on any other cloud, private or public.

Another interesting addition to the IBM storage line is the FlashSystem 9100. As IBM explains it, the FlashSystem 9100 combines the performance of flash and end-to-end Non-Volatile Memory Express (NVMe) with the reliability and innovation of IBM FlashCore technology and the rich features of IBM Spectrum Virtualize, all packed into a 2U enterprise-class storage system. Providing intensive, data-driven, multi-cloud storage capacity, FlashSystem 9100 is deeply integrated with the software-defined (virtualized) capabilities of IBM Spectrum Storage, allowing organizations to easily add the multi-cloud solutions that best support their business.

Finally, 219-029: IBM Spectrum Protect V8.1.7 and IBM Spectrum Protect Plus V10.1.3 deliver new application support and optimization for long-term data retention. Think of it this way: as the value of data increases, you will want to retain and protect more data, in more ways, for longer and longer. For this you will want the kind of flexible and cost-efficient storage available through Spectrum Protect.



Are Your Security Systems Reliable?

January 17, 2019

How confident are you in your security systems? Just a glance at reports of data losses should temper any confidence you may have. Verizon’s 2018 Data Breach Investigations Report (DBIR), as it does every year, should serve as a wake-up call. Or as the report writers put it: identifying 53K+ incidents in only 12 months suggests an information security dystopia, an uneven playing field where the bad guys consistently win out. The 2018 report, in that regard, is not much different from previous years’ reports.

Syncsort, a leading mainframe ISV, released the results of its own security survey of 300 respondents. It found that 85 percent of respondents are either very or somewhat confident in their organization’s security program, although 41 percent said their company had experienced a security breach and another 20 percent were unsure. I’d be more nervous about the 20% who weren’t sure than the 41% who, at least, had identified a security breach. You can find Syncsort’s security announcement here.

Top security related challenges, courtesy Syncsort

 

To Syncsort, a particularly interesting challenge appeared to come from new data sources. Specifically, only seven percent of respondents were familiar with newer but widely adopted data storage options like Hadoop data lakes.

Cloud and compliance definitely are not new security challenges. Yet, they show up in the Syncsort survey:

  • Twenty-eight percent of respondents named adoption of cloud services as their top security-related challenge, followed by growing complexity of regulations (20%) and insufficient IT security staffing (19%).
  • The regulation most respondents had to adhere to was GDPR (37%), followed by HIPAA and SOX (32% each).

Fortunately, security (42%) and cloud computing (35%) are organizations’ top two IT priorities in the coming year. It probably is too much, however, to expect management to increase security staffing until the organization finds its own breach on the front page of a large daily like the New York Times or the Wall Street Journal. That is the corporate equivalent of shutting the proverbial gate after the horses (or data) have left.

So who are the bad guys you are up against? Verizon has an answer: almost three-quarters (73%) of cyberattacks were perpetrated by outsiders. Members of organized criminal groups were behind half of all breaches, with nation-state or state-affiliated actors involved in 12%.

But don’t get complacent. Insider threats may be the hardest to defend against. Over a quarter (28%) of attacks involved insiders. The insider threat can be particularly difficult to guard against, as Verizon points out, since it can be difficult to spot the signs if someone is using their legitimate access to your data for nefarious purposes. Or to put it another way, the chances are you should be more nervous about a disgruntled employee than about a North Korean agent.

Similarly, audit security regularly. Most organizations audit only annually, but a few audit more frequently, and more frequent audits lead to better security. As Syncsort found, 32 percent of responding organizations perform security audits only annually, while 23 percent do so every three months and 19 percent every six months.

The areas you examine in audits also can help you improve security effectiveness. For example, Syncsort survey responders were most likely to examine application security (72%), backup/disaster recovery processes (70%), network security (69%), and antivirus programs and password policies (67% each).


If you have had a chance to periodically review the various Verizon Data Breach Investigations Reports, you won’t be surprised to learn that organizations continue to experience data breaches. Specifically, Syncsort found that 41 percent of organizations have experienced data breaches, while 39 percent have not. However, 20 percent say they don’t know. Kinda scary.


 

