
IBM Elastic Storage System 5000: AI Utility Storage Option

July 27, 2020

The newest storage from IBM is the Elastic Storage System 5000. It promises, as you would expect, leading performance, density, and scalability, but that's not the most interesting part of IBM's July and August storage offerings. In conjunction with the new storage hardware, IBM is tapping AI through IBM Spectrum Storage, a familiar product, to create a smart utility storage option. This allows you to define your current and future storage needs on day one and deploy it all that day, while paying only for the capacity you are actually using; you activate more only when you need it and pay for it only once you have activated it.

AI Elastic Storage Systems via Spectrum 

Will that save you much money? Maybe, but what it will mainly save you is time, because you skip the entire process of ordering and installing the extra capacity. DancingDinosaur guesses it will also save you money if you previously over-ordered capacity at the outset and paid for it all then. IBM says some customers actually do that, but there is no longer any good reason to pay for capacity in advance.

Yes, data volumes continue to grow at explosive rates. And yes, a prudent IT manager does not want to suddenly find the company in a position where a lack of sufficient storage is constraining the performance of critical applications.

But a prudent data center manager should never be in that position. DancingDinosaur was always taught that the selective use of data compression can free up some top-tier storage capacity, even on very short notice. And who doesn't also have at least a few old storage arrays hanging around that can be put back into use to ease a sudden crunch, even if only briefly? OK, it won't be the fastest or best storage, but it could work for a short time at least.

Of course, IBM puts a somewhat different spin on the situation. It explains: For the last 30 years, the standard method has been for organizations to calculate their current application capacity needs and then guess the rest, hoping the guess is enough to meet future needs. The organization then works up an RFQ, RFP, or some other tortuous procurement exercise to get the capacity it needs as quickly as possible. It invites all the vendors it knows, and some it doesn't, to pitch their solutions, at which point the organization usually finds it will need much more capacity than originally thought, or budgeted.

And IBM continues: Then, only a few months later, the organization realizes its needs have changed and that what it originally requested is no longer adequate. A year past the initial start, the new capacity is finally in place, and the organization hopes it won't have to go through the same process again next year.

Does that sound like you? DancingDinosaur hopes not, at least not in 2020. 

Maybe it does, if your storage environment is large, with more than 250 TB of storage and growing, IBM notes. Depending on how you specified the system initially, additional capacity through its elastic storage program is instantly available; you simply provision what you need. From a billing standpoint, it lets you move what would otherwise be high upfront costs to a predictable quarterly charge tied directly to your business activity.
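To make that arithmetic concrete, here is a minimal sketch in Python of how a pay-for-what-you-activate quarterly charge compares to buying the full planned capacity on day one. All of the rates, capacities, and growth figures below are invented for illustration; IBM's actual pricing terms are not spelled out in its announcement.

# Hypothetical comparison of utility (pay-as-you-activate) storage billing
# versus purchasing the full planned capacity upfront. All figures are invented.

PLANNED_TB = 500            # total capacity installed on day one
RATE_PER_TB_QUARTER = 30    # fictional utility charge per activated TB per quarter
UPFRONT_PER_TB = 400        # fictional one-time purchase price per TB

# Capacity actually activated in each quarter over two years
activated_tb_by_quarter = [250, 280, 310, 350, 390, 420, 460, 500]

# Utility model: each quarter you are billed only for what is activated
utility_total = sum(tb * RATE_PER_TB_QUARTER for tb in activated_tb_by_quarter)

# Traditional model: the whole 500 TB is paid for at the start
upfront_total = PLANNED_TB * UPFRONT_PER_TB

print(f"Utility model, two years of quarterly charges: ${utility_total:,}")
print(f"Upfront purchase of full planned capacity:     ${upfront_total:,}")

The particular dollar figures are beside the point; what matters, as IBM frames it, is that the unactivated capacity is already on the floor and can be turned on without another procurement cycle.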

DancingDinosaur has long felt that IBM Spectrum Storage delivered a nifty set of capabilities even before AI became the current fad. If you can take advantage of it to shortcut the storage acquisition and provisioning process while holding onto a few bucks a little longer, what's not to like?

Alan Radding, a veteran information technology analyst, writer, and ghost-writer, is DancingDinosaur. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at http://technologywriter.com/

