Posts Tagged ‘System z’

BMC’s AMI Brings Machine Learning to Z

November 9, 2018

On October 18 BMC announced AMI (Automated Mainframe Intelligence), a capability that promises higher-performing, self-managing mainframe environments to meet the growing demands created by digital business growth, and to do it through AI-like capabilities.

AMI delivers a self-managing mainframe

BMC’s AMI solutions combine built-in domain expertise, machine learning, intelligent automation, and predictive analytics to help enterprises automatically manage, diagnose, heal, secure, and optimize mainframe processes. BMC doesn’t actually call it AI, but it attributes all the AI buzzwords to it.

BMC cited Gartner’s prediction that by 2020, thirty percent of data centers that fail to apply artificial intelligence and machine learning effectively in support of enterprise business will cease to be operationally and economically viable. BMC is tapping machine learning, in conjunction with its analysis of dozens of KPIs and millions of metrics a day, to proactively identify, predict, and fix problems before they become an issue. In the process, BMC intends to relieve the burden on enterprise teams and free up IT staff to work on high-value initiatives by removing manual processes through intelligent automation. Ultimately, the company hopes to keep its customers, as Gartner put it, operationally and economically viable.
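BMC hasn’t published how AMI’s models actually work, so treat the following as a generic sketch of the underlying idea rather than BMC’s method: learn each KPI’s normal band from recent history and flag samples that drift outside it. The window size, threshold, and metric feed are all illustrative assumptions.

```python
# Illustrative sketch only (not BMC's implementation): a rolling z-score
# check of the kind AMI-style tooling might apply to each mainframe KPI.
from collections import deque
from statistics import mean, stdev

class KpiMonitor:
    """Flags KPI samples that fall outside a learned 'normal' band."""

    def __init__(self, window=288, threshold=3.0):
        # window=288 assumes 5-minute samples, i.e., 24 hours of history
        self.samples = deque(maxlen=window)
        self.threshold = threshold

    def observe(self, value):
        """Return True if value deviates abnormally from recent history."""
        anomalous = False
        if len(self.samples) >= 30:  # wait for enough history to be meaningful
            mu, sigma = mean(self.samples), stdev(self.samples)
            anomalous = sigma > 0 and abs(value - mu) / sigma > self.threshold
        self.samples.append(value)
        return anomalous

# Hypothetical usage with a made-up CPU-busy feed:
cpu_busy = KpiMonitor()
for pct in [38, 41, 40, 39, 42] * 10 + [97]:
    if cpu_busy.observe(pct):
        print("CPU busy outside learned normal band:", pct)
```

Real products layer seasonality, multivariate correlation, and domain rules on top of this, but the core loop, baseline then flag deviations, is the same.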

In effect, mainframe-based organizations can benefit from BMC’s expertise in collecting deep and broad z/OS operational metrics from a variety of industry data sources, built-in world-class domain expertise, and multivariate analysis.

A lot of this already is available in the Z itself through a variety of tools, particularly zAware, described by IBM as a firmware feature consisting of an integrated set of analytic applications that monitor software running on z/OS and model normal system behavior. Its pattern recognition techniques identify unexpected messages, providing rapid diagnosis of problems caused by system changes.

But BMC is adding two new ingredients that should take this further, Autonomous Solutions and Enterprise Connectors.

Autonomous Solutions promise to enable IT operations that automatically anticipate and repair performance degradations and disruptive outages before they occur, without manual intervention. This set of intelligent, integrated solutions encompasses BMC AMI for Security Management, BMC AMI for DevOps, BMC AMI for Performance and Availability Management, and BMC AMI Cost and Capacity Management.

Enterprise Connectors move business-critical data from the mainframe to the entire enterprise and simplify the enterprise-wide management of business applications. The connectors promise a complete view of enterprise data by streaming mainframe metrics and related information in real time to a variety of data receivers, including leading Security Information and Event Management (SIEM) solutions such as Splunk, IBM QRadar, ArcSight, LogRhythm, McAfee Enterprise Security Manager, and others. Note: BMC’s AMI Data Extractor for IMS is available now; additional extractors will follow early in 2019.
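BMC’s connectors themselves are proprietary, but the plumbing they describe, streaming mainframe events into a SIEM receiver, is easy to picture. Here is a minimal sketch that posts one event to Splunk’s HTTP Event Collector; the host, token, and event fields are placeholders, and this is in no way BMC’s code.

```python
# Generic sketch (not BMC's connector): forward a mainframe security event
# to Splunk's HTTP Event Collector (HEC). URL and token are placeholders.
import json
import urllib.request

HEC_URL = "https://splunk.example.com:8088/services/collector/event"
HEC_TOKEN = "00000000-0000-0000-0000-000000000000"  # placeholder token

def send_event(event: dict, sourcetype: str = "mainframe:security") -> None:
    """Forward one event to Splunk's HTTP Event Collector."""
    payload = json.dumps({"event": event, "sourcetype": sourcetype}).encode()
    req = urllib.request.Request(
        HEC_URL,
        data=payload,
        headers={"Authorization": f"Splunk {HEC_TOKEN}",
                 "Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)  # raises HTTPError on a non-2xx response

send_event({"system": "PRODPLEX", "type": "logon_failure", "user": "QA123"})
```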

To bolster its mainframe business further, BMC in early October announced the acquisition of the assets of CorreLog, Inc., which provides real-time security management to mainframe customers. Combined with BMC’s offerings in systems, data, and cost management, it enables end-to-end solutions that ensure the availability, performance, and security of mission-critical applications and data residing on today’s modern mainframe. CorreLog brings capabilities for security and compliance auditing professionals who need more advanced network and system security and improved adherence to key industry standards for protecting data.

The combination of CorreLog’s security offerings with BMC’s mainframe capabilities provides organizations with enhanced security capabilities including:

  • Real-time visibility into security events from mainframe environments, delivered directly into SIEM/SOC systems.
  • A wide variety of security alerts, including for IBM IMS and Db2.
  • Event log correlation, which provides up-to-the-second security notifications for faster remediation in the event of a breach.
  • A 360-degree view of mainframe threat activity.

The CorreLog deal is expected to close later this quarter.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at technologywriter.com.

IBM Takes Red Hat for $34 Billion

November 2, 2018

“The acquisition of Red Hat is a game-changer. It changes everything about the cloud market,” declared Ginni Rometty, IBM Chairman. At a cost of $34 billion, 10x Red Hat’s gross revenue, it had better be a game changer. See IBM’s announcement earlier this week here.

IBM Multicloud Manager Dashboard

IBM has been hot on the tail of the top three cloud hyperscalers—AWS, Google, and Microsoft/Azure. Will this change the game? Your guess is as good as anyone’s.

The hybrid cloud market appears to be IBM’s primary target. As the company put it: “IBM will become the world’s #1 hybrid cloud provider, offering companies the only open cloud solution that will unlock the full value of the cloud for their businesses.” IBM projects the value of the hybrid cloud market at $1 trillion within a few years!

Most companies today are only 20 percent along their cloud journey, renting compute power to cut costs. The next chapter of the cloud, noted Rometty, requires shifting business applications to hybrid cloud, extracting more data, and optimizing every part of the business.

Nobody has a lock on this market yet. Not IBM, not Red Hat, not VMware. But one thing seems clear: whoever wins will involve open source. Red Hat, with $3 billion in open source revenue, has proven that open source can pay. The only question is how quickly it can pay back IBM’s $34 billion bet.

What’s needed is something that promotes data portability and applications across multiple clouds, data security in a multi-cloud environment, and consistent cloud management. This is the Red Hat and IBM party line.  Both believe they will be well positioned to address these issues to accelerate hybrid multi-cloud adoption. To succeed at this, the new entity will have to tap their leadership in Linux, containers, Kubernetes, multi-cloud management, and automation.

IBM first brought Linux to the Z 20 years ago, making it an early advocate of open source and a collaborator with Red Hat in growing enterprise-class Linux. More recently the two companies worked to bring enterprise Kubernetes and hybrid cloud solutions to the enterprise. These innovations have become core technologies within IBM’s $19 billion hybrid cloud business.

The initial announcement made the point that Red Hat will join IBM’s Hybrid Cloud team as a distinct unit, preserving, as IBM described it, the independence and neutrality of Red Hat’s open source development heritage and commitment, current product portfolio, go-to-market strategy, and unique development culture. Red Hat will also continue to be led by Jim Whitehurst and its current management team.

That camaraderie lasted until the Q&A following the announcement, when a couple of disagreements arose from different answers on relatively trivial points. Are you surprised? Let’s be clear: nobody spends $34 billion on a $3 billion asset and gives it a completely free hand. You can bet IBM will be calling the shots on everything it feels is important. Would you do less?

Dharmesh Thakker, a contributor to Forbes, focused more on Red Hat’s OpenShift family of development software. These tools make software developers more productive and are helping transform how software is created and implemented across most enterprises today. So “OpenShift is likely the focus of IBM’s interest in Red Hat,” he observes.

A few years ago, he continued, the pendulum seemed to shift from companies deploying more traditional, on-premises datacenter infrastructure to using public cloud vendors, mostly Amazon. In the last few years, though, most mission-critical apps inside companies have continued to run on a private cloud, modernized by agile tools and microservices to speed innovation. Private cloud represents 15-20% of datacenter spend, Thakker reports, but the combination of private plus one or more public clouds, the hybrid cloud, is here to stay, especially for enterprises. Red Hat’s OpenShift technology enables on-premises, private cloud deployments, giving IBM the ability to play in the hybrid cloud.

IBM isn’t closing this deal until well into 2019; expect to hear more about this in the coming months.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at technologywriter.com.

GAO Blames Z for Government Inefficiency

October 19, 2018

Check out the GAO report from May 2016 here. The Feds spent more than 75 percent of the total amount budgeted for information technology (IT) for fiscal year 2015 on operations and maintenance (O&M). Unfortunately, the GAO uses the word “mainframe” to lump outdated UNISYS mainframes together with the modern, supported, and actively developed IBM Z, notes Ross Mauri, IBM general manager, Z systems.

Mainframes-mobile in the cloud courtesy of Compuware

COBOL, meanwhile, remains the subject of active skills and training programs at many institutions and receives investment across many industries. In addition to COBOL, the IBM z14 also runs Java, Swift, Go, Python, and other open languages to enable modern application enhancement and development. Does the GAO know that?

In a recent report, the GAO recommends moving to supported, modern hardware. IBM agrees. The Z, however, does not expose mainframe investments to a rise in procurement and operating costs, nor to skilled-staff issues, Mauri continued.

Three investments the GAO reviewed in operations and maintenance clearly appear as legacy investments facing significant risks due to their reliance on obsolete programming languages, outdated hardware, and a shortage of staff with critical skills. For example, the IRS reported that it used assembly language code and COBOL (both developed in the 1950s) for IMF and IDRS. What are these bureaucrats smoking?

The GAO also seems confused over the Z and the cloud. IBM Cloud Private is designed to run on Linux-based Z systems to take full advantage of the cloud through open containers while retaining the inherent benefits of Z hardware—security, availability, scalability, reliability; all the “ities” enterprises have long relied on the Z for. The GAO seems unaware that the Z’s pervasive encryption automatically encrypts everything at rest or in transit. Furthermore, the GAO routinely characterizes COBOL as a deficiency, while ISVs and other signatories of the Open Letter consider it a modern, optimized, and actively supported programming language.

The GAO apparently isn’t even aware of IBM Cloud Private, which is compatible with leading IT systems manufacturers’ hardware and has been optimized for IBM Z. All you need to get started with the cloud is the starter kit available for IBM OpenPOWER LC (Linux) servers, enterprise Power Systems, and Hyperconverged Systems powered by Nutanix. You don’t even need a Z; just buy a low-cost OpenPOWER LC (Linux) server online and configure it as desired.

Here is part of the letter that Compuware sent to the GAO, Federal CIOs, and members of Congress. It’s endorsed by several dozen members of the IT industry. The full letter is here:

In light of a June 2018 GAO report to the Internal Revenue Service suggesting the agency’s mainframe- and COBOL-based systems present significant risks to tax processing, we the mainframe IT community—developers, scholars, influencers and inventors—urge the IRS and other federal agencies to:

  • Reinvest in and modernize the mainframe platform and the mission-critical applications which many have long relied upon.
  • Prudently consider the financial risks and opportunity costs associated with rewriting and replacing proven, highly dependable mainframe applications, for which no “off-the-shelf” replacement exists.
  • Understand the security and performance requirements of these mainframe applications and data and the risk of migrating to platforms that were never designed to meet such requirements.

The Compuware letter goes on to state: In 2018, the mainframe is still the world’s most reliable, performant, and securable platform, providing the lowest cost high-transaction system of record. Regarding COBOL, it notes that since 2017 the IBM z14 has supported COBOL V6.2, which is optimized bi-monthly.

Finally, about attracting new COBOL workers: COBOL is as easy to work with as any other language. In fact, open source Zowe has demonstrated appeal to young techies, providing solutions for development and operations teams to securely manage, control, script, and develop on the mainframe like any other cloud platform. What don’t they get?

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at technologywriter.com.

Z Acceptance Grows in BMC 2018 Survey

September 27, 2018

Did Zowe, introduced publicly just a few weeks ago, arrive in the nick of time, like the cavalry rescuing the mainframe from an aging workforce? In the latest BMC annual mainframe survey, released in mid-September, 95% of millennial respondents are positive about the mainframe’s long-term prospects for supporting new and legacy applications. And 63% of respondents were under the age of 50, up ten points from the previous year.

The mainframe veterans, those with 30 or even 40 years of experience, are finally moving out. DancingDinosaur itself has been writing about the mainframe for about 35 years. With two recently married daughters, even a hint of a grandchild on the way will be the signal for me to stop. In the meantime, read on.

Quite interesting in the BMC survey were the very high numbers of executives who believe in the long-term viability of the mainframe. More interesting to DancingDinosaur, however, was the interest in and willingness to use newer mainframe technology like Linux and Java, which are not exactly new arrivals to the mainframe world; as we know, change takes time.

For example, 28% of respondents cited as a strength the availability of new technology on the mainframe and their high level of confidence in that new technology. And this was before word got out about Zowe and what it could do to expand mainframe development. A little over a quarter of the respondents also cited using legacy apps to create new apps. Organizations are finally waking up to leveraging mainframe assets.

Also interesting was that both executives and technical staff cite application modernization among the top priorities. No complaints there. Similarly, BMC notes executive perception of the mainframe as a long-term solution is the highest in three years, a six point increase over 2016! While cost still remains a concern, BMC continues, the relative merits of the Z outweigh the costs and this perception continues to shift positively year after year.

The mainframe has regularly been slammed over the years as too costly. Yet IBM has steadily lowered the cost of the mainframe in terms of price performance. Now IBM is talking about applying AI to boost the efficiency, management, and operation of the mainframe data center.

This past May Gartner published a report confirming the value gains of the latest z14 and LinuxONE machines: the z14 ZR1 delivers an approximately 13% total capacity improvement over the z13’s maximum capacity for traditional z/OS environments. This is due to an estimated 10% boost in processor performance, as well as system design enhancements that improve the multiprocessor ratio (roughly, a 10% per-processor gain compounded with about a 3% multiprocessor-ratio gain: 1.10 × 1.03 ≈ 1.13). In the same report Gartner recommends including IBM’s LinuxONE Rockhopper II in RFPs for highly scalable, highly secure, Linux-based server solutions.

Several broad trends are coming together to feed the growing positive feelings the mainframe has experienced in recent years as revealed in the latest survey responses. “Absolute security and 24×7 availability have never been more important than now,” observes BMC’s John McKenny, VP of Strategy for ZSolutions Optimization. Here the Z itself plays a big part with pervasive encryption and secure containers.

Other trends, particularly digitization and mobility are “placing incredible pressure on both IT and mainframes to manage a greater volume, variety, and velocity of transactions and data, with workloads becoming more volatile and unpredictable,” said Bill Miller, president of ZSolutions at BMC. The latest BMC mainframe survey confirms executive and IT concerns in that area and the mainframe as an increasingly preferred response.

Bottom line: expect the mainframe to hang around for another decade or two at least. Long before then, DancingDinosaur will be a dithering grandfather playing with grandchildren and unable to get myself off the floor.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog. See more of his work at technologywriter.com.

Attract Young Techies to the Z

September 14, 2018

A decade ago DancingDinosaur was at a major IBM mainframe event, looked around at the analysts milling about, and noticed all the gray hair and balding heads, very few women, and, worse, few who appeared to be under 40; not exactly a crowd that would excite young male computer geeks. At the IBM introduction of the Z it had become even worse: more gray or balding heads, mine included, and none of the few Z professional female analysts I knew under 40 were there at all.

millions of young eager to join the workforce (Image by © Reuters/CORBIS)

An IBM analyst relations person agreed, noting that she was under pressure from IBM to get some young techies at Z events.  Sounded like Mission Impossible to me. But my thinking has changed in the last couple of weeks. A couple of discussions with 20-something techies suggested that Zowe has the potential to be a game changer as far as young techies are concerned.

DancingDinosaur covered Zowe two weeks ago here. It represents the first open source framework for z/OS. As such it provides solutions for development and operations teams to securely manage, control, script, and develop on the mainframe like any other cloud platform.

Or, to put it another way, with Zowe, IBM and partners CA Technologies and Rocket Software are enabling users to access z/OS using a new open-source framework. Zowe, more than anything before it, brings together generations of systems that were not designed to handle global networks of sensors and devices. Now, decades after IBM brought Linux to the mainframe, IBM, CA, and Rocket Software are introducing Zowe as a new open-source software framework that bridges the divide between modern challenges like IoT and the mainframe.

Says Sean Grady, a young (under 30) software engineer at Rocket Software: “Zowe to me is really cool, the first time I could have a sustained mainframe conversation with my peers.” Their first reactions were really cynical, he recalls. Zowe changed that. “My peers know Linux tools really well,” he notes.

The mainframe was perceived as a separate thing, something his peers couldn’t touch, he added. But Linux is something they know really well, so through Zowe the mainframe offers tools they know and like. Suddenly, the mainframe is no longer a separate, alien world but a familiar place. They can do the kind of work they like to do, in the way they like to do it, using familiar tools.

And they are well paid, much better than they can get coding here-and-gone mobile apps for some startup. Grady reports his starting offers ran up to $85k, not bad for a guy just out of college. And with a few years of experience now you can bet he’s doing a lot better than that.

The point of Zowe is to enable any developer, but especially new developers who don’t know or care about the mainframe, to manage, control, script, and develop on the mainframe like any other cloud platform. Additionally, Zowe allows teams to use the same familiar, industry-standard, open-source tools they already know to access mainframe resources and services.

The mainframe is older than many of the programmers IBM hopes Zowe will attract. But it opens new possibilities for mainframe shops desperately needing the new mission-critical applications customers are clamoring for. Already it appears ready to radically reduce the learning curve for the next generation.

Initial open source Zowe modules will include an extensible z/OS framework that provides new APIs and z/OS REST services to transform enterprise tools and DevOps processes that can incorporate new technology, languages, and workflows. It also will include a unifying workspace providing a browser-based desktop app container that can host both traditional and modern user experiences and is extensible via the latest web toolkits. The framework will also incorporate an interactive and scriptable command-line interface that enables new ways to integrate z/OS in cloud and distributed environments.

These modules represent just the start. More will be developed over time, enabling development teams to manage and develop on the mainframe like any other cloud platform. Additionally, the modules reduce risk and cost by allowing teams to use familiar, industry-standard, open source tools that can accelerate mainframe integration into their enterprise DevOps initiatives. Just use Zowe to entice new mainframe talent.
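To make that concrete, here is a sketch of the style of access Zowe opens up: plain REST calls any developer can script. The endpoint shown is z/OSMF’s data set listing service, which Zowe’s API Mediation Layer fronts; the host and credentials are placeholders.

```python
# Sketch: list data sets over REST, the kind of access Zowe promotes.
# Host, port, and credentials are placeholders, not a real system.
import base64
import json
import urllib.request

HOST = "https://zosmf.example.com"                    # placeholder gateway
CREDS = base64.b64encode(b"ibmuser:secret").decode()  # placeholder login

req = urllib.request.Request(
    f"{HOST}/zosmf/restfiles/ds?dslevel=IBMUSER.*",
    headers={
        "Authorization": f"Basic {CREDS}",
        "X-CSRF-ZOSMF-HEADER": "true",  # header z/OSMF requires on REST calls
    },
)
with urllib.request.urlopen(req) as resp:
    for item in json.load(resp)["items"]:
        print(item["dsname"])
```

Nothing in that snippet looks like green-screen 3270 work, which is exactly the point for the young techies Grady describes.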

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog. See more of his work at technologywriter.com.

Compuware Expedites DevOps on Z

July 13, 2018

Compuware continues its quarterly introduction of new capabilities for the mainframe, a process that has been going on for several years now. The latest advance, Topaz for Enterprise Data, promises to expedite the way DevOps teams access the data they need while reducing complexity, labor, and risk through extraction, masking, and visualization of mainframe data. The result: the ability to leverage all available data sources to deliver high-value apps and analytics fast.

Topaz for Enterprise Data expedites data access for DevOps

The days when mainframe shops could take a methodical and deliberate approach—painstakingly slow—to accessing enterprise data have long passed. Your DevOps teams need to dig the value out of that data and put it into the hands of managers and LOB teams fast, in hours, maybe just minutes so they can jump on even the most fleeting opportunities.

Fast, streamlined access to high-value data has become an urgent concern as businesses seek competitive advantages in a digital economy while fulfilling increasingly stringent compliance requirements. Topaz for Enterprise Data enables developers, QA staff, operations teams, and data scientists at all skill and experience levels to ensure they have immediate, secure access to the data they need, when they need it, in any format required.

It starts with data masking, which in just the last few months has become a critical concern with the rollout of GDPR across the EU. GDPR grants considerable protections and options to the people whose data your systems have been collecting. Now you need to protect personally identifiable information (PII) and comply with regulatory mandates like GDPR and whatever similar regs will come here.

Regs like these don’t apply just to your primary transaction data. You need data masking with all your data, especially when large, diverse datasets of high business value residing on the mainframe contain sensitive business or personal information.

This isn’t going to go away anytime soon, so large enterprises must start transferring responsibility for the stewardship of this data to the next generation of DevOps folks, who will be stuck with it. You can bet somebody will step forward and say, “you have to change every instance of my data that contains this or that.” Even the most expensive lawyers will not be able to blunt such requests. Better to have the tools in place to respond quickly and easily.
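Compuware hasn’t published how Topaz masks data, but the core requirement it cites, masking that preserves essential data relationships, can be illustrated with deterministic masking: keyed-hash each value so the same input always yields the same surrogate, and joins across files still line up. A sketch of the technique, not Compuware’s implementation:

```python
# Deterministic masking illustration (not Compuware's algorithm): the same
# sensitive value always maps to the same masked token, so relationships
# between records survive masking. The key is a placeholder secret.
import hmac
import hashlib

MASK_KEY = b"rotate-and-vault-this-key"  # placeholder; keep real keys secret

def mask(value: str, digits: int = 9) -> str:
    """Map a sensitive value to a stable, same-shaped numeric surrogate."""
    digest = hmac.new(MASK_KEY, value.encode(), hashlib.sha256).hexdigest()
    return str(int(digest, 16) % 10**digits).zfill(digits)

# The same SSN masks identically wherever it appears, preserving joins,
# and without MASK_KEY the original cannot be recovered from the token.
assert mask("123-45-6789") == mask("123-45-6789")
print(mask("123-45-6789"))
```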

The newest tool, according to Compuware, is Topaz for Enterprise Data. It will enable even a mainframe-inexperienced DevOps team to:

  • Readily understand relationships between data even when they lack direct familiarity with specific data types or applications, to ensure data integrity and resulting code quality.
  • Quickly generate data for testing, training, or business analytics purposes that properly and accurately represents actual production data.
  • Ensure that any sensitive business or personal data extracted from production is properly masked for privacy and compliance purposes, while preserving essential data relationships and characteristics.
  • Convert file types as required.

Topaz users can access all these capabilities from within Topaz’s familiar Eclipse development environment, eliminating the need to learn yet another new and complicated tool.

Those who experience it apparently like what they find. Noted Lynn Farley, Manager of Data Management at TCF Bank: “Testing with production-like obfuscated data helps us develop and deliver better quality applications, as well as remain compliant with data privacy requirements, and Topaz provides our developers with a way to implement data privacy rules to mask multiple data types across platforms and with consistent results.”

Rich Ptak, principal of IT analyst firm Ptak Associates similarly observed: “Leveraging a modern interface for fast, simple access to data for testing and other purposes is critical to digital agility,” adding it “resolves the long-standing challenge of rapidly getting value from the reams of data in disparate sources and formats that are critical to DevOps and continuous improvement.”

“The wealth of data that should give large enterprises a major competitive advantage in the digital economy often instead becomes a hindrance due to the complexity of sourcing across platforms, databases, and formats,” said Chris O’Malley, CEO of Compuware. As DancingDinosaur sees it, by removing such obstacles Compuware reduces the friction between enterprise data and business advantage.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog. See more of his work at technologywriter.com and here.

IBM Preps Z World for GDPR

June 1, 2018

Remember Y2K? That was when calendars rolled over from 1999 to 2000. It was hyped as an event that would screw up computers worldwide. Sorry, planes did not fall out of the sky overnight (or at all), elevators didn’t plummet to the basement, and hospitals and banks did not cease functioning. DancingDinosaur did OK writing white papers on preparing for Y2K. Maybe nothing bad happened because companies read papers like those and worked on changing their date fields.

Starting May 25, 2018, GDPR became the new Y2K. GDPR, the EU’s General Data Protection Regulation, an overhaul of existing European data protection rules, promises to strengthen and unify those laws for EU citizens and for organizations anywhere that collect and exchange data involving those citizens. That covers probably most of the readers of DancingDinosaur. GDPR went into effect at the end of May and generated a firestorm of trade and business press, but nothing near what Y2K did. The primary GDPR objectives are to give citizens control over their personal data and to simplify the regulatory environment for international business.

According to Bob Yelland, author of How it Works: GDPR, a Little Bee Book, 50% of global companies say they will struggle to meet the rules set out by Europe unless they make significant changes to how they operate, and this may lead many companies to appoint a Data Protection Officer, which the rules recommend. Doesn’t it feel a little like Y2K again?

The Economist in April wrote: “After years of deliberation on how best to protect personal data, the EC is imposing a set of tough rules. These are designed to improve how data are stored and used by giving more control to individuals over their information and by obliging companies to handle what data they have more carefully.”

As you would expect, IBM created a GDPR framework with five phases to help organizations achieve readiness: Assess, Design, Transform, Operate, and Conform. The goal of the framework is to help organizations manage security and privacy effectively in order to reduce risks and therefore avoid incidents.

DancingDinosaur is not an expert on GDPR in any sense, but from reading GDPR documents, the Z with its pervasive encryption and automated secure key management should eliminate many concerns. The rest probably can be handled by following good Z data center policy and practices.

There is only one area of GDPR, however, that may be foreign to North American organizations—the parts about respecting and protecting the private data of individuals.

As The Economist wrote: GDPR obliges organizations to create an inventory of the personal data they hold. With digital storage becoming ever cheaper, companies often keep hundreds of databases, many of which are long forgotten. To comply with the new regulation, firms have to think harder about data hygiene. This is something North American companies probably have not thought enough about.

IBM recommends you start by assessing your current data privacy situation under all of the GDPR provisions; in particular, discover where protected information is located in your enterprise. Under GDPR, individuals have rights over consent and to access, correct, delete, and transfer their personal data. This will be new to most North American data centers, even the best managed Z data centers.
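What “discover where protected information is located” means in practice starts with something as crude as pattern-scanning your data stores. The sketch below, with invented paths and deliberately rough patterns, hints at that first step; real discovery tooling goes much further.

```python
# Crude PII discovery sketch: scan text extracts for common identifier
# patterns. Paths and patterns are illustrative; real GDPR discovery
# tools go far beyond regular expressions.
import re
from pathlib import Path

PII_PATTERNS = {
    "email":      re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "iban-like":  re.compile(r"\b[A-Z]{2}\d{2}[A-Z0-9]{11,30}\b"),
    "phone-like": re.compile(r"\b\+?\d[\d ()-]{8,}\d\b"),
}

def scan(root: str) -> None:
    """Report counts of PII-looking values per file under root."""
    for path in Path(root).rglob("*.txt"):
        text = path.read_text(errors="ignore")
        for label, pattern in PII_PATTERNS.items():
            hits = len(pattern.findall(text))
            if hits:
                print(f"{path}: {hits} {label} value(s)")

scan("/data/extracts")  # placeholder location for unloaded datasets
```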

Then, IBM advises, assess the current state of your security practices, identify gaps, and design security controls to plug those gaps. In the process find and prioritize security vulnerabilities, as well as any personal data assets and affected systems. Again, you will want to design appropriate controls. If this starts sounding a little too complicated just turn it over to IBM or any of the handful of other vendors who are racing GDPR readiness services into the market. IBM offers Data Privacy Consulting Services along with a GDPR readiness assessment.

Of course, you can also just outsource the whole effort to IBM or others and let them work the five-phase framework on your behalf.

GDPR is not going to be fun, especially the obligation to comply with each individual’s rights regarding their data. DancingDinosaur suspects it could even get downright ugly.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Please follow DancingDinosaur on Twitter, @mainframeblog. See more of his IT writing at technologywriter.com and here.

High Cost of Ignoring Z’s Pervasive Encryption

May 17, 2018

That cost was spelled out at IBM’s Think conference this past spring. David Bruce, who leads IBM’s strategies for security on IBM Z and LinuxONE, writes that data breaches are expensive, costing $3.6 million on average, and hoping to avoid one by doing business as usual is a bad bet. Bruce reports breaches are increasingly likely: an organization has a 28 percent chance of being breached in the next 24 months. You can find Bruce’s comments on security and pervasive encryption here.

9 million data records were compromised in 2015

Were any of those 9 million records from your organization? Did you end up on the front page of the newspaper? To stay out of the data breach headlines, organizations require security solutions that protect enterprise and customer data at minimal cost and effort, Bruce observes.

Encryption is the preferred solution, but it is costly, cumbersome, labor-intensive, and hit-or-miss. It is hit-or-miss because the overhead involved forces organizations to choose what to encrypt and what to skip. You have to painstakingly classify the data in terms of risk, which takes time and only adds to the costs. Outside of critical revenue transactions or key intellectual property—no brainers—you will invariably choose wrong and miss something you will regret when it shows up on the front page of the New York Times.

Adding to the cost is the compliance runaround. Auditors are scheduled to visit or maybe they aren’t even scheduled and just drop in; you now have to drop whatever your staff was hoping to do and gather the necessary documentation to prove your data is safe and secure.  Do you really need this? Life is too short as it is.

You really want to put an end to the entire security compliance runaround and all the headaches it entails. But more than that, you want protected, secure data; all data, all the time.  When someone from a ransomware operation calls asking for hundreds or thousands of dollars to get your data back you can laugh and hang up the phone. That’s what Bruce means when he talks about pervasive encryption. All your data is safely encrypted with its keys protected from the moment it is created until the moment it is destroyed by you. And you don’t have to lift a finger; the Z does it all.

And that embarrassing news item about a data breach won’t happen to you either. Most important of all, customers will never see one and get upset.

In fact, at Think, Forrester discussed the customer-obsessed approach that leading organizations are adopting to spur growth. To obsess over customers, explained Bruce, means taking great care to protect the customer’s sensitive data, which is the cornerstone of Forrester’s customer-obsessed zero trust security framework. The framework includes, among other security elements, encryption of all data across the enterprise. With the Z’s built-in pervasive encryption and automatic key protection enabled, you can ignore the rest of Forrester’s framework.

Pervasive encryption, unique to Z, addresses the security challenges while helping you thrive in this age of the customer. At Think, Michael Jordan, IBM Distinguished Engineer for IBM Z Security, detailed how pervasive encryption represents a paradigm shift in security, reported Bruce. Previously, selective field-level encryption was the only feasible way to secure data, but it was time-, cost-, and resource-intensive – and it left large portions of data unsecured.

Pervasive encryption, however, offers a solution capable of encrypting data in bulk, making it possible and practical to encrypt all data associated with an application, database, and cloud service – whether on premises or in the cloud, at-rest or in-flight. This approach also simplifies compliance by eliminating the need to demonstrate compliance at the field level. Multiple layers of encryption – from disk and tape up through applications – provide the strongest possible defense against security breaches. The high levels of security enabled by pervasive encryption help you promote customer confidence by protecting their data and privacy.
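On the Z this happens in hardware (CPACF), with protected keys that never appear in the clear in OS memory. The nearest everyday analogue, for readers who want to see the key-protection idea in code, is envelope encryption: every object gets its own data key, and data keys are stored only in wrapped form under a master key. A conceptual sketch using the third-party Python cryptography package, emphatically not how the Z implements it:

```python
# Envelope-encryption sketch to illustrate protected keys conceptually.
# NOT how Z pervasive encryption works; CPACF keeps protected keys out
# of OS memory entirely, and the master key would live in an HSM.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

master_key = AESGCM.generate_key(bit_length=256)  # HSM-held in real life

def encrypt_record(plaintext: bytes) -> dict:
    data_key = AESGCM.generate_key(bit_length=256)  # one key per object
    nonce_d, nonce_k = os.urandom(12), os.urandom(12)
    return {
        "ciphertext": AESGCM(data_key).encrypt(nonce_d, plaintext, None),
        # the data key is stored only in wrapped (encrypted) form:
        "wrapped_key": AESGCM(master_key).encrypt(nonce_k, data_key, None),
        "nonces": (nonce_d, nonce_k),
    }

def decrypt_record(rec: dict) -> bytes:
    nonce_d, nonce_k = rec["nonces"]
    data_key = AESGCM(master_key).decrypt(nonce_k, rec["wrapped_key"], None)
    return AESGCM(data_key).decrypt(nonce_d, rec["ciphertext"], None)

rec = encrypt_record(b"card=4111111111111111")
assert decrypt_record(rec) == b"card=4111111111111111"
```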

If you have a Z and have not enabled pervasive encryption, you are putting your customers and your organization at risk. Am curious, please drop me a note why.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Please follow DancingDinosaur on Twitter, @mainframeblog. See more of his IT writing at technologywriter.com and here.

Compuware Tackles Mainframe Workforce Attrition and Batch Processing

May 4, 2018

While IBM works furiously to deliver quantum computing and expand AI and blockchain into just about everything, many DancingDinosaur readers are still wrestling with the traditional headaches: boosting the quality and efficiency of mainframe operations and optimizing the most traditional mainframe activity there is, batch processing. It would be nice if quantum computing could handle multiple batch operations simultaneously, but that’s not high on IBM’s list of quantum priorities.

So Compuware is stepping up, as it has been doing quarterly, by delivering new tools to expedite and facilitate conventional mainframe processes. Its zAdviser promises actionable analytic insight to continuously improve quality, velocity, and efficiency on the mainframe, while its ThruPut Manager enables next-gen IT staff to optimize mainframe batch execution through new, visually intuitive workload scheduling.

zAdviser captures data about developers’ behaviors

zAdviser uses machine learning to continuously measure and improve an organization’s mainframe DevOps processes and development outcomes. Based on key performance indicators (KPIs), zAdviser measures application quality, as well as development speed and the efficiency of a development team. The result: managers can now make evidence-based decisions in support of their continuous improvement efforts.

The new tool leverages a set of analytic models that uncover correlations between mainframe developer behaviors and mainframe DevOps KPIs. These correlations represent the best available empirical evidence regarding the impact of process, training and tooling decisions on digital business outcomes. Compuware is offering zAdviser free to customers on current maintenance.
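Compuware hasn’t disclosed zAdviser’s models, but “uncover correlations between developer behaviors and KPIs” at its simplest reduces to a correlation matrix over per-team measurements. The field names below are invented for illustration:

```python
# Simplest possible version of "correlate developer behaviors with DevOps
# KPIs": a Pearson correlation over per-team measurements. Field names
# and values are invented; zAdviser's actual schema is not public.
import pandas as pd

teams = pd.DataFrame({
    "debug_sessions_per_dev": [12, 30, 8, 22, 15],   # behavior
    "unit_test_runs_per_dev": [40, 11, 55, 18, 33],  # behavior
    "escaped_defects":        [3, 9, 1, 7, 4],       # KPI: quality
    "deploys_per_month":      [6, 2, 9, 3, 5],       # KPI: velocity
})

# Correlations hint at which behaviors track which outcomes.
print(teams.corr().round(2))
```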

Long mainframe software backlogs are no longer acceptable. Improvement in mainframe DevOps has become an urgent imperative for large enterprises that find themselves even more dependent on mainframe applications, not less. According to a recent Forrester Consulting study commissioned by Compuware, 57 percent of enterprises with a mainframe run more than half of their business-critical workloads on the mainframe. That percentage is expected to increase to 64 percent by 2019, while at the same time enterprises are failing to replace the expert mainframe workforce they have lost to attrition. Hence the need for modern, automated, intelligent tools to speed the learning curve for workers groomed on Python or Node.js.

Meanwhile, IBM hasn’t exactly been twiddling its thumbs in regard to DevOps analytics for the Z. Its zAware delivers a self-contained firmware IT analytics offering that helps systems and operations professionals rapidly identify problematic messages and unusual system behavior in near real time, which systems administrators can use to take corrective actions.

ThruPut Manager brings a new web interface that offers visually intuitive insight for mainframe staff, especially new staff, into how batch jobs are being initiated and executed—as well as the impact of those jobs on mainframe software licensing costs.

By implementing ThruPut Manager, Compuware explains, enterprises can better safeguard the performance of both batch and non-batch applications while avoiding the significant adverse economic impact of preventable spikes in utilization as measured by Rolling 4-Hour Averages (R4HA). Reducing the R4HA is a key way data centers can contain mainframe costs.
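For readers new to the metric: the R4HA is the rolling average of MSU consumption over every four-hour window, and sub-capacity software pricing keys off the month’s peak window. A toy calculation, assuming one utilization sample every 5 minutes (so 48 samples per window):

```python
# Toy R4HA calculation: sub-capacity pricing bills on the peak rolling
# 4-hour average of MSU consumption. Assumes 5-minute samples, so a
# 4-hour window holds 48 samples. Values are made up.
from collections import deque

def peak_r4ha(msu_samples, samples_per_window=48):
    window = deque(maxlen=samples_per_window)
    peak = 0.0
    for msu in msu_samples:
        window.append(msu)
        if len(window) == samples_per_window:
            peak = max(peak, sum(window) / samples_per_window)
    return peak

# Smoothing one sustained batch spike can lower the whole month's bill:
flat  = [100] * 480
spiky = [100] * 200 + [400] * 48 + [100] * 232
print(peak_r4ha(flat), peak_r4ha(spiky))  # 100.0 vs. 400.0
```

This is why schedulers that spread batch work away from the peak window matter: the bill tracks the worst four hours, not the average day.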

More importantly, with the new ThruPut Manager, enterprises can successfully transfer batch management responsibilities to the next generation of IT staff with far less hands-on platform experience—without exposing themselves to related risks such as missed batch execution deadlines, missed SLAs, and excess costs.

With these new releases, Compuware is providing a way to reduce the mainframe software backlog—the long growing complaint that mainframe shops cannot deliver new requested functionality fast enough—while it offers a way to replace the attrition among aging mainframe staff with young staff who don’t have years of mainframe experience to fall back on. And if the new tools lower some mainframe costs however modestly in the process, no one but IBM will complain.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog. See more of his work at technologywriter.com and here.

IBM Introduces Skinny Z Systems

April 13, 2018

Early this week IBM unveiled two miniaturized mainframe models, dubbed skinny mainframes, it said are easier to deploy in a public or private cloud facility than their more traditional, much bulkier predecessors. Relying on all their design tricks, IBM engineers managed to pack each machine into a standard 19-inch rack with space to spare, which can be used for additional components.

Z14 LinuxONE Rockhopper II, 19-inch rack

The first new mainframe introduced this week is the z14 model ZR1; you can expect subsequent models to increment the model numbering. The second new machine is the LinuxONE Rockhopper II, also in a 19-inch rack.

In the past, about a year after IBM introduced a new mainframe, say the z10, it introduced what it called a Business Class (BC) version. The BC machines were less richly configured and less expandable but delivered comparable performance at lower capacity and a distinctly lower price.

In a Q&A analyst session IBM insisted the new machines would be priced noticeably lower, as were the BC-class machines of the past. But these are not simply reissues of the old BC machines. Instead, they are intended to attract a new group of users who face new challenges. As such, they come cloud-ready: the 19-inch industry standard, single-frame design is intended for easy placement into existing cloud data centers, alongside other components, and into private cloud environments.

The company, said Ross Mauri, General Manager IBM Z, is targeting the new machines toward clients seeking robust security with pervasive encryption, cloud capabilities, and powerful analytics through machine learning. Not only does this increase security and capability in on-premises and hybrid cloud environments for clients, he continued; IBM will also deploy the new systems in IBM public cloud data centers as the company focuses on enhancing security and performance for increasingly intensive data loads.

In terms of security, the new machines will be hard to beat. IBM reports them capable of processing over 850 million fully encrypted transactions a day on a single system. Nor do the new mainframes require special space, cooling, or energy. They do, however, still provide IBM’s pervasive encryption and Secure Service Container technology, which secures data serving at massive scale.

Mauri continued: the new IBM Z and IBM LinuxONE offerings also bring significant increases in capacity, performance, memory, and cache across nearly all aspects of the system. A complete system redesign delivers this capacity growth in 40 percent less space, standardized to be deployed in any data center. The z14 ZR1 can be the foundation for an IBM Cloud Private solution, creating a data-center-in-a-box by co-locating storage, networking, and other elements in the same physical frame as the mainframe server. This is where you can utilize the extra space included in the 19-inch rack.

The LinuxONE Rockhopper II can also accommodate a Docker-certified infrastructure for Docker EE with integrated management and scale tested up to 330,000 Docker containers, allowing developers to build high-performance applications and embrace a micro-services architecture.

The 19-inch rack, however, comes with tradeoffs, notes Timothy Green, writing in The Motley Fool. Yes, it takes up 40% less floor space than the full-size Z14, but it accommodates only 30 processor cores, far below the 170 cores supported by a full-size Z14, which fills a 24-inch rack. Both new systems can handle around 850 million fully encrypted transactions per day, a fraction of the Z14’s full capacity. But not every company needs the full performance and capacity of the traditional mainframe. For companies that don’t, notes Green, or that have previously balked at the high price or massive footprint of full mainframe systems, these smaller mainframes may be just what it takes to bring them to the Z. Now IBM needs to come through with the advantageous pricing it insisted it would offer.

The new skinny mainframes are just the latest in IBM’s continuing efforts to keep the mainframe relevant. Those efforts began over a decade ago with porting Linux to the mainframe and continued with Hadoop, blockchain, and containers. Machine learning and deep learning are coming right along. The only question for DancingDinosaur is when IBM engineers will figure out how to put quantum computing on the Z and squeeze it into customers’ public or private cloud environments.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog. See more of his work at technologywriter.com and here.

