Posts Tagged ‘API economy’

IBM Boosts DevOps with ADDI on Z

February 9, 2018

IBM’s Application Discovery and Delivery Intelligence (ADDI) is an analytical platform for application modernization. It uses cognitive technologies to analyze mainframe applications so you can quickly discover and understand interdependencies and the impacts of change. You can use this intelligence to transform and renew these applications faster than ever, capitalize on time-tested mainframe code to engage with the API economy, and accelerate application transformation across your IBM Z hybrid cloud environment.

Formerly, ADDI was known as EZSource. Back then EZSource was designed to expedite digital transformations by unlocking core business logic and apps. Specifically, it enabled the IT team to pinpoint specific mainframe code in preparation for leveraging IT through a hybrid cloud strategy. In effect, it enabled understanding of business-critical assets in preparation for deployment of a z-centered hybrid cloud. It also enabled enterprise DevOps, which was necessary to keep up with the pace of change overtaking existing business processes.

This wasn’t easy when EZSource initially arrived, and it still isn’t, although the intelligence built into ADDI makes it easier now. Originally it was intended to help the mainframe data center team to:

  • Identify API candidates to play in the API economy
  • Embrace microservices to deliver versatile apps fast
  • Identify code quality concerns, including dead code, to improve reliability and maintainability
  • Mitigate risk of change through understanding code, data, and schedule interdependencies
  • Aid in sizing the change effort
  • Automate documentation to improve understanding
  • Reduce the learning curve as new people come onboard
  • Add application understanding to DevOps lifecycle information to identify opportunities for work optimization

Today, IBM describes Application Discovery and Delivery Intelligence (ADDI), the follow-up to EZSource, as an analytical platform for application modernization. It uses cognitive technologies to analyze mainframe applications so your team can quickly discover and understand interdependencies and the impacts of any change. In theory, you should be able to use this intelligence to transform and renew these applications more efficiently and productively. In short, it should let you leverage time-tested mainframe code to engage with the API economy and accelerate application transformation across your IBM Z and hybrid cloud environment.

More specifically, it promises to enable your team to analyze a broad range of IBM and non-IBM programming languages, databases, workload schedulers, and environments. Enterprise application portfolios were built over decades using an ever-evolving set of technologies, so you need a tool with broad support, such as ADDI, to truly understand the relationships between application components and accurately determine the impacts of potential changes.

In practice, it integrates with mainframe environments and tools via a z/OS agent to automatically synchronize application changes. According to IBM, if your application analysis isn’t kept synchronized with your developers’ latest changes, it quickly goes out of date and you risk missing critical changes.

In addition, it provides visual analysis integrated with leading IDEs. Data center managers are petrified of changing applications that still work, fearing they will inadvertently break them or slow performance. When modifying complex applications, you need to be able to quickly navigate the dependencies between application components and drill down to see relevant details. Once you understand the code, you can modify it at much lower risk. The integration between ADDI and IBM Developer for z (IDz) combines the leading mainframe IDE with the application understanding and analytics capabilities you need to safely and efficiently modify the code.

It also, IBM continues, cognitively optimizes your test suites. When you have a large code base to maintain and many tests to run, you need to run those tests as efficiently as possible. ADDI correlates code coverage data and code changes with test execution records so you can identify which regression tests are most critical, optimizing time and resources while reducing risk. It exposes poorly tested or complex code and empowers test teams with cognitive insights that turn awareness of trends into mitigation of future risks.
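IBM has not published ADDI’s internals, but the core idea behind coverage-based test selection can be sketched in a few lines of JavaScript. The test names, module names, and data below are hypothetical, invented purely for illustration:

```javascript
// Hypothetical sketch: pick the regression tests worth running first by
// correlating which tests cover which modules with the modules just changed.
// (Illustrative only; ADDI's actual analysis is far richer.)

// Coverage records: test name -> modules it exercises (made-up data).
const coverage = {
  testPayroll: ['PAYCALC', 'TAXTABLE'],
  testBilling: ['BILLCORE', 'TAXTABLE'],
  testReports: ['RPTGEN'],
};

// Select every test whose covered modules intersect the changed set.
function impactedTests(coverage, changedModules) {
  const changed = new Set(changedModules);
  return Object.keys(coverage)
    .filter(test => coverage[test].some(mod => changed.has(mod)));
}

console.log(impactedTests(coverage, ['TAXTABLE']));
// -> ['testPayroll', 'testBilling']
```

A change to one shared module immediately narrows the regression run to the two tests that exercise it, which is the time-and-resource win the paragraph above describes.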

Finally, ADDI intelligently identifies performance degradations before they hit production. It correlates runtime performance data with application discovery data and test data to quickly pinpoint performance degradation and narrow down the code artifacts to those that are relevant to the cause of bad performance. This enables early detection of performance issues and speeds resolution.

What’s the biggest benefit of ADDI on the Z? It enables your data center to play a central role in digital transformation, a phrase every C-level executive today intones like a holy mantra. More importantly, it will keep your mainframe relevant.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Please follow DancingDinosaur on Twitter, @mainframeblog. See more of his IT writing at technologywriter.com and here.

IBM Makes a Big Play for the API Economy with StrongLoop

September 25, 2015

APIs have become essential in connecting systems of engagement with the systems of record typically found on the IBM z System. That’s one reason why IBM earlier this month acquired StrongLoop, Inc., a software provider that helps developers connect enterprise applications to mobile, Internet of Things (IoT) and web applications in the cloud mainly through rapidly proliferating and changing APIs.  Take this as a key signal IBM intends to be a force in the emerging API economy. Its goal is to connect existing enterprise apps, data, and SOA services to new channels via APIs.

Courtesy: developer.IBM.com

Key to the acquisition is StrongLoop’s position as a leading provider of tools for Node.js, the server-side JavaScript runtime that has become a favorite among developers building API-driven applications. According to IBM, it intends to integrate Node.js capabilities from StrongLoop with its own software portfolio, which already includes MobileFirst and WebSphere, to help organizations better use enterprise data and conduct transactions whether in the cloud or on-premises.

These new capabilities, IBM continues, will enable organizations and developers to build scalable APIs and more easily connect existing back-end enterprise processes with front-end mobile, IoT, and web apps in an open hybrid cloud. Node.js is one of the fastest growing frameworks for creating and delivering APIs, in part because it is built on JavaScript, which shortens the learning curve for web developers.

Although Node.js is emerging as the standard for APIs and microservices, APIs still present challenges. These include the lack of an architected approach, limited scalability, multiple languages and point products, limited data connectors, and large, fragile monolithic applications.

Mainframe data centers, in particular, are sitting on proven software assets that beg to be broken out as microservices, to be combined and recombined into new apps for mobile and web contexts. As IoT ramps up, the demand for these APIs will skyrocket. And the mainframe data center will sit at the center of it all, possibly even becoming a revenue generator.

In response, StrongLoop brings API creation and lifecycle support plus back-end data connectors. It will integrate with IBM’s API management to create an API platform enabling polyglot run-times, integration, and API performance monitoring. It also will integrate with IBM’s MobileFirst Platform, WebSphere, and other products, such as Bluemix, to enable Node across the product portfolio. StrongLoop also brings Arc and its LoopBack framework, which together cover everything from visual API modeling to a process manager for scaling APIs, along with a security gateway. Together, StrongLoop Arc and IBM’s API Management can deliver the full API lifecycle. IBM also will incorporate select capabilities from StrongLoop into its IoT Foundation, a topic DancingDinosaur expects to take up in the future.

At the initial StrongLoop acquisition announcement Marie Wieck, general manager, Middleware, IBM Systems, alluded to the data center possibilities, as noted above: “Enterprises are focused on digital transformation to reach new channels, tap new business models, and personalize their engagement with clients. APIs are a critical ingredient.” The fast adoption of Node.js for rapidly creating APIs combined with IBM’s strength in Java and API management on the IBM cloud platform promises a winning strategy.

To make this even more accessible, IBM is adding Node.js to Bluemix, following a summer of enhancements to Bluemix covered here by DancingDinosaur just a few weeks ago. Java remains the leading language for web applications and transaction systems. Combining StrongLoop’s Node.js tools and services with IBM’s WebSphere and Java capabilities will help organizations bridge Java and Node.js development platforms, enabling enterprises to extract greater value from their application investments. Throw in integration on IBM Bluemix and the Java and Node.js communities will gain access to many other IBM and third-party services including access to mobile services, data analytics, and Watson, IBM’s crown cognitive computing jewel.


API Economy Comes to the IBM z System

June 11, 2015

What comes to mind when you hear (or read) about a RESTful IBM z System? Hint: it is not a mainframe that is loafing. To the contrary, a RESTful mainframe probably is busier than it has ever been, now running a slew of new apps, most likely mobile or social apps with REST APIs connecting to z/OS-based web services plus its usual workloads. Remember web services when SOA first came to the mainframe? They continue today behind the new mobile, cloud, social, and analytical workloads that are putting the spotlight on the mainframe.

Courtesy of IBM: travel fuels mobile activity

A variety of Edge2015 sessions, given by Asit Dan, chief architect, z Service API Management, and Glenn Anderson, IBM Lab Services and Training, put what the industry refers to as the emerging API economy in perspective. The z, it should come as no surprise, lies at the heart of this burgeoning API economy, not only handling transactions but also providing governance and management to the exploding API phenomenon. Check out IBM’s APIs for Dummies.

The difference between first generation SOA and today’s API economy lies in the new workloads—especially mobile and cloud—fueling the surging interest. The mobile device certainly is the fastest growing platform and will likely become the largest platform soon if it is not already, surpassing desktop and laptop systems.

SOA efforts initially focused on the capabilities of the providers of services, noted Dan, particularly the development, run-time invocation, and management of services. The API economy, on the other hand, focuses on the consumption of these services. It really aims to facilitate the efforts of application developers (internal developers and external business partners) who must code their apps for access to existing and new API-enabled services.

One goal of an enterprise API effort is to access already deployed services, such as z-based CICS services or those of a partner. Maybe a more important goal, especially where the z is involved, is to drive use of mainframe software assets by customers, particularly mobile customers. The API effort not only improves customer service and satisfaction but could also drive added revenue. (Have you ever fantasized about the z as a direct revenue generator?)

This calls, however, for a new set of interfaces. As Dan notes in a recent piece, when APIs for accessing these assets are defined using well-known standards such as web services and Representational State Transfer (REST) with JSON (JavaScript Object Notation), and published via an easily accessible catalog, it becomes efficient to subscribe to APIs, obtain permissions, and build new applications. Access to the APIs can then be controlled and tracked during run-time invocations (and even metered where revenue generation is the goal).
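The consumption side Dan describes can be sketched briefly. The endpoint, payload, and field names below are hypothetical stand-ins for what a cataloged z-hosted service might expose:

```javascript
// Hypothetical: a cataloged REST endpoint fronting a z-based service.
const endpoint = 'https://zhost.example.com/api/v1/policies/12345';

// A sample JSON response such a service might return (invented data).
const rawResponse = '{"policyId":"12345","status":"active","premium":432.10}';

// The whole point of REST + JSON: any web or mobile client parses it directly.
const policy = JSON.parse(rawResponse);
console.log(`${policy.policyId}: ${policy.status}`);
// -> 12345: active

// In a real client the call itself would be a single HTTP GET, with the
// catalog's gateway checking the key, which is where access gets controlled,
// tracked, and metered:
//   fetch(endpoint, { headers: { 'X-API-Key': '...' } }).then(r => r.json())
```

The developer never needs to know the service runs on CICS behind the scenes; the published API and its JSON contract are the entire interface.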

Now the API economy can morph into a commercial exchange of business functions, capabilities, and competencies as services using web APIs, noted Glenn Anderson at Edge2015. In-house business functions running on the z can evolve into an API as-a-service delivery vehicle, which amounts to another revenue stream for the mainframe data center.

The API economy often is associated with the concept of containers. Container technology provides a simplified way to make applications more mobile in a hybrid cloud, Anderson explained, and brings some distinct advantages. Specifically, containers are much smaller than virtual machines and provide more freedom in the placement of workloads in a cloud (private, public, hybrid) environment. Container technology is being integrated into OpenStack, which is supported on the z through IBM Cloud Manager. Docker is the best-known container technology, and it works with Linux on z.
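Anderson’s portability point can be made concrete with a Dockerfile. This is a hypothetical sketch, assuming a Node.js API app and a multi-architecture base image; it is not an IBM-published recipe:

```dockerfile
# Hypothetical sketch: packaging a small Node.js API to run in a container
# on Linux on z. Image name and app files are illustrative assumptions.
# On an s390x (Linux on z) host, Docker pulls the s390x variant of the image.
FROM node:alpine
WORKDIR /app
COPY package.json .
RUN npm install --production
COPY . .
EXPOSE 3000
CMD ["node", "server.js"]
```

Built with `docker build -t myapi .` and run with `docker run -p 3000:3000 myapi`, the same file works on an x86 laptop or a Linux on z host, given matching base images, which is exactly the workload-placement freedom described above.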

With SOA, web services, REST, JSON, OpenStack, and Docker all z capable, a mainframe data center can fully participate in the mobile, cloud, and API economy. BTW, POWER servers can play the API, OpenStack, and Docker game too. Even Watson can participate in the API economy through IBM’s early March acquisition of AlchemyAPI, a provider of scalable cognitive computing API services. The acquisition will drive the API economy into cognitive computing as well. Welcome to the mainframe API economy.

