Posts Tagged ‘Internet of Things (IoT)’

5G Joins Edge Technology and Hybrid Multicloud

May 11, 2020

At IBM's virtual Think Conference during the first week of May, the company made a big play for edge computing and 5G together.

From connected vehicles to intelligent manufacturing equipment, devices are generating unprecedented volumes of data at the edge. IBM is convinced those volumes will compound as 5G networks increase the number of connected mobile devices.

z15 T02 and the LinuxONE III LT2

Edge computing and 5G networks promise to reduce latency while improving speed, reliability, and processing. This will deliver faster and more comprehensive data analysis, deeper insights, faster response times, and improved experiences for employees, customers, and their customers.

First gaining prominence with the Internet of Things (IoT) a few years back, edge computing is defined by IBM as a distributed computing framework that brings enterprise applications closer to where data is created and often remains, so it can be processed where decisions are made and actions are taken.

5G stands for the Fifth Generation of cellular wireless technology. Beyond higher speed and reduced latency, 5G standards will have a much higher connection density, allowing networks to handle greater numbers of connected devices combined with network slicing to isolate and protect designated applications.

Today, 10% of data is processed at the edge, an amount IBM expects to grow to 75% by 2025. Specifically, edge computing enables:

  • Better data control and lower costs by minimizing data transport to central hubs, which also reduces vulnerabilities
  • Faster insights and actions by tapping into more sources of data and processing that data there, at the edge
  • Continuous operations by enabling systems that run autonomously, reduce disruption, and lower costs, because data can be processed on the spot by the devices themselves, where decisions can be made (see the sketch below)
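
To make the data-transport point concrete, here is a minimal sketch, assuming a hypothetical temperature feed and threshold, of an edge device reducing a window of raw readings to one compact summary so that only the summary travels to the central hub while local action happens on the spot:

```python
import json
import statistics
from typing import Iterable

ALERT_THRESHOLD = 90.0  # hypothetical threshold for acting locally


def summarize_at_edge(readings: Iterable[float], device_id: str) -> str:
    """Reduce a window of raw sensor readings to one compact summary.

    Only this summary travels to the central hub; the raw samples stay on
    the edge device, where an alert can also trigger immediate local action.
    """
    samples = list(readings)
    summary = {
        "device": device_id,
        "count": len(samples),
        "mean": round(statistics.mean(samples), 2),
        "max": max(samples),
        "alert": max(samples) > ALERT_THRESHOLD,
    }
    return json.dumps(summary)


if __name__ == "__main__":
    window = [71.2, 70.8, 93.5, 72.0]  # hypothetical temperature samples
    print(summarize_at_edge(window, "sensor-42"))  # one small message instead of many
```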

In short: the growing number of increasingly capable devices, faster 5G processing, and mounting market pressure are driving edge computing beyond what the initial IoT proponents, who didn't have 5G yet, envisioned. They also weren't in a position to imagine the growth in the processing capabilities of edge devices in just the past year or two.

But that is starting to happen now, according to IDC: By 2023, half of the newly deployed on-premises infrastructure will be in critical edge locations rather than corporate datacenters, up from less than 10% today.

Also unimagined was the emergence of the hybrid multicloud, which IBM has only recently started to tout. The convergence of 5G, edge computing, and hybrid multicloud, according to the company, is redefining how businesses operate. As more businesses embrace 5G and edge, modernizing networks to take advantage of the edge opportunity is only now becoming feasible.

And all of this could play very well with the new z machines, the z15 T02 and LinuxONE III LT2. These appear to be sufficiently capable to handle the scale of business edge strategies and hybrid cloud requirements for now. Or step up to the enterprise-class z15 if you need more horsepower.

By moving to a hybrid multicloud model, telcos can process data at both the core and the network edge across multiple clouds, perform cognitive operations, and make it easier to introduce and manage differentiated digital services. As 5G matures, it will become the network technology that underpins the delivery of these services.

For enterprises, adopting a hybrid multicloud model that extends from corporate data centers (or public and private clouds) to the edge is critical to unlocking new connected experiences. By extending cloud computing to the edge, enterprises can perform AI/analytics faster, run enterprise apps at the edge to reduce the impact of intermittent connectivity, and minimize data transport to central hubs for cost efficiency.

In short, it's time to start thinking about making edge part of your computing strategy.

DancingDinosaur is Alan Radding, a veteran information technology analyst, writer, and ghost-writer. Follow DancingDinosaur on Twitter, @mainframeblog, and see more of his work at http://technologywriter.com/ 

IBM Puts Blockchain on the z System for a Disruptive Edge

April 22, 2016

Get ready for Blockchain to alter your z-based transaction environment. Blockchain brings a new class of distributed ledger applications. Bitcoin, the first Blockchain system to grab mainstream data center attention, is rudimentary compared to what the Linux Foundation's open Hyperledger Project will deliver.


As reported in CIO Magazine, Blockchain enables a distributed ledger technology with the ability to settle transactions automatically via computers in seconds or minutes. This is a faster, potentially more secure settlement process than is used today among financial institutions, where clearing houses and other third-party intermediaries validate accounts and identities over a few days. Financial services, as well as other industries, are exploring blockchain for conducting transactions as diverse as trading stock, buying diamonds, and streaming music.
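
For readers new to the idea, here is a toy sketch in plain Python (not Hyperledger code, and with made-up transactions) of the hash-linking that makes a shared ledger tamper-evident, which is what lets parties settle directly without a clearing house in the middle:

```python
import hashlib
import json
import time


def make_block(transactions: list, prev_hash: str) -> dict:
    """Create a ledger block whose hash depends on its contents and its predecessor."""
    body = {
        "timestamp": time.time(),
        "transactions": transactions,
        "prev_hash": prev_hash,
    }
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return {**body, "hash": digest}


# Chain two blocks: altering any earlier transaction would change every later hash,
# which is what makes tampering evident to every participant holding the ledger.
genesis = make_block([{"from": "alice", "to": "bob", "amount": 10}], prev_hash="0" * 64)
block_2 = make_block([{"from": "bob", "to": "carol", "amount": 4}], genesis["hash"])
print(block_2["prev_hash"] == genesis["hash"])  # True: the blocks are linked
```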

IBM, in conjunction with the Linux Foundation's Hyperledger Project, expects the creation and management of Blockchain network services to power a new class of distributed ledger applications. With Hyperledger and Blockchain, developers can create digital assets and the accompanying business logic to more securely and privately transfer assets among members of a permissioned Blockchain network running on IBM LinuxONE or Linux on z.

In addition, IBM will introduce fully integrated DevOps tools for creating, deploying, running, and monitoring Blockchain applications on the IBM Cloud and enable applications to be deployed on IBM z Systems. Furthermore, by using Watson as part of an IoT platform, IBM intends to make it possible for information from devices such as RFID-based locations, barcode-scan events, or device-recorded data to be used with IBM Blockchain apps. Clearly, IBM is looking at Blockchain for more than just electronic currency. In fact, Blockchain will enable a wide range of secure transactions between parties without the use of intermediaries, which should speed transaction flow. For starters, the company brought 44,000 lines of code to the effort as a founding member of the Linux Foundation's Hyperledger Project.

The z, with its rock-solid reputation for no-fail, extreme-volume, high-performance, and secure processing, is a natural for Blockchain applications and systems. In the process it brings the advanced cryptography, security, and reliability of the z platform. No longer content just to handle traditional backend systems-of-record processing, IBM is pushing to bring the z into new areas that leverage the strength and flexibility of today's mainframe. As IoT ramps up, expect the z to handle escalating volumes of IoT traffic, mobile traffic, and now blockchain distributed ledger traffic. Says IBM: "We intend to support clients looking to deploy this disruptive technology at scale, with performance, availability and security." That statement has z written all over it.

Further advancing the z into new areas, IBM reemphasized its advantages through built-in hardware accelerators for hashing and digital signatures, tamper-proof security cards, unlimited random keys to encode transactions, and integration to existing business data with Smart Contract APIs. IBM believes the z could take blockchain performance to new levels with the world’s fastest commercial processor, which is further optimized through the use of hundreds of internal processors. The highly scalable I/O system can handle massive amounts of transactions and the optimized network between virtual systems in a z Systems cloud can speed up blockchain peer communications.

An IBM Blockchain DevOps service will also enable blockchain applications to be deployed on the z, ensuring an additional level of security, availability and performance for handling sensitive and regulated data. Blockchain applications can access existing transactions on distributed servers and z through APIs to support new payment, settlement, supply chain, and business processes.

Use Blockchain on the z to create and manage Blockchain networks to power the emerging new classes of distributed ledger applications.  According to IBM, developers can create digital assets and the accompanying business logic to more securely and privately transfer assets among members of a permissioned Blockchain network. Using fully integrated DevOps tools for creating, deploying, running, and monitoring Blockchain applications on IBM Cloud, data centers can enable applications to be deployed on the z. Through the Watson IoT Platform, IBM will make it possible for information from devices such as RFID-based locations, barcode scans, or device-recorded data to be used with IBM Blockchain.

However, Blockchain remains nascent technology. Although the main use cases already are being developed and deployed, many more ideas for blockchain systems and applications are only just being articulated. Nobody, not even the Linux Foundation, knows what ultimately will shake out. Blockchain enables developers to easily build secure distributed ledgers that can be used to exchange almost anything of value quickly and securely. Now is the time for data center managers at z shops to think about what they might want to do with such extremely secure transactions on their z.

DancingDinosaur is Alan Radding, a veteran information technology analyst and writer. Please follow DancingDinosaur on Twitter, @mainframeblog. See more of his IT writing at technologywriter.com and here.

 

Play the Cloud-Mobile App Dev Game with z/OS Client Web Enablement

April 15, 2016

Is your z team feeling a little nervous that it is missing an important new game? Are business managers bugging you about running slick cloud and mobile applications through the z? Worse, are they turning to third-party contractors to build apps that will try to connect your z to the cloud and mobile world? If so, it is time to take a close look at IBM's z/OS Client Web Enablement Toolkit.


Accessing backend systems through a mobile device

If you're a z shop running Linux on z or a LinuxONE shop, you don't need z/OS Web Enablement. The issue only comes up when you need to connect z/OS applications to cloud, web, and mobile apps. IBM has been talking up the z/OS Client Web Enablement Toolkit since early this year. Prior to the availability of the toolkit, native z/OS applications had little or no easy options available to participate as a web services client.

You undoubtedly know the z in its role as a no-fail transaction workhorse. You have since watched as it learned new tricks, like managing big data and big data analytics through IBM's own tools and, more recently, Spark. The z absorbed the services wave with SOA and turned CICS into a handler for Web transactions. With Linux it learned an entirely new way to relate to the broader distributed world. The z has rolled with all the changes and generally come out ahead.

Now the next change for z data centers has arrived. This is the cloud/web-mobile-analytics execution environment that seemingly is taking over the known world. It almost seems like nobody wants a straight DB2 CICS transaction without a slew of other devices getting involved, usually as clients. Now everything is HTTP REST to handle x86 clients and JSON, along with a raft of even newer scripting languages. Heard about Python and Ruby? And they aren't even the latest. The problem: no easy way to perform HTTP REST calls or handle JSON parsing on z/OS. This results from the utter lack of native JSON services built into z/OS, according to Steve Warren, IBM's z/OS Client Web Enablement guru.

Starting with z/OS V2.2, however, and now available in z/OS V2.1 via a couple of service updates, Warren reports, the new z/OS Client Web Enablement Toolkit changes the way a z/OS-based data center can think about z/OS applications communicating with another web server. As he explains it, the toolkit provides an easy-to-use, lightweight solution for applications looking to easily participate as a client in a client/server web application. Isn't that what all the kids are doing with Bluemix? So why not with the z and z/OS?

Specifically, the z/OS toolkit provides a built-in protocol enabler using interfaces similar in nature to other industry-standard APIs, along with a z/OS JSON parser to parse JSON text coming from any source and the ability to build new or add to existing JSON text, according to Warren. Suddenly, it puts z/OS shops smack in the middle of this hot new game.
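
To picture the client pattern the toolkit brings to z/OS, here is a minimal sketch in Python against a hypothetical endpoint; the toolkit itself provides the equivalent natively on z/OS through its own callable services, so nothing below is the toolkit's actual API:

```python
import json
import urllib.request

# Hypothetical REST endpoint used purely for illustration
URL = "https://api.example.com/accounts/1234"


def fetch_account(url: str) -> dict:
    """Issue an HTTP GET and parse the JSON reply: the client pattern the toolkit enables."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.loads(resp.read().decode("utf-8"))


if __name__ == "__main__":
    account = fetch_account(URL)
    print(account.get("balance"))  # work with a field from the parsed JSON document
```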

While almost all environments on z/OS can take advantage of these new services, Warren adds, traditional z/OS programs running in a native environment (apart from a z/OS UNIX or JVM environment) stand to benefit the most. Before the toolkit, native z/OS applications, as noted above, had little or no easy options available to them to participate as a web services client. Now they do.

Programs running as a batch job, a started procedure, or in almost any address space on a z/OS system have APIs they can utilize in a manner similar to any standard z/OS APIs provided by the OS. Programs invoke these APIs in the programming language of their choice. Among z languages, C/C++, COBOL, PL/I, and Assembler are fully supported, and the toolkit initially provides samples for C/C++, COBOL, and PL/I. Linux on z and LinuxONE shops already can do this.

Businesses with z data centers are being forced by the market to adopt Web applications utilizing published Web APIs that can be used by something as small as the watch you wear, noted Warren. As a result, the proliferation of Web services applications in recent years has been staggering, and it's not by coincidence. Representational state transfer (REST) applications are simple, use the ubiquitous HTTP protocol—which helps them to be platform-independent—and are easy to organize. That's what the young developers—the millennials—have been doing with Bluemix and other cloud-based development environments for their cloud, mobile, and web-based applications. With the z/OS web enablement toolkit, any z/OS shop now can do the same. As IoT ramps up, expect more demand for these kinds of applications across a variety of new devices and APIs.

DancingDinosaur is Alan Radding, a veteran information technology analyst and writer. Please follow DancingDinosaur on Twitter, @mainframeblog. See more of his IT writing at technologywriter.com and here.

IBM Makes a Big Play for the API Economy with StrongLoop

September 25, 2015

APIs have become essential in connecting systems of engagement with the systems of record typically found on the IBM z System. That's one reason why IBM earlier this month acquired StrongLoop, Inc., a software provider that helps developers connect enterprise applications to mobile, Internet of Things (IoT), and web applications in the cloud, mainly through rapidly proliferating and changing APIs. Take this as a key signal that IBM intends to be a force in the emerging API economy. Its goal is to connect existing enterprise apps, data, and SOA services to new channels via APIs.


Courtesy: developer.IBM.com

Key to the acquisition is StrongLoop's position as a leading provider of Node.js, a server-side JavaScript runtime that has become a favorite among developers needing to build applications using APIs. IBM says it intends to integrate Node.js capabilities from StrongLoop with its own software portfolio, which already includes MobileFirst and WebSphere, to help organizations better use enterprise data and conduct transactions whether in the cloud or on premises.

These new capabilities, IBM continues, will enable organizations and developers to build scalable APIs and more easily connect existing back-end enterprise processes with front-end mobile, IoT, and web apps in an open hybrid cloud. Node.js is one of the fastest growing development frameworks for creating and delivering APIs, in part because it is built on JavaScript, which shortens the learning curve.

Although Node.js is emerging as the standard for APIs and micro-services, APIs still present challenges. These include the lack of an architected approach, limited scalability, multiple languages and point products, limited data connectors, and large, fragile monolithic applications.

Mainframe data centers, in particular, are sitting on proven software assets that beg to be broken out as micro-services to be combined and recombined to create new apps for use in mobile and Web contexts. As IoT ramps up, the demand for these APIs will skyrocket. And the mainframe data center will sit at the center of all this, possibly even becoming a revenue generator.
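
The micro-service pattern itself is easy to sketch. The example below is only an illustration in Python with Flask, not the Node.js/LoopBack tooling this post describes, and the route and data are hypothetical: a proven backend lookup exposed as a small REST/JSON service that mobile, IoT, and web clients can call.

```python
from flask import Flask, jsonify

app = Flask(__name__)

# Hypothetical stand-in for a proven backend lookup (think of a DB2/CICS record fetch)
CUSTOMERS = {1001: {"id": 1001, "name": "Acme Corp", "balance": 2500.00}}


@app.route("/api/customers/<int:cust_id>")
def get_customer(cust_id: int):
    """Expose an existing backend asset as a small REST/JSON micro-service."""
    record = CUSTOMERS.get(cust_id)
    if record is None:
        return jsonify({"error": "not found"}), 404
    return jsonify(record)


if __name__ == "__main__":
    app.run(port=5000)  # mobile, IoT, and web clients all consume the same API
```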

In response, StrongLoop brings API creation and lifecycle support along with back-end data connectors. It also will integrate with IBM's API management, creating an API platform that can enable polyglot run-times, integration, and API performance monitoring. It also will integrate with IBM's MobileFirst Platform, WebSphere, and other products, such as Bluemix, to enable Node across the product portfolio. StrongLoop also brings Arc and its LoopBack framework, which handle everything from visual API modeling to a process manager for scaling APIs, plus a security gateway. Together, StrongLoop Arc and IBM's API Management can deliver the full API lifecycle. IBM also will incorporate select capabilities from StrongLoop into its IoT Foundation, a topic DancingDinosaur expects to take up in the future.

At the initial StrongLoop acquisition announcement Marie Wieck, general manager, Middleware, IBM Systems, alluded to the data center possibilities, as noted above: “Enterprises are focused on digital transformation to reach new channels, tap new business models, and personalize their engagement with clients. APIs are a critical ingredient.” The fast adoption of Node.js for rapidly creating APIs combined with IBM’s strength in Java and API management on the IBM cloud platform promises a winning strategy.

To make this even more accessible, IBM is adding Node.js to Bluemix, following a summer of enhancements to Bluemix covered here by DancingDinosaur just a few weeks ago. Java remains the leading language for web applications and transaction systems. Combining StrongLoop's Node.js tools and services with IBM's WebSphere and Java capabilities will help organizations bridge Java and Node.js development platforms, enabling enterprises to extract greater value from their application investments. Throw in integration on IBM Bluemix, and the Java and Node.js communities will gain access to many other IBM and third-party services, including mobile services, data analytics, and Watson, IBM's crown cognitive computing jewel.

DancingDinosaur is Alan Radding, a veteran IT analyst and writer. Please follow DancingDinosaur on Twitter, @mainframeblog. See more of his IT writing at technologywriter.com and here.

IBM Simplifies Internet of Things with developerWorks Recipes

August 6, 2015

IBM has a penchant for working through communities, going back as far as Eclipse and probably before. Last week DancingDinosaur looked at the developerWorks Open community. Now let's look at IBM's developerWorks Recipes community, intended to address the Internet of Things (IoT).


TI SensorTag

The Recipes community will try to help developers – from novice to experienced – quickly and easily learn how to connect IoT devices to the cloud and how to use the data coming from those connected devices. For example, one recipe walks you through connecting the TI SimpleLink SensorTag (pictured above) to the IBM IoT Foundation service in a few simple steps. By following these steps a developer, according to IBM, should be able to connect the SensorTag to the IBM quickstart cloud service in less than 3 minutes. Think of recipes as simplified development patterns—so simple that almost anyone could follow them. (I wanted to try it myself but didn't have a tag. Still, it looked straightforward enough.)
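
As a rough idea of what such a recipe boils down to, here is a hedged sketch using the Python paho-mqtt client to publish simulated sensor readings; the broker hostname and topic are hypothetical stand-ins, since the actual recipe uses IBM's quickstart service with its own hostname, client ID, and topic conventions:

```python
import json
import time

import paho.mqtt.client as mqtt  # pip install paho-mqtt (1.x-style constructor below)

BROKER = "broker.example.com"    # hypothetical broker, not IBM's quickstart hostname
TOPIC = "sensors/demo/events"    # hypothetical topic layout

client = mqtt.Client(client_id="demo-sensortag")
client.connect(BROKER, 1883)
client.loop_start()

for _ in range(3):
    reading = {"d": {"temperature": 22.5, "humidity": 41}}  # simulated SensorTag data
    client.publish(TOPIC, json.dumps(reading))
    time.sleep(2)

client.loop_stop()
client.disconnect()
```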

IoT is growing fast. Gartner forecasts 4.9 billion connected things in use in 2015, up 30% from 2014, rising to 25 billion by 2020. In terms of revenue, this is huge. IDC predicts the worldwide IoT market will grow from $655.8 billion in 2014 to $1.7 trillion in 2020, a compound annual growth rate (CAGR) of 16.9%. For IT people who figure out how to do this, the opportunity will be boundless. Every organization will want to connect its devices to other devices via IoT. The developerWorks Recipes community seems like a perfect way to get started.

IoT isn't exactly new. Manufacturers have cobbled together machine-to-machine (M2M) networks; banks and retailers have assembled networks of ATMs and POS terminals. DancingDinosaur has been writing about IoT for mainframe shops for several years. Now developerWorks Recipes promises a way for just about anyone to set up their own IoT easily and quickly while leveraging the cloud in the process. There is only a handful of recipes now, but the community provides a mechanism to add more, so expect the catalog to steadily increase. And developers are certain to take existing recipes and improvise on them.

IBM has been trying to simplify development for cloud, mobile, and IoT since the launch of Bluemix last year. By connecting their IoT devices to IBM Bluemix, which today boasts more than 100 open-source tools and services, users can run advanced analytics, utilize machine learning, and tap into additional Bluemix services to accelerate the adoption of IoT and more.

As easy as IBM makes IoT development sound, this is a nascent effort industry-wide. There is a crying need for standards at every level to facilitate interoperability and data exchange among the many and disparate devices, networks, and applications that will make up IoT. Multiple organizations have initiated standards efforts, but it will take some time to sort it all out.

And then there is the question of security. In a widely reported experiment by Wired magazine, hackers were able to gain control of a popular smart vehicle. Given that cars are expected to be a major medium for IoT and every manufacturer is rushing to jam as much smart componentry into its vehicles as it can, you can only hope every automaker is scrambling for security solutions.

Home appliances represent another fat, lucrative market target for manufacturers that want to embed intelligent devices and IoT into all their products. What if hackers access your automatic garage door opener? Or worse yet, what if they turn off your coffee maker and water heater? Could you start the day without a hot shower and a cup of freshly brewed coffee and still function?

Running IoT through secure clouds like the IBM Cloud is part of the solution. And industry-specific clouds intended for IoT already are being announced, much like the Internet exchanges of a decade or two ago. Still, more work needs to be done on security and interoperability standards if IoT is to work seamlessly and broadly to achieve the trillions of dollars of economic value projected for it.

DancingDinosaur is Alan Radding, a veteran IT analyst and writer. Please follow DancingDinosaur on Twitter, @mainframeblog. See more of his IT writing at technologywriter.com and here.

IBM Continues Open Source Commitment with Apache Spark

June 18, 2015

If anyone believes IBM's commitment to open source is a passing fad, forget it. IBM has invested billions in Linux, open Power through the OpenPOWER Foundation, and more. Its latest is the announcement of a major commitment to Apache Spark, a fast, general-purpose open source cluster computing system for big data.


Courtesy of IBM: developers work with Spark at Galvanize Hackathon

As IBM sees it, Spark brings essential advances to large-scale data processing. Specifically, it dramatically improves the performance of data-dependent apps and is expected to play a big role in the Internet of Things (IoT). In addition, it radically simplifies the process of developing intelligent apps, which are fueled by data. It does so by providing high-level APIs in Scala, Java, and Python, and an optimized engine that supports general computation graphs for data analysis. It also supports a rich set of higher-level tools including Spark SQL for SQL and DataFrames, MLlib for machine learning, GraphX for graph processing, and Spark Streaming for stream processing.
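
For a sense of how approachable those APIs are, here is a minimal PySpark sketch with hypothetical readings, using the Spark 1.x-era SQLContext DataFrame API that was current when this was written:

```python
from pyspark import SparkContext
from pyspark.sql import SQLContext

sc = SparkContext(appName="iot-readings-demo")
sqlContext = SQLContext(sc)

# Hypothetical device readings; in practice these would come from files, streams, or databases
readings = [("sensor-1", 71.2), ("sensor-1", 73.9), ("sensor-2", 64.5)]
df = sqlContext.createDataFrame(readings, ["device", "temperature"])

# Average temperature per device: the kind of data-dependent analysis Spark accelerates
df.groupBy("device").avg("temperature").show()

sc.stop()
```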

IBM is contributing its breakthrough IBM SystemML machine learning technology to the Spark open source ecosystem. But maybe Spark's biggest advantage is that it can handle data coming from multiple, disparate sources.

What IBM likes in Spark is that it's agile, fast, and easy to use. It also likes that Spark is open source, which ensures it is improved continuously by a worldwide community. Those are also some of the main reasons mainframe and Power Systems data centers should pay attention to Spark. Spark will make it easier to connect applications to data residing in your data center. If you haven't yet noticed an uptick in mobile transactions coming into your data center, they will be coming, and they will benefit from Spark. And if you look out just a year or two, expect to see IoT applications adding to and needing to combine all sorts of data, much of it ending up on the mainframe or Power Systems in one form or another. So make sure Spark is on your radar screen.

Over the course of the next few months, IBM scientists and engineers will work with the Apache Spark open community to accelerate access to advanced machine learning capabilities and help drive speed-to-innovation in the development of smart business apps. By contributing SystemML, IBM hopes to help data scientists iterate faster to address the changing needs of business and to enable a growing ecosystem of app developers who will apply deep intelligence to everything.

To ensure that happens, IBM will commit more than 3,500 researchers and developers to work on Spark-related projects at more than a dozen labs worldwide, and open a Spark Technology Center in San Francisco for the Data Science and Developer community to foster design-led innovation in intelligent applications. IBM also aims to educate more than 1 million data scientists and data engineers on Spark through extensive partnerships with AMPLab, DataCamp, MetiStream, Galvanize, and Big Data University MOOC (Massive Open Online Course).

Of course, Spark isn't going to be the end of tools to expedite the latest app dev. With IoT just beginning to gain widespread interest, expect a flood of tools to expedite developing data-intensive IoT applications and more tools to facilitate connecting all these coming connected devices, estimated to number in the tens of billions within a few years.

DancingDinosaur applauds IBM’s decade-plus commitment to open source and its willingness to put real money and real code behind it. That means the IBM z System mainframe, the POWER platform, Linux, and the rest will be around for some time. That’s good; DancingDinosaur is not quite ready to retire.

DancingDinosaur is Alan Radding, a veteran IT analyst and writer. Please follow DancingDinosaur on Twitter, @mainframeblog. See more of his IT writing on Technologywriter.com and here.

New IBM Initiatives Speed System z to Hybrid Cloud and IoT

November 20, 2014

Cloud computing, especially hybrid cloud computing, is going mainstream. The same is happening with the Internet of Things (IoT). For mainframe shops unsure of how to get there, IBM promises to speed the journey with two recent initiatives.

Let's start with hybrid clouds and the z. As IBM describes it, enterprises will continue to derive value from their existing investments in IT infrastructure while looking to the cloud to bolster business agility. The upshot: organizations increasingly are turning to hybrid clouds to obtain the best of both worlds by linking on-premises IT infrastructure to the public cloud.

To that end, IBM has designed and tested various use cases around enterprise hybrid architecture involving System z and SoftLayer. These use cases focus on the relevant issues of security, application performance, and potential business cost.

One scenario introduces the cloud as an opportunity to enrich enterprise business services running on the z with external functionality delivered from the cloud.

hybrid use case

Here a retail payment system is enriched with global functionality from a loyalty program that allows the consumer to accumulate points. It involves the z and its payment system, a cloud-based loyalty program, and the consumer using a mobile phone.

The hybrid cloud allows the z data center to maintain control of key applications and data in order to meet critical business service level agreements and compliance requirements while tapping the public cloud for new capabilities, business agility, or rapid innovation and shifting expenditure from CAPEX to OPEX.

Since the z serves as the data backbone for many critical applications, it makes sense to connect on-premises System z infrastructure with an off-premises cloud environment. In its paper IBM suggests the hybrid architecture should be designed in a way that gives businesses the flexibility to put their workloads and data where they make the most sense, mixing the right blend of public and private cloud services. And, of course, it also must ensure data security and performance. That's why you want the z there.

To get started, check out the use cases IBM provides, like the one above. Already a number of organizations are trying the IBM hybrid cloud: Macy's, Whirlpool, Daimler, and Sicoss Group. Overall, nearly half of IBM's top 100 strategic outsourcing clients are already implementing cloud solutions with IBM as they transition to a hybrid cloud model.

And if hybrid cloud isn't enough to keep you busy, it also is time to start thinking about the IoT. To make it easier, last month the company announced the IBM Internet of Things Foundation, an extension of Bluemix. Like Bluemix, this is a cloud service that, as IBM describes it, makes it possible for a developer to quickly extend an Internet-connected device such as a sensor or controller into the cloud, build an application alongside the device to collect the data, and send real-time insights back to the developer's business. That data can be analyzed on the z too, using Hadoop on zLinux, which you read about here a few weeks ago.
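
To make that loop concrete, here is a hedged sketch of the application side: subscribe to device events, analyze each one, and publish a command back. The broker, topics, and payloads are hypothetical stand-ins rather than the IoT Foundation's own conventions:

```python
import json

import paho.mqtt.client as mqtt  # pip install paho-mqtt (1.x-style constructor below)

BROKER = "broker.example.com"               # hypothetical broker standing in for the service
EVENT_TOPIC = "devices/+/events"            # hypothetical topic layout: devices/<id>/events
COMMAND_TOPIC = "devices/{device}/commands"


def on_message(client, userdata, msg):
    """Analyze each incoming device event and send a real-time insight (command) back."""
    event = json.loads(msg.payload)
    device = msg.topic.split("/")[1]
    if event.get("temperature", 0) > 90:
        client.publish(COMMAND_TOPIC.format(device=device), json.dumps({"action": "throttle"}))


client = mqtt.Client(client_id="insight-app")
client.on_message = on_message
client.connect(BROKER, 1883)
client.subscribe(EVENT_TOPIC)
client.loop_forever()  # process events as they arrive
```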

IoT should be nothing new to System z shops. DancingDinosaur discussed it this past summer here. Basically it's the POS or ATM network on steroids, with orders of magnitude more complexity. IDC estimates that by 2020 there will be as many as 28 billion autonomous IoT devices installed. Today it estimates there are nine billion.

Between the cloud, hybrid clouds, and IoT, z data centers will have a lot to keep them busy. But with IBM’s new initiatives in both areas you can get simple, highly secure and powerful application access to the cloud, IoT devices, and data. With the IoT Foundation you can rapidly compose applications, visualization dashboards and mobile apps that can generate valuable insights when linked with back office enterprise applications like those on the z.

DancingDinosaur is Alan Radding, a veteran IT writer/analyst. You can follow DancingDinosaur on Twitter, @mainframeblog. Also check out my other IT writing at Technologywriter.com and here.

Mobile Steps Up at IBM Enterprise2014

September 23, 2014

According to eMarketer, by 2017 mobile phone penetration will rise to 69.4% of the global population. The global smartphone audience, eMarketer reports, surpassed 1 billion in 2012, will total 1.75 billion in 2014, and will continue a fast-paced trajectory through 2017.

OK, mobile and smartphones are hot, driving everything from the Internet of Things (IoT) to shifts in mainframe peak volume trends. Between IoT and mobile you very well could be looking at the future of the mainframe. IBM's latest mainframe win, announced Sept. 11, identified the government of Croatia adopting the IBM zEC12 703 as the foundation for a new mobile government solution that enables citizens to choose to receive myriad messages and conduct official business on their mobile and smart devices while getting real-time alerts.

Mobile transaction volumes already are starting to skew z/OS software usage and trigger new pricing programs. If you haven’t pinned down your mobile mainframe and Power Systems strategy, plan to get over to IBM Enterprise2014, Oct. 6-10 at the Venetian in Las Vegas. There you will find a wide range of mobile-related sessions for the System z, Power, and System i platforms.

You could start with Planning Your Mobile Enterprise Strategy: Future Directions in Enterprise Mobile Application Development by Ian Robinson. Here Robinson looks out over the next 12-18 months at a new generation of mobile devices—including smartphones, tablets, wearables, and other technologies comprising the IoT—being adopted in large numbers by consumers and employees while introducing considerable challenges and opportunities for enterprise IT managers and application developers. This session reviews the latest mobile trends and highlights where IBM MobileFirst software and services can help enterprise IT strategists prepare their organizations as mobile enterprises. BTW, last week IBM MobileFirst was highly endorsed by both Gartner and IDC.

You also will want to check out Robinson’s session on The MobileFirst Portfolio: IBM’s End-to-end Solution for the Enterprise Mobile App Development Lifecycle. Here he notes that the emerging era of enterprise mobile apps is radically different from traditional software delivery, leading many CIOs and IT managers to completely redefine their enterprise application strategies due to the rapid growth of smartphones and tablet devices among end users. In this session he describes IBM’s MobileFirst portfolio, an industry-leading set of products and capabilities designed to support the entire mobile app lifecycle, from design and development through to testing, integration, optimization, and deployment. Using real MobileFirst client examples, this session also highlights where IBM System z, Power Systems and PureSystems can play an essential role in supporting an enterprise mobile strategy.

A different take on mobile is IBM Electronic Support Engagement—Mobile Service Request, Support Portal, Twitter, Blogs and Wikis by Julie Craft. Including a demo, she details all the ways you can use IBM electronic support tools from a mobile and social perspective. She presents new mobile apps as well as enhancements to service request that make working with IBM Support easier and save time. In this session you are invited to voice your views on ways IBM can improve its interfaces to enhance your experience. She reports you'll even have an opportunity to follow along on your mobile devices.

Finally, here’s a look at a hardware platform-specific mobile session: Mobile to Go, Overview of Mobile Technologies on IBM i by Tim Rowe, Alison Buterill. Android, Blackberry, iPhone, iPad, tablet, and on and on. So many mobile devices, so many applications. Employees want to work 24X7. They want access to email, to development, data, and the system. And they want to use their own interface from wherever they happen to be. How can you deliver the right interface to the right person at the right time? What is available to make the job easier? This session explores the various IBM i solutions that can help you deliver on the request to “Make Mine to Go”. A related session, Test Drive IBM i Mobile Access, provides a preview of the IBM i Mobile Access Solution in the form of a lab that offers a guided, self-paced, interaction with the solution. You can explore the 5250 interface, run SQL Queries, interact with Printed Output and the IFS and much more. Maybe, they suggest, you can take the lab from your own mobile device. Other sessions address mobile on the Power platform.

Without changing the mainframe's basic role, mobile is poised to dramatically alter mainframe computing. The z will continue as the always-available, highly secure, and scalable backend resource that delivers information on request and handles volumes of transactions.

Finally, don't miss three evenings of live performances: two country rock groups, Delta Rae and The Wild Feathers, and then Rock of Ages. Check out all three and more here.

Alan Radding is DancingDinosaur. Look for me at Enterprise2014. You can follow this blog and more on Twitter, @mainframeblog. Also, find me on Technologywriter.com.

