How Is Your Mainframe Modernization Effort Going?
Beware the Ides of March!
"I predict that the last mainframe will be unplugged on March 15, 1996"
Stewart Alsop, Former InfoWorld Editor-in-Chief (1991)
Seems his crystal ball may have been defective! He is not alone. For decades people have been predicting the death of the mainframe, but today large enterprises still run the bulk of their business on this platform using the same decades-old development and operations practices. Why?
In a word, risk. The mainframe has proven to be an incredibly stable and reliable tool for decades. In that time, over 200 billion lines of COBOL code alone have been written to run on this platform. That code represents the core business rules of the enterprise as well as billions of dollars in investment. Any attempt to refactor or re-platform that code and/or data must be approached with great caution, as getting it wrong, even a little wrong, could prove very costly to the business.
I thought it apropos to address this topic on the 60th birthday of COBOL (or 61st depending on how you look at it). Either way it has fared just a bit better than Julius Caesar!
The Modernization Trifecta
If it is so risky, then why do it in the first place? There are three primary concerns driving this initiative.
The underlying reason for any initiative of this size typically revolves around cost: either reducing it or at least preventing it from increasing. This is quite relevant, as running a mainframe shop is certainly not cheap. Large shops can spend tens, if not hundreds, of millions of dollars a year keeping their mainframe environments up and running. There are, however, other factors driving this need.
Human resources are another piece of the puzzle. Most shops today require a wide variety of skill sets just to maintain the mainframe status quo: skilled operators, systems programmers (MVS, CICS, IMS, etc.), DB2 administrators, and COBOL programmers (not to mention Natural and PL/I, among others). These resources have deep skills in esoteric disciplines such as VSAM, JCL, and BMS. Many of them are nearing retirement age and, although training in these technologies is still available, young students are for the most part focusing on more modern languages and techniques. Shops are concerned about their ability to obtain and retain resources possessing a high degree of skill in these areas. Case in point: New Jersey Governor calls for emergency COBOL volunteers
Mainframe applications today are built, managed, and deployed on closed, proprietary architectures. User-facing systems lack the look, feel, and features of modern systems. Simply converting from older to more modern languages, although a good first step, may not get us any closer to a true modern DevOps model.
I don’t have a crystal ball handy, unfortunately, but I still feel safe saying that the question is no longer “Is it too risky to move forward?” but “Is it too risky to stand still?” I think we all know the answer.
Common Approaches to Mainframe Modernization
Although there are several ways of approaching mainframe modernization, as well as a variety of third-party tooling available, most approaches will look like one of the following.
- Leave the data and applications where they are, but start converting programs to more modern languages such as Java. Converting non-agnostic data sources, such as VSAM, to more platform-neutral relational data stores is also an important consideration. This will help mitigate some of the risks regarding human resources while possibly making business code eligible to run on less costly specialty processors (zIIPs). I would regard this approach as more of a cost-reduction play than a modernization strategy, since it does not address the fundamental way in which applications are built today, although it is a big step in that direction.
- Complete rewrite using more modern languages, techniques, and platforms. This is probably the direction that most mainframe shops would like to take. In doing so, they can opt to move data and process to a cloud-based architecture (either public or private), and modern DevOps tools and techniques (e.g., Docker containers, Kubernetes, OpenShift) can be used. This approach gives shops the option of staying on their mainframe platform or leaving it in part or altogether.
- A hybrid strategy that involves a combination of these approaches taken in logical steps.
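To make the language-conversion approach above a little more concrete, here is a minimal, hypothetical sketch of what one converted COBOL business rule might look like in Java. The COBOL statement, field names, and picture clause shown in the comments are illustrative assumptions, not taken from any real program; the key point is that COBOL's fixed-point decimal fields (e.g., PIC 9(5)V99) map naturally to BigDecimal, not to binary floating point, if the converted code is to preserve the original arithmetic behavior.

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

// Hypothetical conversion of a COBOL business rule such as:
//   COMPUTE WS-TOTAL ROUNDED = WS-QTY * WS-PRICE
// where WS-PRICE is PIC 9(5)V99 (two implied decimal places).
public class LineTotal {

    // PIC 9(5)V99 maps to a BigDecimal with scale 2; using double
    // here would silently change the arithmetic semantics.
    static BigDecimal lineTotal(int qty, BigDecimal unitPrice) {
        return unitPrice
                .multiply(BigDecimal.valueOf(qty))
                .setScale(2, RoundingMode.HALF_UP); // COBOL ROUNDED
    }

    public static void main(String[] args) {
        // 3 units at 19.99 -> 59.97, exact decimal arithmetic
        System.out.println(lineTotal(3, new BigDecimal("19.99")));
    }
}
```

Details like this, multiplied across millions of lines of code, are exactly why conversion tooling and careful regression testing matter so much in this approach.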
Should I Stay or Should I Go?
For some, mainframe modernization means mainframe elimination. In some cases this strategy may make sense, but in others staying on the mainframe could be the wiser decision. With its acquisition of Red Hat (OpenShift), IBM is embracing the concepts of open hybrid cloud, open source, and modern DevOps. In addition, most enterprises' critical data resides on the mainframe, and keeping process and data centralized and collocated has many advantages. IBM has a well-earned reputation for its ability to run and manage large operations in a reliable and scalable fashion.
The good news is that, as the mainframe continues to evolve to more modern and open architectures, there will be a lot more latitude for making ‘best fit’ decisions when it comes to platform provisioning.
In subsequent blogs I’ll consider some of the strategies currently being employed for tackling this complex and delicate initiative. This will include discussing what might be some of the pitfalls, things to look for, and things to avoid when considering an approach and/or selecting from a growing set of available tooling.
Learn more about Sola, the Akana mainframe API solution.