Beware the Ides of March!
"I predict that the last mainframe will be unplugged on March 15,1996"Stewart Alsop, Former InfoWorld Editor-in-Chief (1991)
"I predict that the last mainframe will be unplugged on March 15,1996"
Stewart Alsop, Former InfoWorld Editor-in-Chief (1991)
It seems his crystal ball may have been defective! He is not alone: for decades, people have been predicting the death of the mainframe, yet today large enterprises still run the bulk of their business on this platform using the same decades-old development and operations practices. Why?
In a word: risk. The mainframe has proven to be an incredibly stable and reliable platform for decades. In that time, over 200 billion lines of COBOL code alone have been written to run on it. That code represents the core business rules of the enterprise as well as billions of dollars of investment. Any attempt to refactor or re-platform that code and/or data must be approached with great caution, because getting it wrong, even a little wrong, could prove very costly to the business.
I thought it apropos to address this topic on the 60th birthday of COBOL (or 61st depending on how you look at it). Either way it has fared just a bit better than Julius Caesar!
If it is so risky, then why do this in the first place? There are three primary concerns driving this initiative.
The underlying reason for any initiative of this size typically revolves around cost: either reducing it or at least preventing it from increasing. This is quite relevant, as running a mainframe shop is certainly not cheap; large shops can spend tens, if not hundreds, of millions of dollars a year keeping their mainframe environments up and running. There are, however, other factors driving this need.
Human resources are another piece of the puzzle. Most shops today require a wide variety of skill sets just to maintain the mainframe status quo: skilled operators, systems programmers (MVS, CICS, IMS, etc.), DB2 administrators, and COBOL programmers (not to mention Natural and PL/I, among others). These people carry deep skills in esoteric disciplines such as VSAM, JCL, and BMS. Many of them are nearing retirement age and, although training in these technologies is still available, young students are for the most part focusing on more modern languages and techniques. Shops are concerned about their ability to attract and retain staff with a high degree of skill in these areas. Case in point: New Jersey's Governor calls for emergency COBOL volunteers.
Mainframe applications today are built, managed, and deployed on closed, proprietary architectures. User-facing systems lack the look, feel, and features of modern systems. Simply converting from older to more modern languages, although a good first step, may not get us any closer to a true modern DevOps model.
I don’t have a crystal ball handy, unfortunately, but I still feel safe saying that the question is no longer ‘Is it too risky to move forward?’ but ‘Is it too risky to stand still?’ I think we all know the answer.
Although there are several ways of approaching mainframe modernization, as well as a variety of third-party tooling available, most approaches will look like one of the following.
Mainframe to Cloud in Minutes: Learn how you can use APIs to connect your mainframe data and applications to the cloud. Watch the webinar below.
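To make the idea concrete, here is a minimal sketch of the kind of translation an API layer performs when exposing mainframe data to cloud applications: reading a fixed-width record laid out per a COBOL copybook and returning it as JSON. The copybook, field names, and record are hypothetical examples, not from any particular product.

```python
import json

# Hypothetical layout based on a simple COBOL copybook:
#   01 CUSTOMER-REC.
#      05 CUST-ID      PIC 9(6).
#      05 CUST-NAME    PIC X(20).
#      05 CUST-BALANCE PIC 9(7)V99.
FIELDS = [("cust_id", 0, 6), ("cust_name", 6, 26), ("cust_balance", 26, 35)]

def record_to_json(record: str) -> str:
    """Translate one fixed-width mainframe record into a JSON document."""
    doc = {}
    for name, start, end in FIELDS:
        raw = record[start:end].strip()
        if name == "cust_id":
            doc[name] = int(raw)
        elif name == "cust_balance":
            # PIC 9(7)V99 has an implied decimal point before the last two digits
            doc[name] = int(raw) / 100.0
        else:
            doc[name] = raw
    return json.dumps(doc)

record = "000042JANE DOE            000123456"
print(record_to_json(record))
```

Real integration layers also handle EBCDIC-to-ASCII conversion, packed-decimal (COMP-3) fields, and transaction invocation, but the core job is the same: mapping rigid record layouts into formats modern tooling understands.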
For some, mainframe modernization means mainframe elimination. In some cases this strategy may make sense, but in others staying on the mainframe could be the wiser decision. With its acquisition of Red Hat (OpenShift), IBM is embracing open hybrid cloud, open source, and modern DevOps. In addition, most enterprises’ critical data resides on the mainframe, and centralized, colocated processing and data have many advantages. IBM has a well-earned reputation for running and managing large operations in a reliable and scalable fashion.
The good news is that, as the mainframe continues to evolve to more modern and open architectures, there will be a lot more latitude for making ‘best fit’ decisions when it comes to platform provisioning.
In subsequent blogs I’ll consider some of the strategies currently being employed for tackling this complex and delicate initiative. This will include discussing what might be some of the pitfalls, things to look for, and things to avoid when considering an approach and/or selecting from a growing set of available tooling.
See for yourself how Sola supports mainframe modernization.
▶️ WATCH THE DEMO
Principal Consultant, Akana by Perforce
Mike has over 30 years’ experience in software development and mainframe data communications. He was one of the original architects and developers of the Sola product, has worked in a wide variety of programming languages, and has served as a CICS systems programmer and DB2 administrator.