It seems like one of those natural exclusion zones: ageing mainframe computers and modern, whizzy DevOps development methodologies appear so far apart, culturally and technologically, that they would never mix – so don’t even think about it.
Yet according to Rosalind Radcliffe, a Distinguished Engineer and Chief Architect for DevOps at IBM, and David Rizzo, VP of Product Development with mainframe software tools specialist Compuware, the two go together well. Indeed, they go together well enough to open up new opportunities for both the systems and – hell’s teeth, this can’t be true – their extremely pensionable dominant applications language, COBOL.
Speaking at the recent London outing of the Dev/Ops Enterprise Summit, Radcliffe set out the basic issue in simple terms:
z/OS (IBM’s mainframe operating system) is now just another DevOps target: it just works and is as much a part of the DevOps panoply as anything else. Does this mean that we will see 30-something developers working in COBOL? The common answer is currently ‘No’, but in fact the answer is already ‘Yes’. And the way to get them doing just that is to tell them that they can’t do it. We are finding that the typical time needed is around 12 hours for them to get started. Add in the DevOps model and there really could be a new lease of life for both COBOL and the mainframe.
Tools are already available for building applications using DevOps models, and more are coming along, which is where Compuware comes in. According to Rizzo, DevOps is seen as a real opportunity for the company, which he acknowledged had been living largely off the residual markets and products that come with such a hardy species as the mainframe.
Though the machines have been written off and consigned to the dustbin more times than many DevOps developers have years to their name, they and their applications continue to exist. The issue is that no company has yet come up with a viable, reliable and secure alternative for the vital back office functions the machines still provide. And because the systems do the job so well, there is little appetite for the risk of making such changes.
So the market for the annual – and often tri-annual – updates of applications, tools and operating systems is there as a steady, never-ending line of business out into the future.
How many updates? In a single year? Gosh…
But with the arrival of DevOps, and the realisation that its practices apply equally well in the musty mainframe environment, the chance of producing application updates on a quarterly basis has suddenly become a reality. As Rizzo observed, while an update every three months may seem laughably slow in the web applications world, where updates can number in the tens and even hundreds per day, in the mainframe world quarterly updates border on indecent haste:
At first, many of our customers said they were not going to take it, were not going to go near the DevOps approach. But they soon found there are benefits from updating applications far more regularly. We are now trying to push new ideas and functions on them, and they are responding positively.
For example, one approach now being taken up by users is unit testing of COBOL application components on the mainframe, which is already common practice with Java developments. Compuware’s tools can also work directly with Jenkins, a popular continuous integration tool in the DevOps community. Here, Jenkins communicates directly with the Compuware agent running on the mainframe system. This has proved to be a far simpler task than many might presume, not least because most of Compuware’s applications and tools are written in Assembler, and Jenkins works with that language directly.
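To make that pipeline flow a little more concrete, here is a minimal sketch of the kind of build step a Jenkins job might run against a mainframe-side test agent. The agent URL, endpoint and suite name are assumptions invented for illustration, not Compuware’s actual API.

```python
# Hypothetical sketch: a Jenkins build step that asks a mainframe-side
# test agent to run a COBOL unit-test suite and fails the build if any
# test fails. Host, endpoint and suite names are illustrative only.
import json
import sys
import urllib.request

AGENT_URL = "https://mainframe-agent.example.com:8443/tests/run"  # assumed endpoint


def run_cobol_unit_tests(suite_name: str) -> dict:
    """Submit a named test suite to the agent and return its JSON result."""
    payload = json.dumps({"suite": suite_name}).encode("utf-8")
    req = urllib.request.Request(
        AGENT_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=300) as resp:
        return json.load(resp)


if __name__ == "__main__":
    result = run_cobol_unit_tests("payments-batch-suite")
    failed = result.get("failed", 0)
    print(f"{result.get('passed', 0)} passed, {failed} failed")
    # A non-zero exit code tells Jenkins to mark the build as failed.
    sys.exit(1 if failed else 0)
```

In a Jenkins pipeline, a step like this would sit alongside the usual compile and deploy stages, so COBOL components get the same automated test gate as any Java service.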
Now the company is grasping the DevOps nettle far more firmly with the launch of Topaz Enterprise Data as part of its Topaz application development and testing environment. This has been specifically designed to give mainframe DevOps teams faster and more efficient ways to securely access and exploit data from both mainframes and other systems.
This is also now being hosted on AWS, making it available to developers on a SaaS basis. Topaz, and the company’s other applications and tools, also come equipped with APIs that allow them to connect and collaborate with other applications.
The company has also introduced z/Adviser, a new collaborative service that collects and collates DevOps-related data from Compuware customers and uses a machine-learning model to feed back evidence of which development actions produce the best outcomes in code quality and speed of delivery.
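As a rough illustration of the kind of analysis such a service performs – not z/Adviser’s actual model, whose internals are not described here – the sketch below fits a simple regression relating a team’s delivery practices to its defect rate. The metric names and figures are invented for the example.

```python
# Illustrative only: a toy model relating development practices to outcomes,
# in the spirit of what a service like z/Adviser does at much larger scale.
# All metrics and numbers below are invented.
import numpy as np

# Each row: [unit-test coverage %, deployments per quarter] for one team.
practices = np.array([
    [40.0,  1], [55.0,  2], [60.0,  4],
    [72.0,  6], [80.0,  8], [88.0, 12],
], dtype=float)

# Observed defects per release for the same teams.
defects = np.array([14.0, 11.0, 9.0, 6.0, 4.0, 2.0])

# Ordinary least squares: defects ~ w0*coverage + w1*deploys + bias.
X = np.hstack([practices, np.ones((len(practices), 1))])
weights, *_ = np.linalg.lstsq(X, defects, rcond=None)

print("Fitted weights (coverage, deploy frequency, bias):", weights)
# A negative coverage weight suggests that, in this toy data, higher test
# coverage is associated with fewer defects per release - the sort of
# evidence such a service feeds back to development teams.
```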
Rizzo sees these additions opening up new applications opportunities for the venerable machines as they move into their second half-century of useful life:
We expect this to lead to the appearance of new applications for mainframe systems, especially in using new ways of connecting and exploiting the way front and back office systems can work together. In fact, we expect to be a mainframe systems partner for the next 50 years.
My take
This may prove to be a case of: if Mohammed won’t come to the mountain, then why not move the mountain to Mohammed? There have been many attempts to move beyond the old world of mainframe batch programming and bring those essential back office services into the modern world. But the reliability and all-round efficacy of mainframes have always defeated such attempts. So why not bring some of that modern world to the mainframe?
It may surprise some people that DevOps models work just as well on the venerable beasts as they do on the latest whizzy cloud services, but that appears to be the case. And once that knowledge is absorbed, it is possible to see some interesting new service developments emerging.
Image credit – Mousemat available from zazzle.com