Mainframe will be with us for a long time yet. The reasons are many and complex.
Lack of impetus
There are plenty of neobanks out there that have built scalable, secure infrastructures using modern development practices in the cloud. But despite growing rapidly, none of them has the scale to challenge the big banks. Similarly, in areas like the airline industry, the big carriers are all old, and back-end technology is rarely the deciding factor in whether one airline is more efficient than another. There simply isn't as strong an impetus to change as you'd expect.
Risk
There is very little incentive for any given executive at one of these firms to take the risk involved in staking their reputation on a big technology migration. Equally, there's nothing quite like a failed transformation project to destroy the careers of those associated with it. If you think mainframe is antediluvian, take a look at the ERP software the same companies are running: layer upon layer of legacy, with custom code built to manage myriad edge cases that nobody understands anymore. Why take the risk, when you can instead build a new system that integrates with the mainframe using (for example) a modern database that gives you a modern transactional API while micro-batching updates back to the mainframe? The incentives all point towards creating additional cruft.
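To make that integration pattern concrete, here is a minimal sketch of what such a facade might look like. Everything here is illustrative, not any particular bank's design: sqlite3 stands in for the modern transactional database, a plain list stands in for the batch channel back to the mainframe (in practice an MQ feed or file transfer), and the names are hypothetical.

```python
import sqlite3

class LedgerFacade:
    """Modern transactional front-end that micro-batches writes back to a mainframe."""

    def __init__(self, flush_every=3):
        # The "modern database": synchronous, transactional, immediately queryable.
        self.db = sqlite3.connect(":memory:")
        self.db.execute(
            "CREATE TABLE pending (id INTEGER PRIMARY KEY, account TEXT, amount REAL)"
        )
        self.flush_every = flush_every
        # Stand-in for the mainframe batch feed (e.g. an MQ queue or nightly file).
        self.mainframe_batches = []

    def post_transaction(self, account, amount):
        # Callers get an ordinary transactional API; the mainframe is invisible here.
        with self.db:
            self.db.execute(
                "INSERT INTO pending (account, amount) VALUES (?, ?)", (account, amount)
            )
        if self._pending_count() >= self.flush_every:
            self.flush()

    def _pending_count(self):
        return self.db.execute("SELECT COUNT(*) FROM pending").fetchone()[0]

    def flush(self):
        # Micro-batch: drain pending rows into a single batch destined for the mainframe.
        rows = self.db.execute("SELECT account, amount FROM pending ORDER BY id").fetchall()
        if rows:
            self.mainframe_batches.append(rows)
            with self.db:
                self.db.execute("DELETE FROM pending")

facade = LedgerFacade(flush_every=2)
facade.post_transaction("ACC1", 100.0)
facade.post_transaction("ACC2", -40.0)  # hits the threshold, triggers a flush
facade.post_transaction("ACC3", 12.5)
facade.flush()                          # drain the remainder
print(len(facade.mainframe_batches))    # 2 batches sent to the "mainframe"
```

The point of the sketch is the incentive structure it implies: the new system delivers value quickly precisely because it leaves the mainframe in place as the system of record, which is exactly how the cruft accumulates.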
Outsourcing
Most if not all of the big banks, airlines etc. have outsourced considerable parts of their operations over the years. In doing so, institutional knowledge was shifted out of the business into those outsourcers. The outsourcers in turn have little incentive to drive transformation of the mainframe, given that a move to the cloud would see the revenue they derive from infrastructure management drop to near zero. The outsourcers don't even have to act in bad faith for this to be a major problem. McKinsey and the rest thrive on complexity, and by advising clients to outsource they have layered organizational and contractual complexity on top of the technological complexity, making the problem of transformation increasingly irreducible.
After risk, outsourcing is probably the most important factor, since it is extremely difficult to create outsourced structures that maintain and develop an organic link between those responsible for business processes and those responsible for technology. The result is an ever-growing pile of sclerotic processes, dysfunctional governance bodies and uni-functional teams (often themselves outsourced to different parties for competitive reasons) that purport to exert control but really just create complexity.
Outsourcing has served to worsen the organizational complexity that most mainframe users already suffered from. The result is a situation in which any programme of work to get off mainframe becomes fearsomely complex. I've worked in places which would have regular meetings of large parts of the company to try to coordinate major business process change in a single area. I've seen companies nearly break themselves trying to bring a single outsourced business function back in house. The question is why, when they're so incredibly inefficient and inflexible, they aren't competed away. That's a different question on which I have my own opinions, but this comment is too long already.
Knowledge
The loss of COBOL and other mainframe technology knowledge is real. I remember working at a bank in the EU around 2010 where I sat with a bunch of elderly gentlemen (walking sticks were a theme) who had been contracted back into the bank to develop integration between an ancient mainframe application and something modern the bank was building.
But that stereotype aside (there are surprising numbers of younger mainframe experts in India thanks to outsourcing), the problem is real, particularly when it comes to migration of software from mainframe to cloud using modern development practices. Any migration away from mainframe software requires understanding the whole technology stack and more importantly, how that stack interacts with the equally complex stack of business processes.
AI code interpretation and generation might take a COBOL program and translate it into modern code, or even help re-architect it using modern principles. But without that understanding of the business processes as well as the up and downstream dependencies in their many forms, anything other than piecemeal change looks terrifying to anyone who might try to move away from mainframe.
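One small, hypothetical illustration of why even a "correct" mechanical translation can be dangerous: COBOL money fields (e.g. PIC 9(7)V99) are fixed-point decimal, while a naive port to a modern language typically reaches for binary floats, silently changing the arithmetic that decades of downstream reconciliation processes depend on. The example below is simplified and assumes nothing about any real migration tool.

```python
from decimal import Decimal

# COBOL-style fixed-point decimal: summing ten 0.10 deposits is exactly 1.00,
# which is what a PIC 9(7)V99 field would hold.
cobol_style_total = sum(Decimal("0.10") for _ in range(10))

# A naive translation to binary floating point gives a subtly different answer.
naive_float_total = sum(0.10 for _ in range(10))

print(cobol_style_total)         # 1.00
print(naive_float_total)         # 0.9999999999999999
print(naive_float_total == 1.0)  # False
```

A one-penny discrepancy like this is invisible in a code review but fatal in a settlement run, which is why translation without knowledge of the surrounding business processes stays piecemeal.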
IBM
The fact is that mainframe is an effective technology stack. But more importantly, IBM has become extremely good at keeping it up to date while also owning the best ways of modernizing it.
They're good at making sure they control the path away from mainframe. The best, simplest and lowest-risk approaches to getting off legacy code on mainframe are either developed by or bought by IBM. By enabling Linux on mainframe and providing straightforward migration paths from legacy code to that platform, IBM (and its many partners) ensures that modernization of mainframe for the most part means staying on mainframe. This has gone through multiple phases and taken many forms over the years, but really, IBM has done a stupendous job of ensuring that the future of mainframe is usually mainframe.
The advent of AI code interpretation and generation is another example of this. IBM has already announced their own AI tooling to help customers make the migration to mainframe Linux faster and smoother: https://newsroom.ibm.com/2023-08-22-IBM-Unveils-watsonx-Gene....
The challenge for any AI startup or professional services company wanting to help customers move away from mainframe is that the people best placed to sell those tools are... IBM and its partners.
Might the situation change?
AI code interpretation and generation is getting better all the time. LLM context sizes are growing rapidly. The possibility of fine-tuning a code-generation model on a business's own source code is there. It's even possible that businesses that no longer have source code can use AI to analyze and decompose binaries. The days when AI can analyze a whole software infrastructure, re-architect it and rewrite it whole cloth are coming. But even with those tools, the organizational layering, process cruft and generalized loss of institutional knowledge are going to make elimination of mainframe a long-term, high-risk project.
This is not to say that it won't happen. But technology change can only ever happen successfully at the rate an organization is able to change along with it. The organizations which still use mainframe tend to be the biggest, most complex and sclerotic organizations on the planet. IBM is going to be enjoying the benefits of what it built decades ago for decades to come.