It has been a while since mainframes were a hot topic, but they’re finding new life in the AI era, moving beyond merely transactional platforms to ones that can extract valuable insights, reduce IT downtime, and keep data secure.
Mainframe computers continue to perform some of an enterprise’s most essential tasks, from processing millions of customer transactions in seconds to supporting massive workloads without missing a beat. That same processing power and ability to handle massive volumes of data efficiently also make them well suited to complex AI calculations.
Stopping fraud in its tracks
In the digital age, banks, retailers, and other organizations are increasingly vulnerable to a plethora of fraud schemes that can cause significant financial losses and reputational damage.
Just last November, DBS, Singapore’s largest bank, and Citibank suffered outages that affected 2.5 million payment and ATM transactions and blocked up to 810,000 attempts to access the two banks’ digital banking platforms. While the outages were caused by a technical fault in the cooling system, these incidents highlight just how damaging downtime can be for such crucial institutions.
But by running AI models directly on the mainframe, where transactions execute and transactional data resides, the time it takes for data to be processed and made available is minimized. This means patterns can be analyzed in real time, so potentially fraudulent activities can be quickly identified and flagged for further investigation.
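To make the idea concrete, here is a minimal sketch of scoring a transaction inside the authorization path, before the payment completes. The model, thresholds, and field names are all hypothetical illustrations, not DXC’s or IBM’s actual implementation.

```python
# Hypothetical sketch: score a transaction against the account's profile
# in the authorization path. A toy rule-based score stands in for an AI
# model; all field names and thresholds are illustrative.

from dataclasses import dataclass

@dataclass
class Transaction:
    account_id: str
    amount: float
    merchant_category: str
    country: str

def fraud_score(txn: Transaction, profile: dict) -> float:
    """Toy scoring function standing in for an on-platform AI model."""
    score = 0.0
    if txn.amount > profile.get("avg_amount", 0.0) * 10:
        score += 0.5  # amount far above the account's usual spend
    if txn.country not in profile.get("usual_countries", set()):
        score += 0.3  # unfamiliar geography for this account
    if txn.merchant_category in {"wire_transfer", "gift_cards"}:
        score += 0.2  # higher-risk merchant categories
    return min(score, 1.0)

def authorize(txn: Transaction, profile: dict, threshold: float = 0.7):
    """Flag the transaction for review or approve it, in real time."""
    score = fraud_score(txn, profile)
    if score >= threshold:
        return ("flagged", score)  # hold for investigation before settling
    return ("approved", score)
```

Because the scoring runs where the transaction already is, no data has to be copied off-platform before a decision is made, which is what keeps the check in the real-time path.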
Other activities, such as automated credit decision-making and loan modification, can also run directly on the mainframe platform, making AI insights quicker and easier to turn into business decisions.
Essential institutions such as banks have leveraged mainframes to perform AI fraud detection on 100 percent of their transactions. One example is retrofitting the existing payment authorization component of DXC’s Hogan banking system with IBM’s Telum chip to monitor suspicious activity, enabling swift action to prevent losses. Such adoption can yield savings of $120 million a year for an average tier-1 bank.
Embedding AI to address talent shortages & older languages
In today’s fast-paced digital landscape, the complexity of applications paired with a growing talent shortage can leave organizations unprepared for rapid spikes in demand or unable to quickly deliver the innovations their customers care about and need.
Due to their monolithic and complex design, mainframe applications are often hard for developers to understand and maintain.
However, by embedding AI directly into the mainframe development process, organizations can tackle service issues as they happen and allocate resources where they’re needed the most.
This includes helping developers understand existing code bases, automating the restructuring of code, and verifying the accuracy of translated code: when code is converted from one programming language to another, the translated version must perform the same functions and produce the same results as the original, with no errors or unintended changes in logic or behavior.
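One common way to check that kind of equivalence is to run the original and translated routines on the same inputs and compare results. The sketch below illustrates the idea with two hypothetical stand-in functions; the names, the interest calculation, and the test cases are invented for illustration.

```python
# Hypothetical sketch of functional-equivalence testing for translated
# code: both versions must agree on every test case. The routines below
# are illustrative stand-ins, not real COBOL or its translation.

def interest_original(principal: float, rate: float, years: int) -> float:
    """Stand-in for the behavior of the original (e.g., COBOL) routine."""
    return round(principal * (1 + rate) ** years, 2)

def interest_translated(principal: float, rate: float, years: int) -> float:
    """Stand-in for the AI-translated version of the same routine."""
    amount = principal
    for _ in range(years):
        amount *= 1 + rate
    return round(amount, 2)

def equivalent(cases) -> bool:
    """True only if both versions produce identical results on all cases."""
    return all(
        interest_original(*case) == interest_translated(*case)
        for case in cases
    )

cases = [(1000.0, 0.05, 10), (250.0, 0.03, 1), (0.0, 0.07, 5)]
```

In practice the test cases would be generated automatically and would include edge cases (zero amounts, boundary dates, maximum field widths), since that is where translation bugs tend to hide.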
Also, with a dwindling number of developers today able to understand COBOL and other older programming languages, new AI tools can assist with legacy code by enabling developers to automate testing processes, build applications faster, and improve overall system performance.
This frees up developers to focus on more complex, creative, and exploratory tasks.
An operations-minded approach for the future
Even small IT failures can cost millions and result in reputational damage.
But by integrating AI into their mainframe operations, organizations will soon be able to reduce the time it takes to recover from an outage and bring critical systems back online.
AI will also be able to predict when an event might occur, allowing for proactive measures to prevent or mitigate its impact on production.
In this vision of the future, the need for specialized expertise is diminished and a broader range of mainframe developers can be successfully deployed. The result is a more cost-effective approach as the labor pool expands and the pressure to maintain complex systems with a narrow skill set is alleviated.
The big picture
Mainframes have long been the silent giants of the technological landscape. And we expect them to continue to evolve by teaming up with AI to make it easier for organizations to extract deeper insights, reduce project complexity, streamline operations, and keep fraud at bay.
Chris Drumgoole is managing director for Global Infrastructure Services at DXC, where he oversees a comprehensive portfolio that includes Modern Workplace, Cloud and Infrastructure, and Security services.
TNGlobal INSIDER publishes contributions relevant to entrepreneurship and innovation. You may submit your own original or published contributions subject to editorial discretion.
Featured image: Freepik