Outside the context of mainframes, modernization is generally considered a good thing: it brings something that much closer to the future. Inside the context of mainframes, you have two warring factions. One declares that the mainframe, thanks to its impeccable engineering and groundbreaking hardware innovations, is the most modern platform in the world; the other claims that clinging to the roots of what I will refer to as “traditional” concepts of computing is preventing the platform from being a true component of a “modern” IT ecosystem.
Businesses have largely rallied around the mainframe as a secure, authoritative source of data. The mainframe held its place, safely storing that critical information and humming along with what seemed like minimal care and feeding. That worked quite well until someone pointed at these mountains of accumulated data and said, “You know, there’s some really important stuff in there!” And in that gaseous cloud of AI, machine learning, and microservices, modernization was born.
As more consumer-accessible technologies made their way into the data center and became core components of businesses’ IT strategies, the mainframe suddenly looked more “different,” and accessing that important stuff became a top priority. But the journey to that access is littered with fear, uncertainty, and doubt, often sown by siloed technologists who lack a holistic understanding of multiple platforms. And what better way to explore this topic than with everyone’s (least?) favorite corporate icebreaker?
But before I tell you my two truths and a lie about mainframe modernization, it might be useful to explain what the phrase means to me, particularly because this topic lends itself well to convenient redefinition.