Is The Mainframe Dead? If Not, Who Uses it?
Mainframes first appeared in the early 1940s. The most popular vendors included IBM, Hitachi, and Amdahl. Some have recently written mainframes off as an obsolete technology with no real remaining use. Yet today, as in every decade since their inception, mainframe computers and the mainframe style of computing dominate the landscape of large-scale business computing. Mainframes play a central role in the daily operations of many of the world’s largest Fortune 1000 companies. Though other forms of computing are used extensively in various business capacities, the mainframe occupies a coveted place in today’s e-business environment. In banking, finance, health care, insurance, public utilities, government, and a host of other public and private enterprises, the mainframe computer continues to form the foundation of modern business.
The main difference between mainframes and supercomputers is their typical application domain. Mainframes excel at reliable, high-volume computing in domains dominated by integer operations (e.g., financial transactions, indexing, and comparisons). Supercomputers, by contrast, are designed to excel at floating-point operations: addition, subtraction, and multiplication carried out with enough digits of precision to model continuous phenomena such as weather. Despite continual change in IT, mainframe computers are considered by many to be the most stable, secure, and compatible of all computing platforms. The latest models can handle the most advanced and demanding customer workloads, yet continue to run applications written decades ago. Those who think there is no use for ‘big iron’ today may be surprised; the truth is that we are all mainframe users in one way or another.

Are mainframes a legacy technology? In a previous blog post that lumped mainframes in with other legacy technologies, including COBOL, DEC minicomputers, and Windows XP, we answered yes. And though we still believe that to be the case, the question of whether mainframes are a legacy technology is a complex one, with convincing arguments on both sides, and one that deserves closer consideration than we gave it in that original post.
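The integer-versus-floating-point distinction drawn above can be made concrete in a few lines of code. The sketch below is purely illustrative (not tied to any mainframe instruction set): binary floating point cannot represent most decimal amounts exactly, which is one reason financial systems favor integer arithmetic.

```python
# Illustrative only: why financial workloads favor integer arithmetic.
# Most decimal fractions (like 0.10) have no exact binary representation,
# so each floating-point operand carries a tiny rounding error.
price_f = 0.10 + 0.20   # binary floating point
print(price_f == 0.30)  # False: the rounding errors do not cancel

# The same amounts held as integer cents are exact.
price_c = 10 + 20
print(price_c == 30)    # True: integer arithmetic has no rounding
```

A supercomputer's floating-point units accept these small rounding errors in exchange for range and speed; a bank reconciling accounts cannot.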
“Legacy technology” is older technology used by businesses that is outdated in one way or another (usually to the extent that the company that created the technology no longer supports it, and people with the skills and experience to maintain the technology are tough to find), but that is hard or impossible to replace with newer technology because of the cost or difficulty of upgrading.
Mainframes are large, powerful, centralized, and highly reliable computers. They are mainly used by big organizations to quickly, continually, and securely process and store huge amounts of data. Some banks, for example, use mainframes to manage all of their transactions. Other types of organizations that are likely to have mainframes include insurance companies, retailers, credit card companies, universities, and government agencies.
“In fact, 96 of the world’s largest 100 banks, nine out of 10 of the world’s largest insurance companies, 23 of the 25 largest retailers in the United States, and 71 percent of the Fortune 500 use IBM System z mainframes. Currently, there are 10,000 mainframes actively being used around the world.”
Some additional points supporting the argument that the mainframe is not a legacy technology, which were overlooked in the aforementioned blog post, include:
In addition to supporting many legacy applications and systems, mainframes can be set up to run modern applications and systems as well, and can even serve as a platform for team collaboration tools.
When it comes to processing huge amounts of information, mainframes are both faster and more cost-efficient than distributed servers, even the most cutting-edge and perfectly configured ones.
Businesses still purchase $5.3 billion in new mainframes every year.
Beyond these points supporting the argument that the mainframe isn’t a legacy technology, however, we also overlooked a few points that make the opposite case: that the mainframe is an outdated product that will soon be obsolete.
The key argument for the mainframe being a legacy technology, in fact, has nothing to do with any limitations of the hardware itself; rather, it’s that the world is running out of people that know how to maintain and support mainframes.
This is because many of the people who have the training and experience to keep a mainframe running are retired, deceased, or soon will be. Meanwhile, many businesses would rather switch to traditional servers than spend time and money on training programs for a potentially obsolete technology, and IT workers are unlikely to train themselves in a technology with such limited job prospects.
It might not seem fair to label the mainframe a legacy technology when it is still one of the better options for businesses with extremely high data-processing requirements. That does not change the fact, however, that the number of people capable of maintaining and supporting a mainframe is dwindling, and that the only organizations and people who could reverse this trend have little incentive to do so.
All of which goes to show that it does not matter how useful a technology is if there is no one around who is capable of keeping it running.
One of the most striking elements of mainframe computing, when viewed over time, is the extent to which the architecture changes to accommodate user requirements. One of the early selling points of System/360 was its stand-alone emulation of second-generation systems. By the time System/370 came along, stand-alone emulation was replaced by integrated emulation, a critical user requirement. Hundreds of RPQs (Requests for Price Quotation, IBM’s mechanism for custom features) have been made available over the years to satisfy one user requirement or another. Some of these solutions were limited-time offerings; others became a permanent part of the architecture. One of my favorites from the former group was the High Accuracy Arithmetic Facility (HAAF) available on the IBM 4361. This mainframe, marketed as a supermini, was targeted at university math and physics departments. With the HAAF installed, one could perform floating-point arithmetic without carrying a characteristic (exponent) in the floating-point number. Moreover, all errors introduced by fraction (mantissa) shifting were eliminated. This facility permitted floating-point arithmetic to be analyzed for accuracy under a wide range of computational conditions, a stunning capability for the math and physics users.
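The HAAF itself is long gone, but the class of error it eliminated, bits lost from the fraction when exponents must be aligned before an addition, is easy to demonstrate today. The sketch below is a modern analogue (not IBM code): it compares ordinary binary floating point against exact rational arithmetic, which, like the HAAF's approach, is immune to shifting error, so the difference between the two results is precisely the accumulated rounding error.

```python
# Modern analogue of a HAAF-style accuracy analysis: sum the same values
# in ordinary binary floating point and in exact rational arithmetic,
# then compare. Any difference is rounding error from mantissa shifting.
from fractions import Fraction

def float_sum(values):
    """Sum as binary floats; mantissa bits can be shifted off and lost."""
    total = 0.0
    for v in values:
        total += float(v)
    return total

def exact_sum(values):
    """Sum as exact rationals; no shifting, no rounding error."""
    total = Fraction(0)
    for v in values:
        total += Fraction(v)  # a decimal string converts exactly
    return total

values = ["0.1"] * 10     # 0.1 has no exact binary representation
fp = float_sum(values)
exact = exact_sum(values)

print(fp)                 # slightly different from 1.0
print(float(exact))       # exactly 1.0
print(float(Fraction(fp) - exact))  # the accumulated rounding error
```

This is the kind of side-by-side accuracy study the HAAF made possible in hardware for the 4361's academic users.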