Mainframes Still Command Respect

    Most references to the mainframe in modern enterprise infrastructure marvel that such an aging technology is still chugging along in today’s frenetic data environment. But the truth is that the mainframe remains the powerhouse of the large data center, with some estimates claiming that it accounts for as much as 80 percent of corporate transactional data.

    Clearly, says Brian Pereira of Digital Creed, today’s mainframes are quite different from the mouseless, tape-spinning behemoths of decades past, and the software that runs them is as sophisticated as anything you’ll find in a virtualized, distributed server architecture. But that’s simply more proof of the mainframe’s enduring appeal: Software companies like BMC continue to write code for big iron even though it now accounts for only about a third of their revenues, compared with 90 percent or more in the mainframe’s heyday.

    Another sign of the mainframe’s longevity is its ability to adapt to emerging technologies like containers. GeekWire reports that Docker recently rolled out a new version of its Enterprise Edition software that allows organizations running mainframe platforms like IBM z Systems to containerize their apps without having to alter source code. In this way, large organizations will be able to break critical applications down to their component parts for improved management and security, just as they can with their Windows and Linux apps running in the cloud. The new version also offers improved administration capabilities that build greater isolation between projects running on the same Docker cluster.
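    As a rough illustration (the base image and binary names here are assumptions for the sketch, not details from the Docker announcement), containerizing an existing Linux-on-Z application follows the same Dockerfile pattern used on x86 servers; only the target architecture of the base image changes, and the application binary itself is packaged as-is:

    ```dockerfile
    # Hypothetical sketch: wrapping a pre-built s390x (Linux on IBM Z) binary
    # in a container without modifying its source code.
    # "s390x/ubuntu" is an architecture-specific Docker Hub image;
    # "legacy-app" is an illustrative placeholder name.
    FROM s390x/ubuntu:16.04

    # Copy the existing, unmodified application binary into the image
    COPY ./legacy-app /usr/local/bin/legacy-app

    ENTRYPOINT ["/usr/local/bin/legacy-app"]
    ```

    The point of the approach is that the application is not recompiled or rewritten; the container simply packages the binary and its dependencies so it can be managed, isolated, and deployed like any other containerized workload.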

    Fujitsu is also out with a new version of its BS2000 operating system to provide higher levels of digitization and interoperability with open systems. The BS2000 OSD/BC Version 11.0 features live migration between servers to reduce downtime during maintenance, as well as extended storage integration for text-based files in support of seamless data exchange over shared NAS storage. The company has also added new encryption capabilities and diagnostic functions to boost long-term availability.

    As with virtually all hardware these days, however, the mainframe is not immune to new service-based architectures that provide better cost and utilization models than fully owned infrastructure. A new report from Database Trends and Applications highlights the growing prevalence of Mainframe Data as a Service offerings (registration required) that are helping push the technology past its traditional role as a system of record toward greater engagement with emerging open source systems of engagement. Data platforms like Hadoop and Spark are emerging as key assets for mission-critical applications, but in many cases they are cut off from the data housed in the mainframe. By freeing up the mainframe through service-layer virtualization, the enterprise can push all of its data into analytics and deep learning infrastructure, greatly improving its agility and cost structures for next-gen IoT and other applications.

    While the mainframe differs in many ways from today’s distributed architectures, at a basic level the two are roughly the same: numerous core processors connected to each other and to storage via high-speed networking. In fact, as the enterprise reverts to converged and hyperconverged resources, it is rediscovering the same all-in-one approach to IT that the mainframe first championed. The only major difference is that converged systems can scale more easily with the addition of compatible compute modules.

    Apparently, “what goes around, comes around” applies to technology as well as other facets of life. With the mainframe still going strong at large organizations, data infrastructure will maintain the diversity needed to meet the highly customized requirements of a constantly evolving digital economy.

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.
