Two salient facts are emerging about enterprise infrastructure as the second decade of the 21st century unfolds. One is that cloud computing will usher in new levels of dynamism and flexibility, both driving performance and lowering costs. The other is that mainframes will continue to support significant portions of the data environment at most large organizations.
While these two architectures are not mutually exclusive, their coexistence does pose a conundrum of sorts: How will mainframe users bridge the gap between aging big iron platforms and the fast-paced culture of the cloud?
Although predictions of the mainframe's imminent demise are as old as, well, the rack-based server, it is clear now that top-tier organizations' commitment to the technology is solid. BMC Software recently issued a survey indicating that 93 percent of current users expect mainframe capacity to hold steady or even grow over the next few years. Most of those users plan, at least, to continually tweak their mainframe platforms to accommodate both cloud and mobile computing as those two technologies continue to reshape the overall enterprise landscape.
This is partly why many organizations are pursuing a private cloud strategy. Legacy infrastructure is not something to be cast aside lightly, and the fact is that big iron will be called on to handle the Big Data that is swamping the enterprise. Besides, as Datamation's Jeff Vance points out, private clouds are not the slam dunk that many people believe them to be. Virtualization has gone a long way toward breaking down the data silos that evolved in many physical environments, but silos can still arise in virtual ones. As well, issues surrounding service policies, operating system compatibility and data management are more pronounced in mainframe-sized environments, all of which will certainly slow the implementation of private clouds.
Hopefully, though, the process won't take too long, because the people skilled at running today's mainframes won't be around forever, and younger techies show little interest in "yesterday's technology," according to Compuware's Neil Richards. This comes at a time when mainframe application complexity is increasing as organizations adopt customer-facing tools to accommodate functions like online banking and transaction processing. One solution, he says, is to add modern GUIs to the standard ISPF interface, both to increase interactivity and to attract new generations of developers to the platform.
Meanwhile, large database firms like Oracle are actively seeking to shift workloads onto the cloud. The company recently updated its Tuxedo transaction processing server to make it easier to port applications to the Oracle Exalogic platform, a move it claims can cut operational costs by 80 percent, in part by integrating management under Oracle Enterprise Manager 12c and expanding template-based provisioning for improved scalability. Next month, look for a new Tuxedo ART platform that emulates IBM's Customer Information Control System (CICS) and Information Management System (IMS).
Clearly, large enterprises are approaching a crossroads: Do you continue to rely on legacy environments even as costs mount and expertise heads elsewhere, or do you throw in your lot with the more nimble, but riskier, technology of the future?
For most, the answer will be decided by simple economics. However, the longer you delay that day of reckoning, the tougher it will be to keep up with a rapidly changing data universe.