Most people, when asked whether they favor or oppose disruption, will say they oppose it. Disruption is scary, produces a lot of unknowns and generally requires a great deal of work as new processes and skill sets take hold in the workplace.
In reality, however, these attitudes depend largely on whether you are the disruptor or the disruptee. For those who are ready to embrace change, disruption is cathartic in that it sheds old problems and ushers in new opportunities.
When we’re talking about disruption in the data center, the ideal is to implement disruptive technologies in a non-disruptive way; that is, to welcome new technologies and new ways of doing things without completely severing ties to legacy systems until you are ready. Part of the challenge is discerning good disruption from bad, says tech analyst Dan Kuznetsky; unfortunately, the IT industry is rife with systems and platforms that demand a great deal of rip-and-replace but fail to adequately replace all that has been ripped. Technologies like the mainframe have enjoyed such long shelf lives because of the value they bring to the enterprise, so the first criterion for any replacement is that it provide equal or superior value to those who rely on the legacy system.
At the moment, however, the tech industry is not simply promising new technology but a completely new means of digital interaction, and it will be hard to implement things like software-defined networks, Big Data infrastructure and the Internet of Things without significant disruption. As IDC analyst Pushkaraksh Shanbhag noted to New Zealand Reseller News recently, a revamped infrastructure is crucial to the agility and responsiveness required of 21st-century processes and applications, so organizations that cling to aging, static infrastructure too long will find themselves at a significant competitive disadvantage in relatively short order.
This is why it’s important to understand the disruptive trends and technologies that are targeting the enterprise, says Network World’s Michael Cooney, if only to be better prepared when they arrive at your doorstep. According to Gartner, some of the most far-reaching are open source hardware, workload monitoring, the rise of diverse client-side solutions and decoupled, integrative applications and services environments. By 2020, half of all enterprises will have a designated digital risk officer tasked with assessing the impact of these and other developments and how best to minimize the threats they pose.
Still, it is important to understand that data center disruption is in fact grounded in Moore’s Law, says Intel Data Center Group Senior VP Diane Bryant, meaning that enterprises will have to upgrade their technology architectures the way they upgraded their server and desktop chips in the past. This is a much trickier proposition, of course, since the data center is the very heart of the technology revolution affecting billions of people around the world. But it is also a necessary process, given the astounding new capabilities currently under development, particularly in fields like health care and telecommunications.
This all goes back to the nature of any given disruption and your relation to the thing being disrupted. Change is most certainly difficult, and for all the talk about getting outside one’s “comfort zone,” the fact is that most of us do not like our comfort being messed with. But even a crumbling house can be comfortable, until the roof collapses.
Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.