Planning for the Data Center Future

    The future of the data center is quickly evolving into the question of the day as changes to technology, business processes and the economy itself spur the reconsideration of long-held design precepts up and down the data stack.

    Existential angst over the data center echoes the questions philosophers have pondered for millennia – “Who am I? Where am I going? What does it all mean?” – but today, plans for even the near-term future have direct consequences for decisions made in the here and now. So amid the mad rush to get on the cloud, deploy Big Data and remake everything the IT department holds dear, it is worth stopping to consider where we want to be in a few years.

    According to Rakesh Kumar Singh, lead technologist for data center technologies at Juniper, the future data center will focus heavily on client-facing and analytical workloads, with the overarching goal of maintaining, and even extending, a competitive edge in an increasingly cut-throat economy. The best way to approach this is to upend the age-old practice of constantly seeking out and deploying the latest and greatest technologies, and instead to start from business priorities and work out the infrastructure from there. As IDC noted in its most recent FutureScape study, by 2018 half of all infrastructure investment will foster greater engagement, insight and action rather than systems maintenance, while 45 percent of the installed base will employ automation, and even autonomy, to improve performance, lower costs, and provide the agility and scalability needed to remain relevant in the coming years.

    This smarter, software-defined data center is already taking shape, and it will be as different from today’s data center as the mobile phone is from the rotary-dial clunkers of old, says vXchange’s John Hawkins. And since software is more fungible and flexible than hardware, you can expect less reliance on highly centralized facilities in favor of more edge processing. According to a recent vXchange survey, more than 80 percent of IT decision-makers now recognize the importance of pushing data closer to customers, while nearly 90 percent are already starting to do this through cloud-based storage and networking.

    Odd as it may seem, the future of the data center may very well be taking shape in the federal government which, save for parts of the military and homeland security establishment, has not been known for its embrace of cutting-edge technology. But as Information Week noted recently, Uncle Sam is well under way with a cloud-first program that has already shut down more than 3,000 data centers used by a variety of agencies, with targets to trim another 2,000 by the end of the decade. That represents nearly half the total of 2010, shaving upwards of $8 billion off of IT operating costs.

    Of course, the future never really arrives when it comes to technology, as something better is always on the horizon. So rather than pick an arbitrary date and try to predict what it will look like, perhaps it’s better to examine the journey ahead to determine what is and is not an appropriate course of action today. For tech journalist John Edwards, the transition to the hybrid cloud should be Job 1 for organizations that hope to remain relevant in the years to come. But rather than letting IT dictate the terms of this change as in the past, hybrid development should be a true collaborative process, bringing in views from line-of-business stakeholders, the executive office and even partners and customers. Again, this harkens back to the idea of first establishing what you hope to accomplish and then tailoring infrastructure in support of those goals. As long as processes are adequately supported, issues like capex and opex should fall into place.

    If anything about the future can be determined today, it’s that the only constant will be change. With virtual and software-defined infrastructure becoming the norm, enterprises should have a much easier time keeping up with the Joneses than in the old days of static hardware.

    But the rules of success are changing as well. Before long, it won’t be the organization that has the fastest throughput, the most storage or the highest level of processing that wins in the end. It will be the one that can leverage available technologies and services to carve out a more effective business model.

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.

