The Data Center in the New Digital Economy

    The cloud is a powerful new weapon in the IT arsenal, but it is pretty clear at this point that most enterprises will continue to invest in local data infrastructure for the foreseeable future.

    Maintaining local infrastructure is in fact the more difficult strategy because it forces the enterprise into a series of tough decisions regarding infrastructure, architecture and technology at a time when the use case for the data center itself is undergoing dramatic change (more on that toward the end of the post).

    To date, only a handful of organizations have opted for a fully cloud-based data environment, the largest and best-known being Netflix. The company is prepared to power down the last of its data centers any day now, according to the Wall Street Journal, placing its entire footprint – from video streaming and customer service to back-office support – on the public cloud. Netflix actually began migrating key systems to Amazon back in 2008 following a serious hardware failure in its internal infrastructure, although its content delivery network (CDN), which competes with a similar service from Amazon, is farmed out to various smaller providers.

    Organizations that aren’t willing to trust the cloud to this extent are by no means at a disadvantage to all-cloud competitors, as long as they put the time, effort and money into streamlining and optimizing local infrastructure. As I mentioned earlier this week, part of this transition will involve system consolidation and modularization, which in fact will be driven by the same hyperscale technology that is populating the data centers of the top webscale companies. Already, many traditional IT vendors are touting hyperscale platforms with the idea that organizations looking to maintain control of their data infrastructure will still need to keep it as lean and mean as possible.

    One of the biggest changes is likely to be the end of the traditional SAN in favor of the more modular server SAN, says the Register’s Chris Mellor. Recent research from Wikibon suggests that the use of server SANs consisting of direct-attached flash memory or even DRAM will be nearly universal in 10 years, with the sole holdout being long-term storage and archiving. Server SANs are not only less costly and easier to manage than complex traditional SANs, they also provide a much more flexible means of supporting the kind of pooled storage environments that modern virtual platforms require. In addition, the open platforms now hitting the channel do away with the proprietary code and services that hamper data connectivity in existing storage environments.

    And this brings us back to the changing use cases for the local data center. As Moor Insights & Strategy analyst Gina Longoria noted recently, Big Data and the Internet of Things are changing the rules when it comes to the way infrastructure should support the business model. Simple storage and processing are taking a back seat to analytics and massive scalability. This is the only way to turn the data coming in from legions of connected devices (the things) into actionable intelligence, so any organization that fails to transition its infrastructure to this purpose will not only pay dearly to maintain that infrastructure but will ultimately fail to wring maximum productivity from the investment.

    So while it may be too early to write the epitaph for the enterprise data center, it is clear that it won’t remain in its present form for much longer. Like a person who develops an expanding waistline during middle age, the data center is due for a crash diet in order to get back into fighting trim.

    There is bound to be some pain during this transition, but if handled properly, the results should speak for themselves.

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata, Carpathia and NetMagic.
