Keep Upgrading Your Data Center No Matter Where the Cloud Takes You

Arthur Cole

Improving the data center to keep up with advancing technologies has long been a chief responsibility of CIOs. These days, however, the job has taken on a new twist as new questions arise: Is the data center the best platform to boost enterprise productivity? Do we need a data center at all anymore?

Most large organizations seem to be solidly in the owned-and-operated camp when it comes to the data center, but the farther into the SMB space we go, the more that certainty starts to waver. Clearly, the reliance on traditional physical-layer infrastructure is under serious assault across the board. According to MarketsandMarkets, spending on software-defined data center (SDDC) technology will jump from about $400 million today to $5.41 billion by 2018, reflecting the enterprise’s desire not only to improve operational capabilities but to integrate in-house infrastructure with the broader cloud ecosystem.

But as “software defined” becomes the order of the day, the question becomes whether to continue to pour money into your own systems and infrastructure or to off-load that responsibility to someone else and concentrate on core business activities.


For the moment, at least, both of these strategies can be implemented through improvements to internal data center infrastructure. The more cloud-like the data center becomes, the easier it will be to integrate into external cloud architectures, enabling a seamless transition if the day should come when maintaining your own IT is no longer viable. Top vendors like EMC are already moving their platforms in this direction. The company’s new ViPR storage systems and even the VNX flash storage array are designed from the ground up to support advanced cloud architectures that fit comfortably within the enterprise data center or the cloud hosting facility. The strategy is coming together under Project Nile, which is billed as an elastic cloud platform that combines customizable performance features suitable for a wide range of file, block and object storage architectures.

Most efforts to upgrade the data center for the cloud focus on core infrastructure, but as data environments become more distributed, the edge will need increased attention as well. As Taneja Group’s Mike Matchett notes, edge devices will need to do more than simply shuttle data to and from remote offices; they will also need to provide for full extension of data center infrastructure over the wide area. Such capability will become increasingly crucial as organizations adopt virtual desktop platforms that need to be available not just to remote PCs but to the legions of mobile devices that are flooding the enterprise. If organizations hope to foster a truly productive, distributed environment, the last thing they need is artificial tiers of user performance that give an edge to those who remain tethered to their desks.

Envisioning the future is one thing, but getting there is quite another. The path to success will undoubtedly hold many pitfalls, but at least the broad outlines of the future flexible data center are coming into view. Brocade’s Jason Nolet says the foundational elements include a high-speed physical network capable of supporting virtual, fabric-based architectures, supplemented by software-based controllers and a high degree of orchestration and automation that ideally conform to widely implemented interoperability standards like OpenStack and CloudStack. In this way, enterprises will be able to fulfill virtually any user requirement either through home-grown data infrastructure or the myriad services that can be cherry-picked from the cloud.

Ultimately, then, the choice between an all-cloud operation and an internal data center will not be as stark as it seems today. In fact, many organizations will likely pursue hybrid architectures well into the future. If done right, there is no reason to expect the external cloud to be any more functional or any less expensive to build and maintain than the internal one. Indeed, each will likely prove superior for select applications and user environments.

By maintaining robust, up-to-date cloud capabilities both inside and outside the data center, enterprises should be fully capable of handling just about anything that comes along.



Nov 6, 2013 7:49 AM Duane Tursi says:
Seems to me like the pendulum may have swung a bit too far toward the public cloud and that in time businesses will land on the hybrid cloud architecture as the standard. At an absolute minimum, owning and controlling at least one copy of your information seems like the responsible thing to do. Over time, businesses will get savvy at determining where applications should be delivered from, locally or via a public cloud, and this will likely be determined in large part by whether these applications are 'core' to the business or 'contextual' to the business.
