The prevailing attitude that enterprise executives express toward the cloud is that while much can be gained from the broad scalability and low-cost operation of public services, the risk to critical applications and data is still too great. The private cloud is seen as a much better solution because it keeps the important stuff tucked safely behind the firewall.
But if that is the case, why has development of private cloud infrastructure moved so slowly? After all, the basic technology has been around for nearly half a decade, and the underlying virtual layer hit the industry shortly after the turn of the millennium. So why haven’t we seen a more robust deployment of private cloud architectures?
According to InfoWorld’s Eric Knorr, we seem to be at a point at which optimism is starting to bump up against reality: to fulfill the promise of all things cloud, you first have to suffer through the messy conversion process and all the disruption it is likely to bring. For one thing, scale-out infrastructure is only as good as the underlying hardware will allow, and most legacy systems are not designed for massive scale. Moreover, a key portion of that infrastructure, networking, has only recently been virtualized through SDN, and the enterprise has barely taken its first tentative steps toward that conversion. And once the hardware is sorted out, the small matter of data migration must still be dealt with.
It’s for this reason that many organizations are looking to deploy private clouds in greenfield environments, most commonly on converged, commodity platforms. Another alternative, however, is the hosted private cloud, says ZDNet’s Heather Clancy. New solutions like Dimension Data’s Private Compute as a Service (CaaS) offering are built on Windows Server 2012 R2 and the Hyper-V platform, tapping into the Azure cloud to support both Windows and Linux workloads. In this way, the enterprise gains a ready-made, fully scalable solution for common and mission-critical applications on dedicated, not shared, infrastructure, plus a streamlined migration path for legacy enterprise workloads.
Of course, without a strong private cloud the enterprise will gain only minimal benefit from public services. As Gartner’s Milind Govekar noted recently, it is the hybrid cloud that will ultimately provide the balance between operational efficiency and customized, secure service that the enterprise is looking for. Whether the issue is cost, availability, scalability, security or nearly any other metric that data executives use to measure success, the hybrid cloud meets or exceeds all other solutions. But you can’t get there unless you have a robust private cloud firmly in place.
When it comes to calculating ROI in the private cloud, however, don’t look for any easy answers, says Datamation’s James Maguire. Not only are the goals different for every organization, but the variables that determine success or failure are as diverse as the infrastructure and architectures that support legacy environments. At best, you can come up with some ballpark figures using tools like Datamation’s own Private Cloud ROI Calculator, but even then, actual results will depend on numerous factors: the attributes of the legacy environment, the types of workload, and the level of virtualization already in place.
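To see why even ballpark figures are slippery, consider a crude back-of-the-envelope calculation. The sketch below is purely illustrative and is not Datamation’s actual calculator; every parameter name and dollar figure is an assumption, and a real estimate would also have to weigh workload types, migration risk, and the level of virtualization already in place.

```python
def private_cloud_roi(legacy_annual_cost: float,
                      cloud_annual_cost: float,
                      migration_cost: float,
                      years: int) -> float:
    """Naive ROI: net savings over the horizon, divided by the
    one-time migration investment. Ignores risk, downtime, and
    workload-specific costs -- the factors that make real
    estimates so variable."""
    annual_savings = legacy_annual_cost - cloud_annual_cost
    net_gain = annual_savings * years - migration_cost
    return net_gain / migration_cost

# Illustrative figures only: $500K/yr legacy spend, $350K/yr
# projected private cloud spend, $300K migration, 3-year horizon.
roi = private_cloud_roi(500_000, 350_000, 300_000, 3)
print(f"{roi:.0%}")  # 50% over three years
```

Shift any one input, say, a migration that runs over budget or workloads that virtualize poorly, and the result swings dramatically, which is exactly the point Maguire makes.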
The decision to deploy cloud infrastructure is not so cut and dried as it would seem, then. When presented with all the shiny new opportunities that the cloud represents, the direction seems clear. But along the way, numerous crossroads appear that work to break down that clarity, leaving enterprise executives with the uncomfortable realization that there is not one cloud, but many.
Choosing the right solution, of course, will depend very much on what you hope to do with it, and what kind of data environment offers the greatest potential for the success of your business processes.