It seems that when it comes to the enterprise and the cloud, it’s all over but the shouting. Organizations around the world have integrated cloud infrastructure into their overall data footprints in a major way, and at this point there is no chance of undoing it all.
But this doesn’t mean the cloud is putting all data operations on easy street. Indeed, just as local data infrastructure did in the past, the cloud will fuel its own endless cycle of upgrades and revisions as users come to demand new levels of performance and flexibility at every turn.
As eWeek reported last month, the cloud currently supports about 28 percent of the worldwide compute and storage load, and that portion is expected to increase to 58 percent over the next decade. And perhaps most telling, a good 83 percent of respondents to a recent Tata Communications survey said they have received benefits from cloud computing that they did not expect at the outset, including higher productivity and faster data access.
But whether cloud infrastructure spans public, private or hybrid deployments, challenges remain. While much of the focus in recent years has naturally centered on security, the next major hurdle appears to be compliance. As Happiest Minds Technologies’ Priya Kanduri noted on ITProPortal recently, compliance could be an even tougher nut to crack because it involves government regulations and the transnational exchange of data – problems that cannot be solved by simply deploying a new layer of software. Many cloud providers are building local nodes and rolling out compliance services, but the onus will remain on the enterprise to ensure it is not running afoul of the law, and that can only add caution and hesitancy to the adoption of new cloud infrastructure.
Yet another issue is provisioning the right resources for any given workload, says Cloud Technology Partners’ David Linthicum. In most cases, users are flying blind when it comes to sizing machine instances for their projects, typically estimating what they would have provisioned on legacy infrastructure. That approach is eminently logical, but it is not always accurate, because neither the server types nor the applications themselves behave the same way on public cloud resources. Many providers do offer autoprovisioning and autoscaling of virtual instances, but what is really needed is autosizing: fine-grained coordination between workloads and resource consumption.
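To make the autoscaling/autosizing distinction concrete, here is a minimal sketch of the kind of logic an autosizing service might apply: rather than adding or removing copies of a fixed instance, it selects the smallest instance type that covers a workload's observed peak usage plus some headroom. The instance names, specs and prices below are purely illustrative, not any provider's real catalog.

```python
# Hypothetical autosizing sketch: pick the cheapest instance type whose
# resources cover observed peak usage plus a safety margin.
# Catalog entries are invented for illustration.

INSTANCE_TYPES = [
    # (name, vCPUs, memory_gb, hourly_cost)
    ("small",   2,  4, 0.05),
    ("medium",  4,  8, 0.10),
    ("large",   8, 16, 0.20),
    ("xlarge", 16, 32, 0.40),
]

def autosize(peak_vcpus, peak_memory_gb, headroom=1.2):
    """Return the cheapest instance covering peak usage plus headroom."""
    need_cpu = peak_vcpus * headroom
    need_mem = peak_memory_gb * headroom
    candidates = [t for t in INSTANCE_TYPES
                  if t[1] >= need_cpu and t[2] >= need_mem]
    if not candidates:
        raise ValueError("no instance type large enough for this workload")
    return min(candidates, key=lambda t: t[3])

# A workload peaking at 3 vCPUs and 6 GB needs more than "small"
# once headroom is applied, so "medium" is the right fit.
print(autosize(3, 6))  # ("medium", 4, 8, 0.1)
```

The point of the sketch is that sizing is driven by measured consumption rather than by a guess carried over from legacy hardware; in practice the measurement side (collecting reliable peak figures per workload) is the hard part.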
Even applications that are native to the cloud may not be the best fit for the enterprise. According to a recent study by CloudLock, more than a quarter of enterprise cloud apps are deemed high-risk. This is a subjective assessment, of course, and CloudLock makes its living by helping organizations reduce their cloud vulnerabilities. Still, with the average enterprise exposing data to more than 700 cloud applications, the potential for trouble is high even if the high-risk pool were halved. And perhaps most significantly, more than half of third-party apps would be banned outright if someone tried to deploy them on local infrastructure.
Few people expected the cloud to provide a digital nirvana, but the fact remains that cloud infrastructure is, and will remain, a work in progress. While the cloud does remove the burden of infrastructure management from the enterprise’s shoulders, responsibility for the data itself remains squarely with the enterprise.
Whether it is the cloud, automation, software-defined X or any other technological advancement, IT’s role in supporting the data environment never diminishes – it only changes in form.
Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.