Flexibility Is Key to Handling Big Data

Arthur Cole

Whether it's virtualization, the cloud or nearly any other technology of the past few years, the overriding goal of most deployments has been to boost the data center's ability to handle Big Data.


But while the knee-jerk reaction to increased data loads might be to simply increase the size and scope of today's platforms, there is also a growing consensus that future infrastructure must be smarter as well as larger.


Storage, in particular, needs a lot of work to handle the burdens of Big Data. Most storage area networks (SANs), for instance, are not well suited to the unstructured data that makes up most Big Data sets. Scale-out NAS is far more scalable, according to EMC's Nick Kirsch, and delivers a higher degree of availability because it grows by adding nodes rather than by simply enlarging existing ones.
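
The scale-up/scale-out distinction is easy to see in miniature. The toy Python model below is a sketch of the trade-off only; the class names and the numbers are illustrative assumptions, not any vendor's architecture. Growing one controller adds capacity alone, while adding nodes grows capacity, aggregate throughput and the number of failure domains together.

class Node:
    """One storage node; the figures are assumed, not vendor specs."""
    def __init__(self, capacity_tb=100, throughput_gbps=10):
        self.capacity_tb = capacity_tb
        self.throughput_gbps = throughput_gbps

def scale_up(node, extra_tb):
    # Grow a single controller: capacity rises, but throughput and
    # availability stay pinned to that one box.
    node.capacity_tb += extra_tb

def scale_out(cluster, extra_nodes):
    # Add whole nodes: capacity, aggregate bandwidth and the number
    # of independent failure domains all rise together.
    cluster.extend(Node() for _ in range(extra_nodes))

cluster = [Node() for _ in range(4)]
scale_up(cluster[0], 200)   # one bigger box: +200 TB, same bandwidth
scale_out(cluster, 2)       # two more boxes: +200 TB and +20 Gbps
print(sum(n.capacity_tb for n in cluster))      # 800 TB total
print(sum(n.throughput_gbps for n in cluster))  # 60 Gbps aggregate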


Whether storage is scale up or scale out, however, it will forever lag behind the rest of the data infrastructure unless it can derive the same benefits from virtualization as servers and networking, according to DataCore. To do that, you need an actual storage hypervisor: one that can more closely match the resource flexibility, automation and management capabilities that are driving productivity elsewhere. It's also the best way to aggregate storage silos that are currently separated by platform, interface and other proprietary constraints.
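
In outline, the pooling idea behind a storage hypervisor looks something like the Python sketch below. The classes and the first-fit placement are hypothetical simplifications for illustration, not DataCore's actual product logic: heterogeneous silos sit behind one virtual pool, and callers get volumes without knowing which physical platform backs them.

class Backend:
    """A physical silo: a SAN LUN, a NAS share, local disk, etc."""
    def __init__(self, name, kind, free_gb):
        self.name, self.kind, self.free_gb = name, kind, free_gb

class VirtualPool:
    """Aggregates dissimilar backends behind a single interface."""
    def __init__(self, backends):
        self.backends = backends

    def provision(self, size_gb):
        # First-fit placement keeps the sketch short; a real storage
        # hypervisor layers tiering, replication and live migration
        # policies on top of the same abstraction.
        for b in self.backends:
            if b.free_gb >= size_gb:
                b.free_gb -= size_gb
                return f"virtual volume on {b.name} ({b.kind})"
        raise RuntimeError("pool exhausted")

pool = VirtualPool([Backend("array-1", "FC SAN", free_gb=500),
                    Backend("filer-2", "NAS", free_gb=2000)])
print(pool.provision(800))  # lands on filer-2; the caller never cares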


Storage virtualization is especially crucial as enterprises offload more of their Big Data responsibilities to the cloud, says Nutanix CEO Dheeraj Pandey. OS virtualization has helped with Big Data platforms like Hadoop, but it still requires a lot of inefficient data shuttling between storage tiers. By separating the storage logic from the analytics logic, data can be moved transparently across, say, server-attached and network-attached resources.
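
One way to picture that transparent movement is a simple heat-based placement policy, sketched below in Python. The threshold, tier names and block structure are assumptions made for illustration, not Nutanix's implementation: blocks are promoted to scarce server-attached storage when they run hot and demoted to plentiful network-attached capacity when they cool, while the analytics job keeps addressing the same block IDs throughout.

HOT_THRESHOLD = 100  # accesses per window; an assumed cutoff

def rebalance(blocks):
    """blocks maps block_id -> {"tier": str, "heat": int}.
    Moves happen underneath a stable block_id, so the shuffle is
    invisible to the job reading the data."""
    for blk in blocks.values():
        if blk["heat"] >= HOT_THRESHOLD:
            blk["tier"] = "server-attached"   # promote hot data
        else:
            blk["tier"] = "network-attached"  # demote cold data

blocks = {"b1": {"tier": "network-attached", "heat": 250},
          "b2": {"tier": "server-attached", "heat": 3}}
rebalance(blocks)
print(blocks["b1"]["tier"], blocks["b2"]["tier"])
# server-attached network-attached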


Big Data will also require a new generation of search capability. Azaleos Corp., for example, has added a new Managed Enterprise Search function to its SharePoint Services portfolio that allows enterprises to tap into the Microsoft FAST Search Server 2010 platform built for Big Data operations. The combination helps hide the many management and administrative challenges of maintaining high visibility in complex environments.


Data environments have the luxury of going "big" to accommodate the changing requirements of enterprise knowledge users. Infrastructure, however, is constrained by costs, logistics and other factors that put as high a premium on efficiency as on capacity.



When Big Data starts heading your way, you'll need to meet it head on, but only with an infrastructure that puts agility on an equal footing with size.


