New Storage Software Strives for Intelligent Management


    Server and networking resources are routinely virtualized and consolidated, primarily because their utilization rates tend to be low in their native states. Storage can be virtualized as well, but mainly on the networking side: at the end of the day, a bit is a bit, and it needs somewhere to reside until it is retrieved.

    So when it comes to storage consolidation, much of the attention falls on the data side, where tools like compression and deduplication strive to reduce the volume of data under storage for a more efficient approach to resource management.
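    The core idea behind deduplication is simple: store each unique block of data once and keep only references for the repeats. A minimal sketch, assuming fixed-size chunks and a SHA-256 content hash (real engines use variable-size chunking and far more compact fingerprint indexes):

    ```python
    import hashlib

    def deduplicate(chunks):
        """Store each unique chunk once, keyed by its content hash."""
        store = {}     # hash -> chunk data, written only once
        manifest = []  # ordered list of hashes to reconstruct the stream
        for chunk in chunks:
            digest = hashlib.sha256(chunk).hexdigest()
            if digest not in store:
                store[digest] = chunk
            manifest.append(digest)
        return store, manifest

    # Four logical blocks, but only two unique chunks are physically stored.
    data = [b"blockA", b"blockB", b"blockA", b"blockA"]
    store, manifest = deduplicate(data)
    ```

    Reassembling the original stream is just a walk of the manifest, which is why dedup ratios translate directly into capacity savings.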

    The next step on that path appears to be storage intelligence, which is essentially a deep-dive analysis of the data load to more accurately gauge not only what type of data it is and how it is used, but also how it can be most effectively stored to drive greater value to applications and business models.

    A number of start-ups are taking different tacks in the development of intelligent storage, including Qumulo, which recently unveiled the Qumulo Core platform that provides a “data aware” scale-out NAS architecture to bring advanced functionality as file counts extend into the trillions. Aimed primarily at unstructured data, the platform incorporates real-time analytics to provide enhanced visibility into key storage requirements, including overall data value, user/application access requirements, archival and lifecycle management and recommended storage options. The system runs as a Linux application aboard commodity hardware, utilizing the Qumulo Scalable File System to weigh factors like price/performance, file size and read/write requirements to determine optimal storage conditions.
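    At its simplest, this kind of data-aware placement is a policy that maps observed access patterns onto storage tiers. A toy illustration, with invented thresholds and tier names (the article does not disclose Qumulo's actual heuristics):

    ```python
    def choose_tier(size_bytes, accesses_per_day):
        """Hypothetical placement rule over three assumed tiers."""
        if accesses_per_day > 100:
            return "flash"    # hot data: pay a premium for performance
        if accesses_per_day >= 1:
            return "disk"     # warm data: balanced price/performance
        return "archive"      # cold data: cheapest capacity, slowest access

    # A busy log file lands on flash; an untouched backup goes cold.
    hot = choose_tier(50_000_000, 500)
    cold = choose_tier(2_000_000_000, 0)
    ```

    Production systems refine this with many more signals, such as file size, age and read/write mix, but the principle of matching data value to storage cost is the same.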

    Software-defined storage solutions are also drawing a fair bit of activity, one of the newest being Springpath, which aims to leverage existing storage resources for Big Data loads. The idea is to unite existing “fragmented” storage infrastructure under a common framework that can be managed on commodity servers rather than the storage systems themselves. The company has developed the HALO file system that incorporates data caching, intelligent distribution, deduplication and other functions to provide a unified architecture that allows server resources to access a wide range of storage solutions. At the same time, it reduces the storage infrastructure that typically supports data center operations and incorporates advanced failover to preserve data in the event of an outage.

    Server-side Flash solutions are also seeing new breeds of management software to make them behave more like the traditional storage environments that the enterprise is used to. PernixData recently released the latest version of its FVP platform, which brings advanced functions like adaptive compression, intelligent I/O profiling and role-based access control into the mix. As explained to ITBE’s Mike Vizard, the software plugs into the hypervisor to enable intelligent data analysis that ensures files are assigned to the proper tier, ultimately making Flash a more cost-effective option than optimizing traditional disk storage for high-speed applications.

    Even long-standing enterprise platform providers are upping their storage management capabilities, lest they lose market share to the upstarts. IBM is out with the new Spectrum Storage platform, which utilizes intelligent management techniques to create a “data footprint” that can be used to provision storage at optimal price and performance points. The system already incorporates more than 700 patents and is likely to be a prime beneficiary of nearly $1 billion in R&D money that IBM intends to devote to storage solutions over the next five years, says IT Pro Portal’s Sead Fadilpašić. The software scales into the yottabyte range and can encompass IBM and third-party solutions, in some cases delivering a 90 percent reduction in storage costs.

    Usually, the greatest gains in storage performance and efficiency are the result of physical developments: new storage media, more refined tracking and cell designs or simply streamlined, low-power components that provide better results at lower costs. But the management side of storage still has room for substantial improvement, particularly as the tools propelling Big Data analytics turn inward to data center and cloud-based operations.

    Before laying out big money for additional storage, it might be worth your while to see how storage software could help extend what you already have.

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata, Carpathia and NetMagic.

