Managing the Coming Data Storage Deluge

Michael Vizard

There's a lot of concern these days about whether storage technologies will be able to keep pace with the mass of information that organizations are going to want to store over the coming decade.

By 2015 alone, it's expected that the amount of information that will need to be stored will collectively exceed 8 zettabytes. Obviously, that's an abstract number; no one organization is going to generate anywhere near that much data. But thanks to the rise of video, Big Data, social networking, mobile computing and virtualization, the amount of data that each organization will need to store is not only going to grow exponentially, but the actual size of the data volumes that will need to be manipulated at any given time is also going to expand considerably.
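To give that abstract number some shape, a quick back-of-the-envelope conversion (the unit definitions are standard SI, not figures from the article beyond the 8 zettabytes itself):

```python
# Putting 8 zettabytes in perspective by converting to terabytes.
# Uses decimal (SI) unit definitions: 1 ZB = 10**21 bytes, 1 TB = 10**12 bytes.
ZETTABYTE = 10 ** 21  # bytes
TERABYTE = 10 ** 12   # bytes

total_bytes = 8 * ZETTABYTE
terabytes = total_bytes / TERABYTE

print(f"8 ZB = {terabytes:,.0f} TB")  # 8 ZB = 8,000,000,000 TB
```

In other words, eight billion terabyte drives' worth of data.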

Speaking at an IBM Smarter Computing Executive Forum this week, Brian Truskowski, general manager of systems and storage at IBM, said that as more IT organizations adopt technologies that automatically place data across multiple tiers of storage, the average organization should see a 25 percent increase in disk utilization and a 20 percent increase in the utilization of its most expensive Tier 1 disk systems.

In addition, by storing less data on Tier 1 systems, the average large company should be able to reduce the amount of data on those systems by 12 percent, which Truskowski said would save $1.9 million per year on average. He also said that by making greater use of data deduplication technologies, the average organization could reclaim about 11 percent of its storage space, saving about $3 million while also reducing many of the headaches associated with backup and recovery. Technologies such as Active Cloud Engine, he added, will automatically move data into lower-cost cloud storage systems based on how that data is actually used within the organization.
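The percentages above are easy to translate into capacity for a given environment. A minimal sketch, assuming a hypothetical 1 PB estate with 200 TB on Tier 1 (those estate sizes are assumptions for illustration; only the 12 percent and 11 percent figures come from the article):

```python
# Hypothetical illustration of the cited savings percentages.
# TOTAL_TB and TIER1_TB are assumed example values, not IBM figures.
TOTAL_TB = 1000   # assumed total storage estate: 1 PB
TIER1_TB = 200    # assumed portion held on expensive Tier 1 disk

tier1_freed = TIER1_TB * 0.12      # 12% less data kept on Tier 1
dedup_reclaimed = TOTAL_TB * 0.11  # 11% of overall space reclaimed via dedup

print(f"Tier 1 capacity freed: {tier1_freed:.0f} TB")       # 24 TB
print(f"Space reclaimed by dedup: {dedup_reclaimed:.0f} TB") # 110 TB
```

The dollar savings would of course depend on an organization's actual per-terabyte costs at each tier.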

At the same time, Truskowski noted that in addition to the rise of automated storage management technologies, lower-cost Flash memory applied at the processor, server and storage levels will significantly improve I/O performance in the years ahead. Longer term, IBM is working on file systems capable of managing 120 petabytes of data and on "Racetrack" memory that optimizes the placement of data based on its relationship to other data on the same device, while also pushing the atomic limits of disk storage. For instance, IBM recently demonstrated the ability to store one bit of data using only 12 atoms, compared to the roughly 1 million atoms required to store one bit today.
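The implied density gain from that demonstration follows directly from the two numbers the article cites:

```python
# Density improvement implied by IBM's 12-atoms-per-bit demonstration,
# relative to the ~1 million atoms per bit cited for today's media.
atoms_per_bit_today = 1_000_000
atoms_per_bit_demo = 12

improvement = atoms_per_bit_today / atoms_per_bit_demo
print(f"~{improvement:,.0f}x fewer atoms per bit")  # ~83,333x fewer atoms per bit
```

That is, a roughly 83,000-fold reduction in the number of atoms needed per bit, though a lab demonstration is a long way from a shipping product.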

There is no doubt that data storage along with other systems management technologies in the years ahead will advance considerably. It's also pretty apparent that there is plenty of room to optimize the storage systems that IT organizations already have in place. The only real question is whether those advancements will arrive in enough time to turn the mass of data that currently threatens to overwhelm IT into an actual asset.
