Growing Data Management Nightmare: Unstructured Rich Media

Arthur Cole

It may seem counterintuitive, but for most enterprise workers, data infrastructure is not a major concern. Individual business units may place an extremely high value on their actual data, but the underlying framework that keeps it readily available? That's IT's job.

For most of enterprise history, provisioning and maintaining the resources needed to meet data loads has been a thorny but manageable problem. The amount of data being handled was generally commensurate with the amount of revenue coming in, which in turn fueled the build-out of more infrastructure when needed.

All that began to change about five years ago when it became apparent that data loads were increasing even as business activity slackened. This was primarily due to the flood of unstructured data taking over e-mail servers and storage resources. It's gotten to the point that upwards of 80 percent of enterprise data is unstructured, which makes it very difficult to gauge each data set's overall value for backup and archiving purposes.

But the worst is yet to come. As enterprises increasingly convert to more social architectures, willingly or not, the use of rich media files (audio, video, high-definition content) is on the rise. And naturally, these take up a lot more real estate, both in the storage array and on the network, than plain old text or 2-D graphics.
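To put rough numbers on that gap, here is a back-of-the-envelope sketch; the file sizes and bitrates below are illustrative assumptions, not figures from the article:

```python
# Back-of-the-envelope comparison of text vs. rich media footprints.
# All figures are illustrative assumptions, not measurements.

KB, MB, GB = 1024, 1024**2, 1024**3

plain_text_page = 2 * KB            # a page of plain text
hi_res_photo = 4 * MB               # a single high-resolution photo
hour_of_audio = 60 * MB             # roughly 128 kbps compressed audio
hour_of_hd_video = int(2.25 * GB)   # roughly a 5 Mbps 1080p stream

for label, size in [
    ("plain text page", plain_text_page),
    ("hi-res photo", hi_res_photo),
    ("hour of audio", hour_of_audio),
    ("hour of HD video", hour_of_hd_video),
]:
    print(f"{label:>17}: {size / plain_text_page:>12,.0f}x a page of text")
```

Even under conservative assumptions, a single hour of HD video weighs in at roughly a million times the footprint of a page of text, which is why rich media changes the storage and network math so quickly.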

To date, the industry has responded with an ever-expanding roster of management systems and technologies aimed at large-volume, unstructured environments. As ThinkerNet's Mary Jander pointed out recently, just about every major IT development in recent years, up to and including Dell's purchase of Compellent, can be tied in some way to the need to manage unstructured data.

The problem, though, is that in this case, technology alone will not be enough. According to Gartner, the very nature of data and data systems will have to change, at least in the eyes of users, if this challenge is to be met effectively. But that could be a tough road considering it means new investment in adaptive infrastructure and data integration capabilities even as the value of that technology shifts from the systems themselves to the data they carry.

There also needs to be a broader understanding of that value across the entire enterprise. Business units need to be made aware of their individual data loads and the strain they may be placing on the infrastructure. A good way to do that, according to Symantec, is through improved capacity management, data utilization tracking and, most importantly, charge-back. The company recently introduced its Data Insight for Storage platform, soon to be added to the Veritas Operations Manager stack, which uses all three approaches not only to determine who is using what, but to put awareness of that usage front and center in the operations budget.
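As a rough illustration of how charge-back puts that usage in front of the business units, here is a minimal sketch with invented department figures and a made-up per-gigabyte rate; it is not a description of how Data Insight for Storage itself works:

```python
# Minimal charge-back sketch: bill each business unit for the storage it
# actually consumes. The department figures and the per-GB rate are
# invented purely for illustration.

COST_PER_GB_MONTH = 0.12  # assumed blended cost of primary storage plus backup

usage_gb = {
    "Marketing": 8_500,    # heavy on video and campaign assets
    "Engineering": 4_200,
    "Finance": 900,
    "HR": 350,
}

total_gb = sum(usage_gb.values())
print(f"{'Business unit':<14}{'GB used':>10}{'Share':>9}{'Monthly bill':>15}")
for unit, gb in sorted(usage_gb.items(), key=lambda kv: -kv[1]):
    bill = gb * COST_PER_GB_MONTH
    print(f"{unit:<14}{gb:>10,}{gb / total_gb:>9.1%}{bill:>15,.2f}")
```

The point of a report like this is less the dollar figure than the visibility: once each unit sees its own share of the load in its operations budget, the incentive to clean up rarely-used rich media stops resting with IT alone.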

Going forward, it will be interesting to see where the limits lie. More than likely, the vast majority of this unstructured data will wind up in the cloud. But despite what you hear, cloud resources are limited too, albeit more by financial constraints than by capacity ones.

In the end, we could wind up with a stark choice: continue to push IT budgets even higher to keep up with the coming data loads or place restrictions on the kinds of social environments that are said to be the key to future prosperity.
