In Search of the Optimal Data Environment

Arthur Cole

Enterprise resources are certainly getting more powerful, and there is comfort in the thought that provisioning a broad array of external systems will become both cheaper and easier as cloud architectures take hold.


However, the fact remains that both structured and unstructured data are accumulating at an accelerating rate, driven largely by new compliance regulations following the financial collapse. And even if the resources are there to keep it all contained, your ability to keep track of it all and ensure it can move from place to place could still become seriously compromised.


According to The Burton Group, enterprises have to start thinking more about managing data and less about managing systems as the barriers between hardware and software continue to break down. That means paying greater attention to things like database management, data governance and business intelligence, and in particular the metrics used to measure overall performance.


Fortunately, solutions to these problems are evolving along several tracks. On the one hand, you have the "performance optimization" set, which seeks to streamline the I/O paths between various physical, virtual and cloud platforms. Adaptec recently pushed the ball a little further along these lines with its Data Conditioning Platform, which aims to coordinate a number of the company's technologies into an improved data management strategy. The system uses a new caching solution called MaxIQ that turns SSDs and HDDs into "High-Performance Hybrid Arrays" (HPHAs) to accelerate application read performance, and it adds modules such as the company's Zero-Maintenance Cache Protection and Intelligent Power Management.
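The general idea behind a hybrid array is simple: keep frequently read blocks on the fast SSD tier and fall back to the slower HDD tier on a miss. As a rough illustration only (this is not Adaptec's MaxIQ implementation, and the class and names here are hypothetical), a toy read-through cache in Python sketches the mechanism:

```python
from collections import OrderedDict

class ReadCache:
    """Toy read-through cache: a small fast tier (the 'SSD') sitting
    in front of a slow backing store (the 'HDD'). Illustration only."""

    def __init__(self, backing, capacity=4):
        self.backing = backing        # slow tier: block -> data
        self.capacity = capacity      # how many blocks fit on the fast tier
        self.cache = OrderedDict()    # fast tier, ordered by recency
        self.hits = 0
        self.misses = 0

    def read(self, block):
        if block in self.cache:
            self.hits += 1
            self.cache.move_to_end(block)     # keep hot blocks resident
            return self.cache[block]
        self.misses += 1
        value = self.backing[block]           # slow-path read from "disk"
        self.cache[block] = value
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)    # evict the coldest block
        return value
```

Repeated reads of the same working set are then served from the fast tier, which is where the read-performance gain comes from.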


There's also something to be said for reducing the overall data load that many of you are dealing with. Data deduplication has been getting much of the press lately, but it is not the only game in town. Storewize, for example, has been showcasing its ability to reduce data in VMware environments and Hitachi Data Systems' NAS platforms by up to 70 percent through real-time data compression techniques. Not only does this ease the burden on administrators, but it can also allow for significant hardware consolidation.
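How much compression saves depends heavily on how redundant the data is, which is why VM images and log-heavy NAS workloads do so well. As an illustration only (not Storewize's proprietary algorithm), a quick sketch with Python's standard zlib library shows the kind of savings repetitive data yields:

```python
import zlib

def space_saved(data: bytes, level: int = 6) -> float:
    """Fraction of space saved by compressing `data` with zlib."""
    compressed = zlib.compress(data, level)
    return 1 - len(compressed) / len(data)

# Highly redundant input -- typical of logs and VM images -- compresses well.
sample = b"GET /index.html HTTP/1.1\r\nHost: example.com\r\n" * 1000
saved = space_saved(sample)
```

On input like this the savings easily exceed the 70 percent figure cited above; on already-compressed data (media files, encrypted volumes) they can be close to zero.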


If the recession has proved anything over these past few months, it's that data generation increases regardless of whether business activity is picking up or slowing down. Getting a handle on data management now will avoid a lot of pain later when virtual and cloud architectures are even more prevalent.

Sep 11, 2009 12:32 PM Ed Gillespie says:

Timely piece, Arthur. Was just reading at PBBI how data quality costs an organization, on average, $8.2 million a year.

