    Dell Flexes Data Management Muscle

    The problem with data storage today can be summed up as our inability to easily store the right data in the right place at the right time. That fundamental failure not only results in higher costs for IT infrastructure; it also leads to all kinds of downstream data management issues related to compliance, security and even analytics.

    To address this issue in the wake of its recent acquisition of Compellent, Dell is embarking on an effort to create what the company calls a Fluid Data Architecture, which promises to simplify how IT organizations manage data today.

    At the core of that plan is a unification of the Dell storage product line, which now includes storage systems from Compellent alongside technologies that Dell added to its portfolio with the acquisitions of EqualLogic, Exanet and Ocarina.

    According to Scott Horst, executive director of marketing for Dell Compellent, Dell will standardize on the object-based file system from Exanet, which over the coming year will be added to every storage offering in the Dell lineup. Once that’s in place, Dell will then enhance its data storage management tools to allow IT organizations to create policies to easily manage data across different classes of storage. The ultimate goal, says Horst, is to be able to associate data with its business value and then manage it on the most appropriate storage platform based on the criticality of that data.
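
    That policy-driven approach is easier to picture with a concrete example. The short Python sketch below is purely illustrative; the tier names, business-value scores and thresholds are hypothetical rather than anything from Dell's actual tooling. It simply shows how a policy engine might map a data set to a storage class based on its assigned business value and how recently it was used:

        # Hypothetical sketch of policy-based tiering; names and thresholds
        # are illustrative, not Dell's actual software.
        from dataclasses import dataclass

        @dataclass
        class DataSet:
            name: str
            business_value: int     # 1 (low) to 10 (mission-critical), set by policy
            idle_days: int          # days since last access

        def choose_tier(ds: DataSet) -> str:
            """Map a data set to a storage class by value and access recency."""
            if ds.business_value >= 8:
                return "tier-1-ssd"      # critical data on the fastest storage
            if ds.business_value >= 4 and ds.idle_days <= 30:
                return "tier-2-sas"      # active business data on midrange disk
            return "tier-3-archive"      # everything else on low-cost capacity

        for ds in (DataSet("orders-db", 9, 0),
                   DataSet("q3-reports", 5, 12),
                   DataSet("stale-logs", 2, 400)):
            print(f"{ds.name} -> {choose_tier(ds)}")

    A production policy engine would also have to weigh performance requirements, compliance rules and the cost of migrating data between tiers, but the core mapping step works in the same spirit.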

    That may seem like a simple idea. But we live in a world where most applications not only have their own servers but their own dedicated storage as well. That approach may allow IT organizations to guarantee certain levels of performance, but it also results in storage utilization rates that hover somewhere around 20 percent. In contrast, Horst says new deployments of Compellent-based storage systems in virtual server environments routinely see utilization rates in the 70 percent range.

    As part of its effort to reduce the cost of storage, Dell also plans to embed the data deduplication software it picked up with the acquisition of Ocarina in all its storage platforms. The core idea is that duplicate data should be eliminated at the source, rather than waiting to deduplicate it at the backup system after that data has already taken up space on primary storage systems and eaten up precious network bandwidth.
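
    The mechanics of that idea are straightforward to sketch. In the minimal Python example below, data is split into fixed-size chunks, each chunk is hashed, and only chunks the store has not already seen are written. The chunk size, hashing scheme and in-memory store are assumptions made for illustration; Ocarina's actual content-aware technology is considerably more sophisticated:

        # Minimal sketch of source-side deduplication: hash fixed-size chunks
        # and store/send only the chunks not seen before.
        import hashlib

        CHUNK_SIZE = 4096
        store: dict[str, bytes] = {}    # chunk digest -> chunk (stands in for the backend)

        def write_dedup(data: bytes) -> list[str]:
            """Record data as a recipe of chunk digests, skipping duplicates."""
            recipe = []
            for i in range(0, len(data), CHUNK_SIZE):
                chunk = data[i:i + CHUNK_SIZE]
                digest = hashlib.sha256(chunk).hexdigest()
                if digest not in store:     # only new chunks cost space and bandwidth
                    store[digest] = chunk
                recipe.append(digest)
            return recipe

        recipe = write_dedup(b"A" * 8192)   # two identical 4KB chunks
        print(len(recipe), "chunks referenced,", len(store), "actually stored")
        # -> 2 chunks referenced, 1 actually stored

    Because both 4KB chunks of identical data hash to the same digest, the duplicate is never stored or transmitted, which is precisely the capacity and bandwidth savings that source-side deduplication promises.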

    When you think about data management in the enterprise today, the phrase is clearly an oxymoron. IT organizations process data, but they do very little in the way of actually managing it. The end result is skyrocketing IT costs that consume a huge percentage of a company’s capital budget. So it’s little wonder there is so much interest in cloud computing. The fundamental question is whether companies will outsource IT to cloud computing providers because they can’t get a handle on their data, or whether, as part of a private cloud computing initiative, they will lower the cost of IT by starting to proactively manage that data. Most IT organizations will aspire to the latter, but can they execute?

    Mike Vizard
    Michael Vizard is a seasoned IT journalist, with nearly 30 years of experience writing and editing about enterprise IT issues. He is a contributor to publications including Programmableweb, IT Business Edge, CIOinsight and UBM Tech. He formerly was editorial director for Ziff-Davis Enterprise, where he launched the company’s custom content division, and has also served as editor in chief for CRN and InfoWorld. He also has held editorial positions at PC Week, Computerworld and Digital Review.
