At a time when many IT organizations are struggling to reduce their data storage costs, some may be overlooking their real costs by fixating on the cost per megabyte of storage.
According to Hitachi Data Systems CTO Hu Yoshida, the fundamental economics of storage are being overlooked because too little attention is paid to the operational costs of managing data that is growing at exponential rates.
Yoshida argues that IT organizations would be much better off if they considered not only the cost of an individual megabyte, but also the cost of managing that megabyte and the amount of infrastructure that has to be powered to support it.
These issues, says Yoshida, are what led HDS to develop a Universal Storage Platform for managing data in a way that can be distributed in multiple formats via various gateways. Instead of requiring customers to buy separate network-attached storage (NAS) devices and storage area networks (SANs), HDS makes data available to servers in both formats via a series of gateways attached to the HDS Universal Storage Platform. Rather than maintaining multiple copies of data in separate storage arrays that each need to be managed and powered, IT organizations can apply a common management infrastructure across a centralized HDS architecture, substantially reducing the total cost of storage by eliminating arrays.
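Yoshida's argument reduces to simple arithmetic: the real cost of a stored terabyte is its acquisition cost plus the management and power costs of every copy kept. The sketch below illustrates the comparison; all per-TB rates and copy counts are purely hypothetical assumptions for illustration, not HDS figures.

```python
# Illustrative total-cost-of-storage model (hypothetical figures, not HDS data).
# Compares siloed arrays, each holding its own copy of the data, against a
# consolidated platform serving one copy of the data through gateways.

def total_cost(capacity_tb, copies, cost_per_tb, admin_cost_per_tb, power_cost_per_tb):
    """Raw capacity cost plus management and power for every stored copy."""
    stored_tb = capacity_tb * copies
    return stored_tb * (cost_per_tb + admin_cost_per_tb + power_cost_per_tb)

# Siloed: 100 TB of data replicated across 3 separate NAS/SAN arrays.
siloed = total_cost(100, copies=3, cost_per_tb=500,
                    admin_cost_per_tb=300, power_cost_per_tb=120)

# Consolidated: one copy on a shared platform; the per-TB management rate is
# assumed lower because a common infrastructure lets each administrator
# manage more data.
consolidated = total_cost(100, copies=1, cost_per_tb=500,
                          admin_cost_per_tb=200, power_cost_per_tb=120)

print(f"siloed: ${siloed:,.0f}  consolidated: ${consolidated:,.0f}")
# → siloed: $276,000  consolidated: $82,000
```

Under these assumed rates, the per-megabyte acquisition cost is identical in both scenarios; the savings come entirely from eliminating redundant copies and their associated management and power overhead, which is the point of the platform-centric argument.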
In addition, Yoshida says this approach allows IT organizations to hold headcount steady while increasing the amount of data each administrator manages.
The rise of unstructured data, coupled with a tendency to replicate data across the enterprise, is creating a compelling argument for a platform-centric approach to data management that reduces the real cost of storage. Narrowly focusing on the cost per megabyte makes little sense at a time when the number of megabytes consumed is spiraling out of control.