Ten Tough Storage Questions IT Needs to Ask

Click through for advice, courtesy of Scale Computing.

IT organizations today are coping with the most difficult set of data storage management challenges they have ever faced.

On the one hand, the amount of data that needs to be managed is growing exponentially. On the other, management wants to reduce the IT budget allocated to storage, not just the cost per terabyte but also the total amount of money being spent on storage.

After reaping massive savings from server consolidation through virtualization, senior IT executives want to see the same effect on their storage, which can consume 20 to 50 percent of the IT hardware budget. But while IT managers would like to get there by increasing the utilization rate of each storage array, storage virtualization cannot be allowed to compromise the I/O performance of any given application.

Peter Fuller, vice president of business development for Scale Computing, a provider of storage clustering software, says these issues are driving many IT organizations to completely rethink their approach to storage management. In addition to aggressively tiering data to get the most out of expensive storage systems, they are pursuing approaches that provide the highest possible level of flexibility.
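Tiering of this kind is typically policy-driven: data that has not been touched recently is demoted to cheaper, higher-capacity storage, while frequently accessed data stays on faster, more expensive media. The following is a minimal, hypothetical sketch of such a policy in Python; the tier names, age thresholds and the /data path are illustrative assumptions, not a description of Scale Computing's product.

import os
import time

# Hypothetical thresholds (in days) for demoting data to cheaper tiers.
WARM_AFTER_DAYS = 30
COLD_AFTER_DAYS = 180

def classify(path):
    """Pick a tier based on how long ago the file was last accessed."""
    age_days = (time.time() - os.stat(path).st_atime) / 86400
    if age_days >= COLD_AFTER_DAYS:
        return "cold"   # e.g., high-capacity, low-cost archive storage
    if age_days >= WARM_AFTER_DAYS:
        return "warm"   # e.g., mid-range disk
    return "hot"        # e.g., SSD or other high-I/O storage

def plan_moves(root):
    """Walk a directory tree and report the tier each file belongs in."""
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            full_path = os.path.join(dirpath, name)
            yield full_path, classify(full_path)

if __name__ == "__main__":
    # "/data" is a placeholder mount point; substitute a real path.
    for path, tier in plan_moves("/data"):
        print(tier + "\t" + path)

A real tiering engine would act on these classifications by migrating data between storage pools rather than simply printing them, but the basic demote-by-age logic is the same.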

That means, among other things cited by Fuller, finding storage architectures that don't lock customers into a particular storage protocol, being able to add storage capacity in granular 1TB units and, most importantly, reducing the amount of time it takes to manage large amounts of storage.

This level of flexibility is essential, says Fuller, because with storage pricing dropping rapidly, customers can't afford to be locked into long-term storage contracts based on a fixed price.

Ultimately, Fuller says the next generation of storage management will be defined by cost, control and convenience, three metrics against which most existing storage architectures fall short.

 
