IBM Unfurls Global Project Elastic Storage Initiative

Mike Vizard

In theory at least, cloud computing is pretty much defined by elasticity. The whole idea of building a cloud, after all, is to make IT infrastructure resources available on demand. IBM announced today that it is applying that concept to storage on a global scale, regardless of the format in which data is stored or the devices used to house it.

Later this year, IBM will deliver a truly elastic data storage capability, code-named Project Elastic Storage, on top of the IBM SoftLayer cloud computing platform. Built on the IBM General Parallel File System (GPFS) technology that IBM has used in high-performance computing (HPC) environments for decades, Project Elastic Storage turns GPFS into a cloud service that supports file, block, and object-based storage.

Bernie Spang, IBM vice president of strategy for software-defined environments, says that besides providing a single global namespace for both IBM and third-party storage devices, the OpenStack-compatible Project Elastic Storage initiative will provide distributed caching services, encryption, and the ability to remotely delete files in the event that a device goes missing. In addition to OpenStack Cinder and Swift support, Elastic Storage will also support other open APIs such as POSIX and Hadoop.
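To illustrate what Swift compatibility implies in practice, the sketch below uses the standard python-swiftclient library to store and retrieve an object. The auth endpoint, credentials, and container and object names are placeholders, not details IBM has published; the point is simply that a service exposing the Swift API should accept unmodified OpenStack client code.

    # Minimal sketch: talking to a Swift-compatible object store with the
    # standard python-swiftclient library. The endpoint, credentials, and
    # names below are illustrative placeholders.
    from swiftclient import Connection

    conn = Connection(
        authurl='https://auth.example.com/v2.0',  # hypothetical auth endpoint
        user='tenant:user',
        key='secret',
        auth_version='2',
    )

    # Create a container, upload an object, then read it back.
    conn.put_container('reports')
    conn.put_object('reports', 'q1.csv',
                    contents=b'region,revenue\nus,100\n',
                    content_type='text/csv')
    headers, body = conn.get_object('reports', 'q1.csv')
    print(body.decode())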

In effect, Project Elastic Storage represents an IBM commitment to automate the creation of storage tiers across hybrid cloud computing environments, with those tiers defined entirely in software. As such, Spang says, not only are the economics of storage fundamentally changing for the better, but the days when storage administrators had to manually provision storage systems are rapidly coming to a close.
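As a loose illustration of what software-defined tiering means, a placement decision can be reduced to rules evaluated against file metadata. This is not IBM's actual policy engine (the article does not describe its rule language); the tier names and thresholds below are assumptions made for the example.

    import time

    # Hypothetical tier names; GPFS calls these storage pools.
    TIERS = ('ssd', 'disk', 'cloud-archive')

    def pick_tier(size_bytes, last_access_epoch, now=None):
        """Choose a storage tier from simple, software-defined rules.

        Purely illustrative thresholds -- a real policy engine evaluates
        administrator-written rules against file metadata.
        """
        now = now or time.time()
        idle_days = (now - last_access_epoch) / 86400
        if idle_days > 90:
            return 'cloud-archive'   # cold data moves to cheap capacity
        if size_bytes > 1 << 30 or idle_days > 7:
            return 'disk'            # warm or bulky data on spinning disk
        return 'ssd'                 # hot, small files stay on flash

The automation the article describes amounts to running rules like these continuously across the global namespace, so data migrates between tiers without an administrator provisioning anything by hand.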

Spang also says that HPC environments have wrestled with managing IT infrastructure at scale for decades. Many of the issues cloud computing environments face are similar in nature; they just tend to be much more distributed. It makes sense, then, to apply technologies already proven in HPC environments to cloud computing challenges, says Spang. In this case, IBM is using HPC technologies to manage storage at a much higher level of abstraction in the age of the cloud.

In the meantime, as cloud computing continues to evolve, it's clear that a fundamental realignment of enterprise computing strategies is at hand. Instead of provisioning IT infrastructure to handle the peak performance of application workloads while keeping every piece of data stored on premises, IT organizations will look to strike a balance between public cloud services and their own internal IT infrastructure resources.


