One of the main problems in introducing scale-out architecture to legacy data environments is the sheer number of incompatible formats, platforms and vendor solutions that have infiltrated the data center over the years.
The drive to remove these silos and federate the data environment, either under a single proprietary solution or across the myriad open platforms currently available, is well underway. But in many cases the transition is happening too slowly, given that the need to scale out is immediate as enterprises cope with challenges like Big Data and the Internet of Things.
This is why many researchers are looking to move the concept of virtualization to an entirely new level. Rather than focus on infrastructure like servers, storage and networking, virtualization on the data plane introduces a level of abstraction that allows data and applications to sit on any hardware, and thus interact with other data sets across the enterprise and into the cloud. And as tech author Anne Buff points out, it would also optimize hardware utilization and reduce system complexity, as well as offer more centralized security and control.
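To make the concept a bit more concrete, here is a minimal sketch, in Python, of what a data-plane abstraction might look like. The names (Backend, DataPlane) and the in-memory backends are purely illustrative assumptions, not any vendor's API; the point is only that applications see one logical namespace while placement decisions happen underneath.

```python
# Minimal, hypothetical sketch of a data-plane abstraction layer.
# Names (Backend, DataPlane) are illustrative only -- no vendor API is implied.

class Backend:
    """An in-memory stand-in for any storage target: DAS, NAS, or a cloud bucket."""
    def __init__(self, name):
        self.name = name
        self._objects = {}

    def put(self, key, data):
        self._objects[key] = data

    def get(self, key):
        return self._objects[key]


class DataPlane:
    """Presents one logical namespace while routing objects to physical backends by policy."""
    def __init__(self):
        self._backends = {}
        self._placement = {}  # logical key -> backend name

    def register(self, backend):
        self._backends[backend.name] = backend

    def put(self, key, data, tier="das"):
        backend = self._backends[tier]
        backend.put(key, data)
        self._placement[key] = tier

    def get(self, key):
        # The caller never needs to know which backend holds the data.
        return self._backends[self._placement[key]].get(key)


if __name__ == "__main__":
    plane = DataPlane()
    plane.register(Backend("das"))
    plane.register(Backend("cloud"))
    plane.put("orders.csv", b"order data", tier="das")
    plane.put("archive.csv", b"old data", tier="cloud")
    print(plane.get("orders.csv"))  # application code is unaware of placement
```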
A number of start-ups are eager to fulfill this vision. One of the newest is Primary Data, founded by key figures behind Fusion-io, including famed computer guru (and onetime Dancing with the Stars contestant) Steve Wozniak. The company is working on a solution that integrates tools like a data hypervisor and policy engine with a centralized management stack, allowing data and applications to access multiple storage environments, from direct-attached solutions to advanced public and private cloud architectures. A key technique is to decouple the data access channel from the control channel, which lets client devices become protocol-agnostic entities capable of supporting a global dataspace, even across third-party infrastructure. The company is currently raising venture capital and testing its platform with select customers, with an anticipated launch sometime next year.
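For illustration only, the sketch below shows one way such a control/data split could work, under the assumption that a metadata service hands out location descriptors and clients then read directly over whatever protocol the location requires. None of the names reflect Primary Data's actual software.

```python
# Hypothetical sketch of separating the control channel (metadata, placement policy)
# from the data channel (direct client I/O). Illustrative only; not Primary Data's design.

from dataclasses import dataclass

@dataclass
class Location:
    backend: str   # e.g. "das-node-3", "object-store"
    protocol: str  # e.g. "nfs", "iscsi", "object"
    address: str   # where the client should fetch the bytes

class ControlPlane:
    """Answers 'where does this data live?' -- it never touches the data itself."""
    def __init__(self):
        self._catalog = {}

    def publish(self, logical_name, location):
        self._catalog[logical_name] = location

    def resolve(self, logical_name):
        return self._catalog[logical_name]

class Client:
    """A protocol-agnostic client: it asks the control plane, then reads directly."""
    def __init__(self, control_plane, drivers):
        self.control = control_plane
        self.drivers = drivers  # protocol name -> callable that fetches bytes

    def read(self, logical_name):
        loc = self.control.resolve(logical_name)        # control channel
        return self.drivers[loc.protocol](loc.address)  # data channel, direct

if __name__ == "__main__":
    cp = ControlPlane()
    cp.publish("sales/q3", Location("object-store", "object", "bucket/q3.parquet"))
    client = Client(cp, drivers={"object": lambda addr: f"<bytes from {addr}>"})
    print(client.read("sales/q3"))
```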
Another contender is Coho Data, which describes its system as a flash-tuned, scale-out storage architecture aimed at private cloud deployments. The platform combines hardware and software, pairing commodity storage components with a data hypervisor that does away with the layered file systems, volume managers, RAID parity schemes and other tools that populate traditional storage environments. The aim is to provide shared, scalable storage suitable for a wide range of disparate applications, primarily through object-based storage and a translation layer that handles block and file protocols such as iSCSI.
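The general idea of a block-to-object translation layer can be sketched as follows. The chunking scheme and class names here are assumptions made for illustration and do not describe Coho Data's internals; the sketch simply maps block-style reads and writes, as an iSCSI target would see them, onto fixed-size objects.

```python
# Hypothetical sketch of a translation layer serving block-style I/O from an object store.
# Illustrative only; it does not reflect Coho Data's internal design.

CHUNK_SIZE = 4096  # bytes per backing object

class ObjectStore:
    def __init__(self):
        self._objects = {}

    def put(self, key, data):
        self._objects[key] = data

    def get(self, key):
        # Unwritten chunks read back as zeros, like a thin-provisioned volume.
        return self._objects.get(key, b"\x00" * CHUNK_SIZE)

class BlockTranslationLayer:
    """Maps logical block addresses onto fixed-size objects in the store."""
    def __init__(self, store, volume):
        self.store = store
        self.volume = volume

    def _key(self, chunk_index):
        return f"{self.volume}/chunk-{chunk_index}"

    def write(self, offset, data):
        # For simplicity, assume writes are chunk-aligned and chunk-sized.
        assert offset % CHUNK_SIZE == 0 and len(data) == CHUNK_SIZE
        self.store.put(self._key(offset // CHUNK_SIZE), data)

    def read(self, offset, length):
        out = bytearray()
        while length > 0:
            chunk = self.store.get(self._key(offset // CHUNK_SIZE))
            start = offset % CHUNK_SIZE
            take = min(length, CHUNK_SIZE - start)
            out += chunk[start:start + take]
            offset += take
            length -= take
        return bytes(out)

if __name__ == "__main__":
    btl = BlockTranslationLayer(ObjectStore(), "vol0")
    btl.write(0, b"A" * CHUNK_SIZE)
    print(btl.read(0, 8))           # b'AAAAAAAA'
    print(btl.read(CHUNK_SIZE, 4))  # zeros from an unwritten chunk
```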
Another key advantage of data virtualization lies in application deployment and delivery, according to Delphix's Ted Girard. By virtualizing at the application and data level, you can stand up a virtual environment from within a database, a data warehouse or even a file. This allows development teams to spin up appropriate environments within minutes and then extend them across multiple copies of the data, so that each copy is couched within its own fully formed virtual instance. With the appropriate APIs, source applications can then interact with one another through a single, shared master image that can accommodate 20 or more virtual environments. The end result is more time devoted to development and deployment and less time spent creating and managing virtual resources.
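The copy-on-write pattern behind such shared master images can be sketched roughly as follows. The classes are hypothetical stand-ins, not Delphix's API: each virtual copy reads through to the shared master and stores only the rows it changes, which is why many environments can share one image cheaply.

```python
# Hypothetical sketch of virtual data copies sharing one master image via copy-on-write.
# Class names are illustrative; this is not any vendor's implementation.

class MasterImage:
    """The single shared source copy, e.g. a snapshot of a production database."""
    def __init__(self, rows):
        self._rows = dict(rows)

    def get(self, key):
        return self._rows.get(key)

class VirtualCopy:
    """A lightweight environment: reads fall through to the master, writes stay local."""
    def __init__(self, master):
        self._master = master
        self._overlay = {}  # only changed rows consume new space

    def get(self, key):
        return self._overlay.get(key, self._master.get(key))

    def put(self, key, value):
        self._overlay[key] = value

if __name__ == "__main__":
    master = MasterImage({"customer:1": "Alice", "customer:2": "Bob"})
    dev = VirtualCopy(master)  # spun up in seconds, shares the master's data
    qa = VirtualCopy(master)
    dev.put("customer:2", "Bobby")  # change visible only in the dev copy
    print(dev.get("customer:2"), qa.get("customer:2"))  # Bobby Bob
```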
With so much of data virtualization still being worked out, it is difficult to say how big an impact it will have on the enterprise data environment. The potential is certainly there, but the devil, as they say, is in the details.
Still, some very smart minds are now focused on finding ways to allow data virtualization to produce real value in the data center, so it will be interesting to see what they come up with over the next year or two.
Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata, Carpathia and NetMagic.