High-speed Architectures Would Benefit from Broader Interoperability

Arthur Cole

Interoperability has always been a desirable trait among enterprise systems and architectures. The easier it is for data to traverse platforms, the more productive and less costly the entire data system environment becomes.

But it seems that as technologies like virtualization and the cloud push the overall data universe to higher and higher levels of productivity, interoperability is transitioning from a welcome luxury to a bread-and-butter necessity.

The need for universal, or at least near-universal, interoperability is most pronounced in the high-speed, high-capacity network architectures now ready to hit the channel. A unified network does wonders for streamlining overall infrastructure, but its value is diminished if multiple data silos remain in place.

The Ethernet Alliance has placed interoperability at the top of its priorities list for the new 40G and 100G standards. The organization, along with the Next Generation Enterprise Cabling group, is working on a new set of interconnect standards designed to ensure compatibility across multi-vendor environments. Specs and demonstrations are expected at the Technology Exploration Forum coming up in June in Santa Clara, Calif.

More than just networking compatibility will be necessary to realize the promised productivity gains, however. High-bandwidth applications like videoconferencing need to be able to cross vendor lines as well. At the moment, most of the movement centers on individual vendor partnerships that have, say, Polycom supporting Microsoft's Real-Time Video (RTV) codec or Vidyo providing plug-ins for Lync and Office Communications Server (OCS).

The same is true for traditional data applications, such as storage-area networking. Coraid, for example, recently partnered with Arista Networks to develop high-speed SAN solutions consisting of Arista 10 GbE switches and the Coraid EtherDrive array. Together, they claim to deliver a five- to eight-fold improvement in price/performance over existing Fibre Channel solutions, although the high degree of interoperability required to achieve this level of performance extends only to these two platforms.

To be fair, the prospects of an entirely interoperable data center environment are as distant today as they ever were. But the more dependent individual enterprises become on third-party resources and high-speed infrastructure, the greater the pressure to ensure that all systems work together no matter whose proprietary technology is in place.

If vendors want to tap the cash flow that is likely to come from cloud computing, they just may find the motivation to make broad interoperability a reality.
