Application Support in the Software-Defined Data Center

Now that the software-defined data center (SDDC) is nearly upon us, enterprise executives need to start asking some pertinent questions: namely, how do I build one, and what do I do with it once it is built?

In essence, the SDDC is more about applications than technology. The same basic virtualization and cloud technologies that have infiltrated servers, storage and now networking are employed to lift data architectures off bare-metal hardware and into software. But it is the way those architectures support enterprise apps, and the way the apps themselves are reconfigured to leverage this new, more flexible environment, that gives the SDDC its cachet.

Until recently, the application side of the SDDC has been largely invisible, with most development efforts aimed at the platform itself. Last week, however, VMware announced an agreement with India’s Tata Consultancy Services (TCS) to develop pre-tested, pre-integrated applications for the SDDC. Under the plan, TCS will provide architectural support and operational expertise to help organizations transition legacy apps into virtual environments powered by VMware solutions, namely vSphere, NSX, Virtual SAN and the vRealize Suite. The deal also calls for the creation of a Center of Excellence linking data centers in Milford, Ohio, and Pune, India, to handle beta testing and workload assessment.

The biggest challenge in the SDDC is to foster environments that finally fulfill the expectations of IaaS and PaaS architectures, says VMware India’s BS Nagarajan. Software-defined environments push resource flexibility and scalability to extremes, and applications built around static legacy infrastructure will not provide effective solutions for the mobile/cloud era. Getting both the SDDC and the application stack off on the right foot is vital to capitalizing on the investment already made in virtual and cloud-based architectures; it is, in fact, the final step in creating the service-oriented IT model.

Another goal of the SDDC is to allow applications to be deconstructed according to the needs of the workload – most likely along compute, network and storage lines – so that resources can be optimized for each function, says TechRadar’s Roger Smith. VMware’s majority owner, EMC, is taking steps in this direction with the Federation platform, which the company hopes will emerge as a Java-esque reference architecture for SDDC application development. The system consists of a number of tools developed by EMC and its subsidiaries, such as the ViPR storage platform, vSphere virtualization, Avamar data protection and PaaS technology from Pivotal CF.
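
To make that decomposition concrete, here is a minimal sketch in Python of what declaring a workload along compute, network and storage lines might look like, so each dimension can be scheduled and tuned independently. The class and field names are purely illustrative assumptions, not part of any VMware or EMC API.

```python
# Hypothetical sketch: one application declared as independently
# tunable compute, network and storage specifications.
from dataclasses import dataclass

@dataclass
class ComputeSpec:
    vcpus: int
    memory_gb: int

@dataclass
class NetworkSpec:
    bandwidth_mbps: int
    isolated_segment: bool  # e.g., a dedicated virtual network per tenant

@dataclass
class StorageSpec:
    capacity_gb: int
    iops: int
    tier: str  # "ssd", "hybrid", "archive"

@dataclass
class WorkloadSpec:
    """Deconstructs one application into separately optimizable parts."""
    name: str
    compute: ComputeSpec
    network: NetworkSpec
    storage: StorageSpec

web_tier = WorkloadSpec(
    name="web-frontend",
    compute=ComputeSpec(vcpus=4, memory_gb=8),
    network=NetworkSpec(bandwidth_mbps=1000, isolated_segment=True),
    storage=StorageSpec(capacity_gb=50, iops=500, tier="ssd"),
)

print(f"{web_tier.name}: {web_tier.compute.vcpus} vCPUs, "
      f"storage tier={web_tier.storage.tier}")
```

Once a workload is expressed this way, a scheduler can place each part on the resource pool best suited to it rather than treating the application as a monolithic unit.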

To fully realize the benefits of software-defined architectures, we need to expand our thinking about the virtual layer to see how it handles the requirements of a dynamic data environment, says tech author and consultant Dan Kusnetzky. Like the networking stack, the virtualization stack consists of seven layers, governing everything from remote client interoperability (Layer 1) to monitoring and authorization (Layer 7). Far from being the basic element of the virtual environment, the virtual machine (VM) is only one component of Layer 3, which governs multiple workload support and is more properly known as processor virtualization. Only by leveraging all seven layers of the virtual stack will the SDDC truly come into its own as the foundation for the next-generation data ecosystem.
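
As a rough aid, the sketch below lays the stack out in Python. Only Layers 1, 3 and 7 are named in the text above; the remaining labels are assumptions drawn from Kusnetzky's published virtualization model and are marked as such.

```python
# The seven-layer virtualization stack as described above. Layers 1, 3
# and 7 are named in the article; the other labels are assumptions
# based on Kusnetzky's published model.
VIRTUALIZATION_STACK = {
    1: "Access virtualization (remote client interoperability)",  # named in text
    2: "Application virtualization",                               # assumption
    3: "Processor virtualization (multiple workload support)",     # named in text
    4: "Storage virtualization",                                   # assumption
    5: "Network virtualization",                                   # assumption
    6: "Security of the virtual environment",                      # assumption
    7: "Management (monitoring and authorization)",                # named in text
}

for layer, role in sorted(VIRTUALIZATION_STACK.items()):
    print(f"Layer {layer}: {role}")
```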

It is unlikely that the enterprise will encounter a “light switch moment” in which the SDDC is suddenly powered up in all its glory. Rather, it will be an evolutionary process, albeit a rapid one. Once the network layer becomes virtualized in the data center, there is little to prevent users and applications from cobbling together the necessary resources entirely in software, save for the need to deploy effective management and governance of the new environment.
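
A hypothetical sketch of what cobbling resources together entirely in software could look like, with governance as the gate in front of provisioning. The functions and policy values below stand in for real software-defined APIs (such as a vRealize or NSX endpoint) and are assumptions, not actual product calls.

```python
# Hypothetical sketch: compose compute, network and storage in software,
# with a governance check gating every request.
POLICY = {"max_vcpus": 64, "allowed_tiers": {"ssd", "hybrid"}}

def govern(request: dict) -> None:
    """Governance gate: reject requests that violate policy."""
    if request["vcpus"] > POLICY["max_vcpus"]:
        raise ValueError("vCPU request exceeds policy limit")
    if request["storage_tier"] not in POLICY["allowed_tiers"]:
        raise ValueError(f"storage tier {request['storage_tier']!r} not allowed")

def provision(request: dict) -> dict:
    govern(request)  # management/governance comes first, per the article
    # Each step below would call a real software-defined API in practice.
    vm = {"id": "vm-001", "vcpus": request["vcpus"]}                 # compute
    net = {"segment": "tenant-a", "overlay": "vxlan-5001"}           # network
    disk = {"size_gb": request["size_gb"],
            "tier": request["storage_tier"]}                         # storage
    return {"vm": vm, "network": net, "storage": disk}

print(provision({"vcpus": 8, "size_gb": 200, "storage_tier": "ssd"}))
```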

All of this will require a new way of looking at the data center, and a new set of skills for those charged with overseeing it, but it should enable the enterprise to finally kick data productivity into high gear without blowing the budget on infrastructure and services.

Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata, Carpathia and NetMagic.
