
    Keeping High-Scale Data and Resources in Sync

    One of the earliest promises of virtualization was that it would allow the enterprise to dynamically shift workloads to the most efficient hardware footprint. This, in turn, would improve resource utilization and hardware lifecycles while reducing power consumption and boosting overall performance.

    In reality, however, this proved to be a bit more complicated than it initially seemed, particularly since truly dynamic workload balancing required the cooperation of storage and networking as well.

    But now that entire data center architectures can be provisioned in software, the basis for dynamic workload allocation is in place. All that’s needed is the means, and the will, to do it.

    One critical element of a dynamic data environment is a thorough understanding of workloads and how they interact with underlying resources. In a recent Enterprise Strategy Group survey commissioned by Virtual Instruments, barely 40 percent of IT shops profile their workloads and conduct adequate testing before purchasing additional resources such as storage. This creates a significant blind spot around true application needs, leading many organizations to continue over-provisioning both physical and virtual infrastructure, on premises and in the cloud.

    In many cases, enterprises rely on either vendor recommendations or cloud/colocation providers to guide their resource consumption, says IO Data Centers’ David Mettler. That is a bit like letting your grocer tell you how much bread to buy. Most providers stress capacity over efficiency when it comes to resource allocation, when in reality the enterprise should stress efficiency over capacity. Again, though, this requires deep-dive analysis of application needs and data usage patterns, with an eye toward meeting both today’s load and tomorrow’s.
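
    A rough sketch of that kind of profiling might look like the following: size new capacity from an observed utilization percentile plus growth and headroom margins, rather than from the absolute peak or a vendor rule of thumb. The sample values, percentile and growth factors are illustrative assumptions, not any vendor’s or analyst’s published method.

```python
def recommend_capacity(samples_gb, percentile=0.95, annual_growth=0.25, headroom=0.15):
    """Recommend storage capacity (GB) from a history of usage samples."""
    ordered = sorted(samples_gb)
    # Use a high percentile of observed usage instead of the absolute peak,
    # which is often a transient spike that drives over-provisioning.
    idx = int(percentile * (len(ordered) - 1))
    baseline = ordered[idx]
    # Grow the baseline for next year's load, then add operational headroom.
    return baseline * (1 + annual_growth) * (1 + headroom)

# Example: usage samples (GB) for one application tier, including one transient spike.
history = [820, 860, 910, 930, 1480, 940, 955, 970, 990, 1010]

print(f"Percentile-based sizing: {recommend_capacity(history):.0f} GB")
print(f"Peak-based sizing:       {max(history) * 1.4:.0f} GB")  # 'buy for the spike, plus 40 percent'
```

    Even a toy model like this makes the efficiency-versus-capacity trade-off explicit: in this example the spike-driven number comes out roughly 40 percent larger than the percentile-driven one.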

    The good news is that many emerging IT management stacks are starting to stress the efficiency factor in virtual architectures now that the enterprise is implementing scale-out data environments like Hadoop and Cassandra. Dell EMC recently teamed up with DriveScale to help improve the ratio between resource consumption and data load in cloud-native applications. DriveScale’s SCI platform delivers software-composable infrastructure: independent pools of compute and storage that can be dynamically scaled to match the unpredictability of advanced analytics loads. Under the agreement, Dell EMC will resell SCI as a certified add-on to PowerEdge servers, Ethernet switches and direct-attached storage products.
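
    For readers unfamiliar with the composable model, the hypothetical sketch below shows the basic idea: compute and storage live in separate pools and are bound into logical nodes on demand, then returned when the job finishes. The class and method names are invented for illustration and do not represent DriveScale’s actual SCI interfaces.

```python
from dataclasses import dataclass


@dataclass
class ResourcePool:
    name: str
    free_units: int

    def claim(self, units: int) -> int:
        # Hand out units only if the pool can cover the request.
        if units > self.free_units:
            raise RuntimeError(f"{self.name} pool exhausted")
        self.free_units -= units
        return units

    def release(self, units: int) -> None:
        self.free_units += units


@dataclass
class ComposedNode:
    cores: int
    drives: int


def compose_node(compute: ResourcePool, storage: ResourcePool,
                 cores: int, drives: int) -> ComposedNode:
    """Bind compute and storage from independent pools into one logical node."""
    return ComposedNode(cores=compute.claim(cores), drives=storage.claim(drives))


# Example: grow a Hadoop-style cluster for an analytics spike, drawing only
# the drives the job needs rather than buying fixed server configurations.
compute_pool = ResourcePool("compute", free_units=256)  # cores
storage_pool = ResourcePool("storage", free_units=96)   # drives
node = compose_node(compute_pool, storage_pool, cores=16, drives=4)
print(node, "| free cores:", compute_pool.free_units, "| free drives:", storage_pool.free_units)
```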

    And earlier this year, Big Data-as-a-Service company Qubole introduced what it calls the first autonomous data platform that can intelligently and automatically analyze resource usage to make data processes more productive. The system consists of three key elements: the Qubole Data Service (QDS) Community Edition, the QDS Enterprise Edition and QDS Cloud Agents. In addition to providing rules-based, workload-aware predictive management, the system includes an auto-scaling feature for dynamic resource allocation as well as a Spot Shopper Agent for AWS that continuously seeks out optimal deployment architectures on the Amazon cloud.
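
    The combination of workload-aware auto-scaling and spot-price shopping can be pictured with a simple rule-based sketch like the one below. The scaling rule, instance types and prices are assumptions made up for this example; they are not Qubole’s agents or live AWS pricing.

```python
def target_nodes(pending_tasks: int, tasks_per_node: int = 8,
                 min_nodes: int = 2, max_nodes: int = 40) -> int:
    """Simple workload-aware rule: enough nodes to drain the queue, within bounds."""
    needed = -(-pending_tasks // tasks_per_node)  # ceiling division
    return max(min_nodes, min(max_nodes, needed))


def cheapest_offer(required_vcpus: int, offers: list[dict]) -> dict:
    """Pick the lowest-cost offer that still satisfies the vCPU requirement."""
    viable = [o for o in offers if o["vcpus"] >= required_vcpus]
    return min(viable, key=lambda o: o["price_per_hour"])


# Illustrative market snapshot; a real agent would poll the cloud provider's pricing API.
spot_offers = [
    {"type": "r5.2xlarge", "vcpus": 8,  "price_per_hour": 0.19},
    {"type": "m5.4xlarge", "vcpus": 16, "price_per_hour": 0.31},
    {"type": "c5.4xlarge", "vcpus": 16, "price_per_hour": 0.27},
]

nodes = target_nodes(pending_tasks=120)
offer = cheapest_offer(required_vcpus=16, offers=spot_offers)
print(f"Scale to {nodes} nodes using {offer['type']} at ${offer['price_per_hour']}/hr")
```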

    As data architectures evolve into the cloud and beyond, so too will the ability to optimize the way data and applications are deployed. But all the technology in the world will not be enough to streamline operations if the enterprise lacks a clear understanding of its data usage and its goals.

    In the end, IT management is captive to one longstanding rule: If you put garbage in, you get garbage out.

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.
