
    Striving Toward the Optimal Data Environment

    The enterprise still seems at a loss over what to do with its data center. On the one hand, advanced cloud architectures are making it easier and less costly to support scale-out infrastructure; on the other, converged and hyper-converged systems promise to radically shrink the footprint of on-premises resources and simplify their management to boot.

    About the only thing that everyone agrees upon is that data operations need to become more streamlined and the relationship between workloads and resources needs to be optimized.

    The question is how. According to a recent survey by IDG, less than 10 percent of IT executives believe they are receiving optimal performance from their data centers, measured against parameters such as high availability, scalability and the ability to provide secure access to resources and applications from any device. More interestingly, nearly 40 percent report moving workloads from the cloud back to the local data center over security or cost concerns. Ideally, then, a fully optimized data environment would consist of both on-premises and third-party resources, virtualized to the point where workloads are dynamically assigned to whichever resource configuration provides the highest performance at the lowest possible cost.
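    To make that idea concrete, here is a minimal sketch of how such dynamic assignment might score candidate configurations. Everything in it, from the resource data to the place_workload helper, is hypothetical for illustration, not any vendor’s scheduler API:

```python
# Minimal sketch: assign a workload to whichever on-premises or cloud
# configuration offers the best performance per dollar while meeting
# its performance floor and security needs. All data is illustrative.
from dataclasses import dataclass

@dataclass
class ResourceConfig:
    name: str             # e.g. "on-prem-hci", "cloud-scale-out"
    perf_score: float     # normalized benchmark score (higher is better)
    cost_per_hour: float  # fully loaded hourly cost, in dollars
    secure: bool          # meets the workload's security requirements

def place_workload(configs, min_perf, needs_secure=True):
    """Pick the qualifying config with the best performance per dollar."""
    candidates = [
        c for c in configs
        if c.perf_score >= min_perf and (c.secure or not needs_secure)
    ]
    if not candidates:
        return None  # nothing qualifies; escalate to manual review
    return max(candidates, key=lambda c: c.perf_score / c.cost_per_hour)

configs = [
    ResourceConfig("on-prem-hci", perf_score=0.9, cost_per_hour=1.40, secure=True),
    ResourceConfig("cloud-scale-out", perf_score=0.8, cost_per_hour=0.95, secure=False),
]
best = place_workload(configs, min_perf=0.7)
print(best.name if best else "no suitable configuration")  # -> on-prem-hci
```

    Note the security flag: as the IDG numbers suggest, the cheaper cloud option loses out here precisely because it fails a security requirement.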

    Too bad it’s not that simple. Indeed, if there were a template for this kind of functionality, there wouldn’t be much left for IT developers and managers to do. Instead, we’re seeing the rise of a new generation of data center optimization services in which consultants and legacy data equipment vendors devise customized approaches to the enterprise’s particular data needs. According to Persistence Market Research, these offerings are segmenting in multiple ways: by service type (storage, networking, virtualization), by industry vertical, and along regional lines to accommodate compliance, data residency and other requirements.

    A key element in the drive toward the optimized data environment is power consumption, which has led to the rise of Data Center Infrastructure Management (DCIM) and related platforms. But as FNT Software consultant Oliver Lindner noted to Datacenter Journal recently, optimization is more than a matter of uniting IT with facilities management. The pace of technological development is so rapid these days that by the time an overarching approach has been designed and implemented, it is already obsolete. Instead, organizations should adopt an ongoing operational strategy that identifies and corrects problems for all key stakeholders, such as breaking down data silos, embracing digital transformation, and focusing on successful user experiences.
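    As a concrete example of that kind of ongoing check, the sketch below computes power usage effectiveness (PUE, total facility power divided by IT equipment power, the efficiency metric DCIM platforms commonly track) and flags drift past a target rather than assuming a one-time design goal still holds. The telemetry samples and thresholds are assumptions for illustration:

```python
# Sketch of an ongoing DCIM-style check: compute PUE from telemetry
# samples and flag drift past a target instead of trusting a one-time
# design figure. The sample data and thresholds are hypothetical.

def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Power usage effectiveness = total facility power / IT equipment power."""
    if it_load_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_load_kw

def check_pue(samples, target=1.5, tolerance=0.1):
    """Yield (timestamp, pue) for any sample drifting past target + tolerance."""
    for s in samples:
        value = pue(s["facility_kw"], s["it_kw"])
        if value > target + tolerance:
            yield s["timestamp"], round(value, 2)

samples = [
    {"timestamp": "T00:00", "facility_kw": 900.0, "it_kw": 620.0},  # PUE ~1.45, OK
    {"timestamp": "T01:00", "facility_kw": 940.0, "it_kw": 560.0},  # PUE ~1.68, drift
]
for ts, value in check_pue(samples):
    print(f"PUE drift at {ts}: {value}")
```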

    This isn’t to say organizations should ignore the power equation in their optimization strategies. Chatsworth Products (CPI) recently released a set of best practices to help align resource consumption with thermal management and other environmental controls to drive power efficiency to new levels. These include airflow adjustments such as vertical exhaust ducts; increased use of blanking (filler) panels, air dams and other tools to better align power and heat densities; and intelligent power and cooling monitoring to support a more proactive approach to load balancing and resource utilization.
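    By way of illustration, a proactive monitor along the lines CPI describes might compare rack intake temperatures against the ASHRAE-recommended ceiling of 27°C and flag racks for rebalancing before hot spots develop. The rack readings and safety margin below are assumptions, not CPI’s actual tooling:

```python
# Illustrative proactive thermal check: flag racks whose intake
# temperature is approaching the ASHRAE-recommended ceiling so load
# can be rebalanced early. Readings and margin are hypothetical.

ASHRAE_MAX_INTAKE_C = 27.0  # upper bound of the ASHRAE recommended range

def racks_to_rebalance(readings, margin_c=2.0):
    """Return rack IDs whose intake temp is within margin_c of the ceiling."""
    return [
        rack_id
        for rack_id, intake_c in readings.items()
        if intake_c >= ASHRAE_MAX_INTAKE_C - margin_c
    ]

readings = {"rack-a01": 23.5, "rack-a02": 26.1, "rack-b07": 25.4}
print(racks_to_rebalance(readings))  # -> ['rack-a02', 'rack-b07']
```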

    A fully optimized data ecosystem, of course, is a constantly moving target. Once a certain level of service has been achieved, the push begins for the next level. But with much of the world’s data still sitting on legacy physical systems, it is fair to say that many organizations will see substantial gains simply through greater deployment of virtual, cloud and software-defined infrastructure.

    An entirely new generation of data users, in fact, is demanding more data, better service and increasingly sophisticated applications, and enterprises that cannot fulfill those desires in an optimal, cost-effective manner will simply cease to exist before too long.

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.

