Mixing It Up in the Storage Farm


    The enterprise has more options than ever for building storage infrastructure. This has the benefit of providing a more tailored, responsive environment for varying data requirements, but it also poses the risk of undue complexity that could ultimately lead to lost productivity or even lost data.

    But with disk, tape, solid-state and even optical solutions all clamoring for a place in the storage farm, how are IT executives supposed to know what the proper mix should be going forward?

    According to former Germane Systems VP of Engineering Jim O’Reilly, advanced tiering will be critical to storage environments going forward, coupled with a robust management platform capable of assigning data to its proper place. Solid-state storage is often viewed as a Tier 1 (or Tier 0) solution, but this shouldn’t imply that it is a superior medium—just a faster one that can deliver critical data at a moment’s notice. Other tiers may sacrifice on speed, but they provide more feature-rich environments to meet increasingly diverse requirements. As well, lower tiers will likely consist of mixed media in order to meet a wide variety of I/O, bulk storage and other needs.

    Diversity of storage also increases dramatically once the cloud is brought in, says Markley Group’s Devon Cutchins. But when using public resources for, say, backup, media considerations take a back seat to the particular features and performance required of the data, which means the enterprise will need to start classifying data along various usage profiles. Typically, this will be some form of hot, cold or warm designation that will allow volumes to be placed according to availability or recovery needs or on the lowest cost basis for long-term archives. As well, data classification will have to take into account policy and governance issues, such as security, access and compliance.
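    The hot/warm/cold placement logic Cutchins describes can be sketched in a few lines. The following is an illustrative example only; the field names, thresholds, and tier labels are assumptions for demonstration, not any vendor's actual API or policy engine.

    ```python
    from dataclasses import dataclass
    from datetime import datetime, timedelta

    @dataclass
    class Volume:
        name: str
        last_accessed: datetime
        accesses_per_day: float
        compliance_hold: bool = False  # hypothetical governance flag (e.g., retention)

    def classify(vol: Volume, now: datetime) -> str:
        """Assign a hot/warm/cold designation from simple access heuristics."""
        idle = now - vol.last_accessed
        if vol.accesses_per_day >= 10 or idle < timedelta(days=1):
            return "hot"       # belongs on the fastest (solid-state) tier
        if idle < timedelta(days=30):
            return "warm"      # mid-tier disk
        return "cold"          # lowest-cost long-term archive

    def place(vol: Volume, now: datetime) -> str:
        """Map a classification to a target tier, honoring governance first."""
        tier = classify(vol, now)
        # Policy can override temperature: compliance data may be pinned
        # to an immutable archive regardless of how cold it is.
        if vol.compliance_hold and tier == "cold":
            return "archive-immutable"
        return {"hot": "tier0-flash", "warm": "tier1-disk", "cold": "tier2-archive"}[tier]

    now = datetime(2015, 6, 1)
    vol = Volume("sales-q1", last_accessed=now - timedelta(days=90), accesses_per_day=0.01)
    print(place(vol, now))  # → tier2-archive
    ```

    The key design point is that governance checks run after temperature classification, so policy constraints always win over cost or performance placement.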

    And expect things to get even more complicated once the era of Big Data kicks into high gear, says Gigaom Research’s Ashar Baig. Before long, it will be common practice for businesses to routinely analyze current data against historical sets in order to gain perspective on market trends and other developments. This means even cold data will need to be pulled up from time to time, which places it on a more active tier, usually at higher cost. Optimizing this dynamic environment so that the enterprise maintains maximum performance while still keeping the lid on the storage budget will be a top priority going forward.
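    The cost dynamic Baig points to can be made concrete with a back-of-the-envelope calculation: each time cold data is promoted to an active tier for analysis, the blended monthly cost rises. The per-GB prices below are made-up figures purely for illustration.

    ```python
    # Illustrative per-GB monthly storage prices (assumed, not real vendor rates).
    TIER_COST_PER_GB_MONTH = {"flash": 0.30, "disk": 0.05, "archive": 0.01}

    def monthly_cost(size_gb: float, base_tier: str,
                     promotions_per_month: int = 0,
                     promo_days: int = 0,
                     promo_tier: str = "disk") -> float:
        """Blend the base tier cost with time spent on a more active tier."""
        base = size_gb * TIER_COST_PER_GB_MONTH[base_tier]
        # Fraction of the month the data sits on the active tier.
        active_fraction = min(1.0, promotions_per_month * promo_days / 30)
        uplift = size_gb * (TIER_COST_PER_GB_MONTH[promo_tier]
                            - TIER_COST_PER_GB_MONTH[base_tier])
        return base + active_fraction * max(0.0, uplift)

    # A 10 TB historical set, pulled onto disk for 3 days once a month
    # for trend analysis, versus left untouched in the archive:
    print(round(monthly_cost(10_000, "archive"), 2))                                    # → 100.0
    print(round(monthly_cost(10_000, "archive", promotions_per_month=1, promo_days=3), 2))  # → 140.0
    ```

    Even a brief monthly recall raises the bill here by 40 percent, which is why optimizing promotion schedules becomes a budget priority once historical data turns active.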

    When it comes to building the new storage infrastructure, the newest piece is the solid-state or Flash array, and many organizations are still wondering how it should be configured. According to the UK Register’s Chris Mellor, all the top solutions offer a number of key attributes, such as multiple controllers, an internal backplane or fabric, and enough bandwidth to handle a high number of I/O requests. Most are also designed for scale-up, rather than scale-out, which may seem like a disconnect in the era of hyperscale and cluster-based architectures, but Mellor contends that scale-up has actually demonstrated superior reliability and performance so far.

    The days of “store it and forget it” are quickly coming to a close, even for long-term archives. Future storage environments will be more active, agile and dynamic in order to better match the enhanced capabilities of compute and networking infrastructure.

    In all likelihood, storage will continue to consume the lion’s share of the IT budget, but it will also be more efficient on a cost-per-GB basis and deliver higher performance for emerging data-driven business processes.

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata, Carpathia and NetMagic.
