    HDS Extends Data Storage Control to Public Clouds

    The biggest objection to public cloud computing is not the idea of relying on external IT resources so much as the lack of control organizations have over them.

    To address that issue, Hitachi Data Systems (HDS) this week announced it is adding tools to the Hitachi Content Platform (HCP) that allow IT organizations to tier data across private and public clouds.

    Peter Sjoberg, CTO of content products at HDS, says updates to Hitachi Content Platform Anywhere (HCP Anywhere) and the Hitachi Data Ingestor (HDI) now extend the reach of Hitachi cloud file controllers and directors to public clouds managed by Google, Microsoft and Amazon Web Services (AWS). Support for additional public cloud computing environments is also planned, says Sjoberg.

    Sjoberg says HCP is built on an object storage system that can be exposed to any number of file systems via RESTful APIs. That allows organizations to use HCP both to manage legacy application environments based on file systems and to support next-generation cloud applications that take advantage of object storage directly.
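    To make that concrete, here is a minimal sketch of what reading and writing objects through a RESTful object storage API looks like. The endpoint, path layout and bearer-token authentication are hypothetical placeholders for illustration, not HCP's documented interface.

    ```python
    # Minimal sketch: objects addressed by path, manipulated with HTTP verbs.
    # The base URL and Authorization header are hypothetical placeholders,
    # not HCP's actual endpoint or authentication scheme.
    import requests

    BASE_URL = "https://ns1.tenant.hcp.example.com/rest"  # hypothetical
    HEADERS = {"Authorization": "Bearer EXAMPLE_TOKEN"}   # placeholder

    def put_object(path: str, payload: bytes) -> None:
        """PUT writes an object; the path doubles as the object key."""
        resp = requests.put(f"{BASE_URL}/{path}", data=payload, headers=HEADERS)
        resp.raise_for_status()

    def get_object(path: str) -> bytes:
        """GET reads the object back through the same interface."""
        resp = requests.get(f"{BASE_URL}/{path}", headers=HEADERS)
        resp.raise_for_status()
        return resp.content
    ```

    Because both file-based and object-based applications can be funneled through the same HTTP interface, the same data can be surfaced to legacy and cloud-native workloads alike.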

    Sjoberg says the challenge HDS is addressing is transparency: HCP now allows organizations to read and write data to public clouds as seamlessly as they currently do to private clouds. The end result is a hybrid cloud computing environment in which end users can rely on HCP Anywhere to securely access and synchronize files wherever those files are located.
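    That transparency is easiest to picture as a thin routing layer that hides which tier actually holds an object. The sketch below is illustrative only: the Tier and HybridStore classes and the 90-day aging rule are invented for the example and do not describe HCP's internal logic.

    ```python
    # Illustrative only: a routing layer that hides which tier holds the data.
    from datetime import datetime, timedelta

    class Tier:
        """In-memory stand-in for one backend (private array or public cloud)."""
        def __init__(self, name: str):
            self.name = name
            self.objects: dict[str, bytes] = {}

        def put(self, key: str, data: bytes) -> None:
            self.objects[key] = data

    class HybridStore:
        """Writes according to policy; reads transparently across tiers."""
        def __init__(self, private: Tier, public: Tier):
            self.private, self.public = private, public

        def put(self, key: str, data: bytes, last_accessed: datetime) -> None:
            # Hypothetical policy: data untouched for 90+ days goes public.
            cold = datetime.now() - last_accessed > timedelta(days=90)
            (self.public if cold else self.private).put(key, data)

        def get(self, key: str) -> bytes:
            # Callers never learn which tier served the read.
            for tier in (self.private, self.public):
                if key in tier.objects:
                    return tier.objects[key]
            raise KeyError(key)
    ```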

    When all the emotional issues surrounding public clouds are peeled away, what most organizations are really left with is a data governance issue. As data storage systems continue to evolve, it’s clear that in a lot of use cases a public cloud is really just a virtual storage array. The data placed in that public cloud simply needs to be stored in a way that preserves its context with the rest of the enterprise.

    In that scenario, most IT organizations really need a mechanism for extending control over their internal IT systems to external cloud services. Once that’s accomplished, most end users will be hard pressed to tell where their data is being accessed or stored at any given moment. All they’ll know is that the internal IT organizations responsible for securing and governing that data made the decision for them based not only on how many copies of the data already exist, but also on the actual business value of that data to the organization.
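    As a sketch of what that decision logic might look like, a governance rule could weigh replica count against business value before placing data; the choose_tier function and its thresholds below are invented for illustration.

    ```python
    def choose_tier(existing_copies: int, business_value: str) -> str:
        """Hypothetical placement rule: keep high-value or sparsely
        replicated data private; tier everything else to public cloud."""
        if business_value == "high" or existing_copies < 2:
            return "private"
        return "public"

    # Example: a second copy of low-value data can live in a public cloud.
    assert choose_tier(existing_copies=2, business_value="low") == "public"
    ```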

    Mike Vizard
    Michael Vizard is a seasoned IT journalist, with nearly 30 years of experience writing and editing about enterprise IT issues. He is a contributor to publications including Programmableweb, IT Business Edge, CIOinsight and UBM Tech. He formerly was editorial director for Ziff-Davis Enterprise, where he launched the company’s custom content division, and has also served as editor in chief for CRN and InfoWorld. He also has held editorial positions at PC Week, Computerworld and Digital Review.
