
    Storage Development Proceeding on Multiple Fronts


    Data is not like real estate: Someone somewhere is definitely making more and more of it.

    But while the rising tide of data is contributing mightily to the increased productivity of the world economy, it also puts a strain on worldwide storage infrastructure – so much so that development of new storage solutions is running at a fever pitch.

    A main contributor to the data increase is the Internet of Things (IoT), which so far has given us only the tiniest trickle of the flood that is to come. At the moment, most IoT-related infrastructure is gravitating toward object storage, says Caringo CEO Jonathan Ring, given its ability to provide high scalability and flexibility at low cost. This will be crucial for the literally billions of sensor-driven data streams that are expected by the end of the decade, as well as all the metadata that needs to be generated in order to turn those raw feeds into actionable intelligence. As an inherently software-defined approach, object storage also functions more efficiently with the virtualized compute and networking infrastructure that is quickly taking over the data center.
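    To make the idea concrete, the snippet below is a minimal sketch of how a single IoT sensor reading might be written to an S3-compatible object store along with user metadata that downstream analytics can search on. The endpoint, credentials, bucket and key names are illustrative assumptions, not details of any particular vendor's deployment.

```python
# Minimal sketch: store one IoT sensor reading as an object, with
# searchable user metadata attached. Endpoint, bucket and credentials
# below are hypothetical placeholders.
import json
import time

import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://objects.example.com",  # hypothetical S3-compatible gateway
    aws_access_key_id="EXAMPLE_KEY",
    aws_secret_access_key="EXAMPLE_SECRET",
)

reading = {"sensor_id": "temp-0042", "celsius": 21.7, "ts": time.time()}

s3.put_object(
    Bucket="iot-telemetry",                       # hypothetical bucket
    Key=f"temp-0042/{int(reading['ts'])}.json",   # one object per reading
    Body=json.dumps(reading).encode("utf-8"),
    Metadata={                                    # user metadata stored with the object
        "sensor-id": reading["sensor_id"],
        "unit": "celsius",
    },
)
```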

    On the physical layer, however, even object storage relies on the two-dimensional storage architectures that have existed since the earliest tape drives. But what if there were a way to utilize 3D space for storage? That is what University of New Mexico professor Jeff Rack is attempting to do with a new laser-based system that would allow data to reside inside a storage medium, not just on the surface. Rack is currently investigating various photonic and photochromic materials to determine which ones provide the most resilience when altered by beams of light. Adding to the challenge, of course, is the need for the system to function predictably and at room temperature, which is even more difficult considering the lack of test and measurement gear for this sort of work.

    A better medium is not the only factor in an enhanced storage system, however. When data loads mount, storage management becomes just as important. According to ioFabric’s Greg Wyman, the ideal solution would be storage in any amount that can be provisioned as easily as turning on a faucet. Such an environment is possible, but it will require a wealth of technologies, including intelligent converged infrastructure, software-defined storage, advanced QoS management and containers. But the beauty of it is that it can be layered atop virtually any platform to finally allow techs to focus on finding innovative solutions to pressing problems rather than simply managing storage.
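    As a rough illustration of that "storage on tap" idea, the sketch below models a hypothetical software-defined storage control plane that hands out a thinly provisioned volume with a QoS policy attached. Every class, method and parameter name here is invented for illustration; it is not the ioFabric API or any other real product interface.

```python
# Purely illustrative "storage on tap" sketch: a stand-in control
# plane provisions a thin volume with a QoS policy in one call.
# All names and behavior are hypothetical.
from dataclasses import dataclass


@dataclass
class QosPolicy:
    min_iops: int
    max_iops: int
    burst_iops: int


class SdsController:
    """Stand-in for a software-defined storage control plane."""

    def provision_volume(self, name: str, size_gib: int, qos: QosPolicy) -> str:
        # A real controller would carve the volume out of a pooled back end
        # (converged nodes, containers, cloud tiers) and return an identifier
        # the compute layer can mount; here we only simulate the request.
        print(f"provisioning {size_gib} GiB volume '{name}' "
              f"({qos.min_iops}-{qos.max_iops} IOPS, burst {qos.burst_iops})")
        return f"vol-{name}"


controller = SdsController()
vol_id = controller.provision_volume(
    "analytics-scratch",
    size_gib=500,
    qos=QosPolicy(min_iops=1000, max_iops=5000, burst_iops=10000),
)
```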

    Many data users are also turning to shared storage in the cloud to support their digital activities, but this presents a host of access, reliability and security issues for the enterprise. To that end, researchers like Texas A&M’s Dr. Jennifer Welch are studying new ways to maintain consistent conditions across shared storage architectures, particularly ones that are hosted on distributed cloud platforms. A key requirement in shared storage is the need to provide parallel, rather than sequential, data access to multiple users, which can be accomplished by either relaxing the consistency requirements for return values following a particular execution or developing new algorithms that enable non-linear operations in an otherwise sequential data queue. Either way, the challenge is to maintain this more flexible operational paradigm without introducing latency or sacrificing data integrity.
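    The trade-off can be shown with a toy example: a strongly consistent register serializes every access behind a lock, so readers never see a stale value but must wait their turn, while a relaxed register answers reads from a replica that lags the primary copy, trading a brief window of staleness for parallel, non-blocking access. The classes below are a simplified sketch of that contrast, not an implementation of the algorithms under study.

```python
# Toy contrast between a strongly consistent shared register (serialized
# behind a lock) and a relaxed one (reads served from a lagging replica).
import threading
import time


class StrongRegister:
    """Every read and write is serialized, so readers never see stale data."""

    def __init__(self):
        self._lock = threading.Lock()
        self._value = None

    def write(self, value):
        with self._lock:
            self._value = value

    def read(self):
        with self._lock:
            return self._value


class RelaxedRegister:
    """Writes land on a primary copy; reads hit a replica that lags slightly."""

    def __init__(self, lag_seconds=0.05):
        self._primary = None
        self._replica = None
        self._lag = lag_seconds

    def write(self, value):
        self._primary = value
        # The replica catches up asynchronously; readers are never blocked,
        # but may observe the old value during the lag window.
        threading.Timer(self._lag, self._sync).start()

    def _sync(self):
        self._replica = self._primary

    def read(self):
        return self._replica


reg = RelaxedRegister()
reg.write("v1")
print(reg.read())   # may still be None: the replica has not caught up yet
time.sleep(0.1)
print(reg.read())   # "v1" once the replica has synchronized
```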

    The technology industry has been caught in a pattern that is not likely to let up any time soon: The more it provides, the more users want. With every advance in processing, storage or networking, there is invariably a developer somewhere ready to push things to the next level.

    The paradigm shift in data activity is already occurring. The question remains whether storage capabilities can keep pace without pushing either costs or complexity to unsustainable levels.

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.


