
    In-Memory Solutions: Confronting the Big Data Challenge

    Enterprises are scrambling to come up with ways to scale their infrastructure to meet the demands of Big Data and other high-volume initiatives. Many are turning to the cloud for support, which ultimately puts cloud providers under the gun to enable the hyperscale infrastructure that will be needed by multiple Big Data clients.

    Increasingly, organizations are turning to in-memory solutions as a means to provide both the scale and flexibility of emerging data platforms like Hadoop. Heavy data loads have already seen a significant performance boost with the introduction of Flash in the storage farm and in the server itself, and the ability to harness non-volatile RAM and other forms of memory into scalable fabrics is quickly moving off the drawing board, according to Evaluator Group’s John Webster. In essence, the same cost/benefit calculus that solid state brought to the storage farm is working its way into the broader data infrastructure. And with platforms like SAP HANA hitting the channel, it is becoming relatively simple to host entire databases in memory for real-time performance and other benefits while still maintaining persistent state in traditional storage.

    For example, a company called Altibase has developed what it calls an in-memory hybrid database, ALTIBASE HDB, which integrates an in-memory relational database with a traditional on-disk engine to deliver both the speed and durability that emerging analytics functions require. At the same time, it offers an easy migration path from legacy platforms to high-speed, scale-out infrastructure. The company is also preparing a free version of the software for release within the quarter.
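
    To make the hybrid idea concrete, here is a minimal sketch in Python – illustrative only, not Altibase’s actual API or storage engine – of a store that serves every read from RAM while appending every write to a disk log, so state survives a restart:

        import json

        class HybridStore:
            """Toy hybrid store: memory-speed reads, disk-backed durability."""

            def __init__(self, log_path="store.log"):
                self.log_path = log_path
                self.mem = {}                      # in-memory working set
                try:                               # rebuild memory state from the log
                    with open(log_path) as f:
                        for line in f:
                            key, value = json.loads(line)
                            self.mem[key] = value
                except FileNotFoundError:
                    pass

            def put(self, key, value):
                self.mem[key] = value              # fast path: update RAM
                with open(self.log_path, "a") as f:
                    f.write(json.dumps([key, value]) + "\n")  # durable path: append to disk

            def get(self, key):
                return self.mem.get(key)           # reads never touch the disk

        store = HybridStore()
        store.put("trade:1001", {"symbol": "ACME", "qty": 500})
        print(store.get("trade:1001"))             # served entirely from memory

    Real hybrid databases add transactions, compaction and tiering policies on top of this pattern, but the division of labor – RAM for speed, disk for persistence – is the same.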

    As well, BlueData is out with an in-memory module for its EPIC platform that runs high-speed Hadoop, HBase and other cluster workloads on the open source Tachyon distributed storage environment. The system is targeted at applications like trading, fraud detection and systems monitoring that require both high scale and real-time performance, with the added benefit of providing a unified Tachyon environment from the start rather than a cluster-by-cluster deployment. In this way, the company says, it can “democratize” Big Data infrastructure by removing cost and complexity as barriers to adoption.
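
    Because Tachyon exposes a Hadoop-compatible file system, cluster frameworks such as Spark can read data held in its distributed memory directly. The sketch below is hypothetical – the host, port and file path are assumptions, and it presumes a Spark build with Tachyon support – showing a fraud-detection-style filter over a memory-resident dataset:

        from pyspark import SparkContext

        sc = SparkContext(appName="TachyonExample")

        # Load a dataset held in Tachyon's distributed memory rather than on disk
        # (the tachyon:// host, port and path here are illustrative assumptions)
        events = sc.textFile("tachyon://tachyon-master:19998/data/trades.csv")

        # Flag unusually large trades; the scan runs against memory-resident data
        large_trades = events.filter(lambda line: float(line.split(",")[2]) > 1000000)
        print(large_trades.count())

        sc.stop()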

    Still, the enterprise should avoid thinking about Big Data in terms of a single solution, says InfoStructure Associates’ Wayne Kernochan. In-memory brings many advantages to the table, but so do standard Hadoop/NoSQL approaches, virtualized databases, and streaming and even columnar solutions. In-memory provides a flattened fabric architecture, while non-relational systems like NoSQL can accommodate higher degrees of scale without sacrificing performance. Virtualization, meanwhile, can do for the database what it has done for the rest of the data environment: introduce a level of abstraction that improves management and functionality. And these solutions are not mutually exclusive – combinations such as virtual, columnar platforms riding atop an in-memory infrastructure will likely be the order of the day, with the caveat that each solution will be both enhanced and diminished by the strengths and weaknesses of the others.
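
    As a rough illustration of why a columnar layout pays off on top of an in-memory fabric (a sketch, not any vendor’s product), compare the same table stored row-wise versus column-wise in memory; the columnar form lets an analytic query scan only the columns it needs:

        rows = [
            {"id": 1, "region": "EU", "revenue": 120.0},
            {"id": 2, "region": "US", "revenue": 340.0},
            {"id": 3, "region": "EU", "revenue": 95.0},
        ]

        # Row store: every filter or aggregate walks whole records
        total_rows = sum(r["revenue"] for r in rows if r["region"] == "EU")

        # Column store: each column is a contiguous array, scanned independently,
        # so a query over two columns never touches the "id" data at all
        columns = {
            "id": [1, 2, 3],
            "region": ["EU", "US", "EU"],
            "revenue": [120.0, 340.0, 95.0],
        }
        total_cols = sum(
            rev for region, rev in zip(columns["region"], columns["revenue"])
            if region == "EU"
        )

        assert total_rows == total_cols == 215.0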

    The good news in all of this is that Big Data and scale-out architectures do not have to be met with more infrastructure, but rather with a reimagining of the data architectures that have long served the enterprise data center. These solutions will carry a premium and involve a fair amount of complexity at first, but they should gradually become more functional as they enter the mainstream.

    In the meantime, it helps to know that solutions like in-memory architectures are already mature enough to step in when Big Data comes calling.

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata, Carpathia and NetMagic.

