
In-Memory Solutions: Confronting the Big Data Challenge


Written By
Arthur Cole
Feb 3, 2015

Enterprises are scrambling to come up with ways to scale their infrastructure to meet the demands of Big Data and other high-volume initiatives. Many are turning to the cloud for support, which ultimately puts cloud providers under the gun to enable the hyperscale infrastructure that will be needed by multiple Big Data clients.

Increasingly, organizations are turning to in-memory solutions as a means of providing both the scale and the flexibility that emerging data platforms like Hadoop demand. Heavy data loads have already seen a significant performance boost from the introduction of flash in the storage farm and in the server itself, and the ability to harness non-volatile RAM and other forms of memory into scalable fabrics is quickly moving off the drawing board, according to Evaluator Group’s John Webster. In essence, the same cost/benefit ratio that solid state brought to the storage farm is working its way into the broader data infrastructure. And with platforms like SAP HANA hitting the channel, it is becoming a simple matter to host entire databases in memory for real-time performance and other benefits while still maintaining persistent state on traditional storage.
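To make that last idea concrete, consider the toy sketch below: the working dataset lives entirely in RAM for memory-speed reads, while every write also goes through to a disk log so the data survives a restart. This is a bare-bones illustration of the write-through pattern, not SAP HANA’s actual architecture; the class, method and file names are invented for the example.

```python
# Minimal sketch: serve reads from RAM, persist writes to a disk log.
import json
import os

class InMemoryStore:
    def __init__(self, log_path="store.log"):
        self.log_path = log_path
        self.data = {}                      # the working copy lives in RAM
        if os.path.exists(log_path):        # rebuild state from the disk log
            with open(log_path) as f:
                for line in f:
                    key, value = json.loads(line)
                    self.data[key] = value

    def put(self, key, value):
        # Write-through: persist to disk first, then update the RAM copy.
        with open(self.log_path, "a") as f:
            f.write(json.dumps([key, value]) + "\n")
        self.data[key] = value

    def get(self, key):
        return self.data.get(key)           # memory-speed read, no disk I/O

store = InMemoryStore()
store.put("orders:1001", {"amount": 250.0, "status": "shipped"})
print(store.get("orders:1001"))
```

Real platforms add transactions, indexing and replication on top, but the division of labor is the same: memory for speed, disk for durability.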

For example, a company called Altibase has developed what it calls an in-memory hybrid database, the ALTIBASE HDB, which integrates an in-memory relational database with traditional on-disk storage to deliver both the speed and the persistence that emerging analytics functions require. At the same time, it offers an easy migration path from legacy platforms to high-speed, scale-out infrastructure. In fact, the company is preparing a free version of the software for release within the quarter.
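The hybrid approach boils down to tiering: keep the hot working set in memory and let colder records fall back to disk behind a single lookup interface. The sketch below is a deliberately naive version of that idea, with an invented eviction policy and invented names; it is not Altibase’s design or API.

```python
# Toy two-tier store: hot entries in a dict, cold entries in an on-disk shelf.
import shelve

class HybridStore:
    def __init__(self, disk_path="cold.db", hot_limit=1000):
        self.hot = {}                          # in-memory tier (hot data)
        self.cold = shelve.open(disk_path)     # on-disk tier (cold data)
        self.hot_limit = hot_limit

    def put(self, key, value):
        self.hot[key] = value
        if len(self.hot) > self.hot_limit:
            # Spill the oldest in-memory entry to disk (naive eviction).
            old_key = next(iter(self.hot))
            self.cold[old_key] = self.hot.pop(old_key)

    def get(self, key):
        if key in self.hot:                    # memory-speed path
            return self.hot[key]
        value = self.cold.get(key)             # fall back to disk
        if value is not None:
            # Promote back into memory (a stale disk copy may remain;
            # acceptable for a sketch, not for a real database).
            self.put(key, value)
        return value

db = HybridStore()
db.put("sensor:42", 98.6)
print(db.get("sensor:42"))                     # served from RAM
```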

As well, BlueData is out with an in-memory module for its EPIC platform that enables high-speed Hadoop, HBase and other cluster solutions on the open source Tachyon distributed storage environment. The system is targeted at key applications like trading, fraud detection and system monitoring that require both high scale and real-time performance, with the added benefit that it provides a unified Tachyon environment from the start, rather than a cluster-by-cluster deployment. In this way, the company says, it can “democratize” Big Data infrastructure by removing cost and complexity as barriers to adoption.

Still, the enterprise should avoid thinking about Big Data in terms of a single solution, says InfoStructure Associates’ Wayne Kernochan. In-memory brings many advantages to the table, but so do standard Hadoop/NoSQL approaches, virtualized databases, and streaming and even columnar solutions. In-memory provides a flattened fabric architecture, while non-relational systems like NoSQL can accommodate higher degrees of scale without sacrificing performance. As well, virtualization can do for the database what it has done for the rest of the data environment, namely, introduce a level of abstraction that improves management and functionality. And these solutions are not mutually exclusive, either: combinations such as virtual, columnar platforms riding atop an in-memory infrastructure will likely be the order of the day, with the caveat that each solution will be both enhanced and diminished by the strengths and weaknesses of the others.
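Why columnar and in-memory pair so well is easy to see in miniature: an analytic aggregate over one field only has to scan that field’s array, which sits contiguously in memory, rather than walking every full record. The data and field names below are made up for illustration.

```python
# Row-oriented: the aggregate must touch every whole record.
rows = [
    {"id": 1, "region": "east", "sales": 100.0},
    {"id": 2, "region": "west", "sales": 250.0},
    {"id": 3, "region": "east", "sales": 175.0},
]
total_row = sum(r["sales"] for r in rows)

# Column-oriented: the same table pivoted into one array per column.
columns = {
    "id": [1, 2, 3],
    "region": ["east", "west", "east"],
    "sales": [100.0, 250.0, 175.0],
}
total_col = sum(columns["sales"])   # scans one contiguous array only

assert total_row == total_col == 525.0
```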

The good news in all of this is that Big Data and scale-out architectures do not have to be met with more infrastructure, but rather with a reimagining of the data architectures that have long served the enterprise data center. These solutions will come at a premium and involve a fair amount of complexity at first, but they will gradually become more functional as they enter the mainstream.

In the meantime, it helps to know that solutions like in-memory architectures are already mature enough to step in when Big Data comes calling.

Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata, Carpathia and NetMagic.
