As memory becomes both less expensive and more pervasive, the number of applications designed to run entirely in memory will increase sharply in the months ahead.
Case in point is the latest release of Pentaho Business Analytics, which can now run in memory. According to Pentaho chief technology evangelist Ian Fyfe, Pentaho Business Analytics builds a distributed cluster that is loaded entirely into memory. To the application, that cluster looks like a standard database, but because it runs in memory, response times are measured in seconds.
Fyfe says that while data management frameworks such as Hadoop make it affordable to store lots of data, those systems are all batch-oriented. As a result, applications such as Pentaho need to load a subset of that data for the purpose of analytics. Pentaho, however, provides a way to analyze that data in seconds by loading it into memory rather than relying on a disk-based storage system.
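The general pattern described here can be sketched in a few lines. This is not Pentaho's actual API; it is a minimal, hypothetical illustration using Python's built-in SQLite engine, where a small extract of batch data (the schema and sales figures are made up) is loaded into an in-memory table and queried directly:

```python
import sqlite3

# Create a database that lives entirely in memory -- no disk I/O.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")

# In practice this extract would be pulled from a batch system such as
# Hadoop; here a few rows are inlined for demonstration.
rows = [("east", 120.0), ("west", 80.0), ("east", 50.0), ("west", 30.0)]
conn.executemany("INSERT INTO sales VALUES (?, ?)", rows)

# Because the table is in memory, analytic queries return immediately
# instead of waiting on a batch job.
totals = dict(conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region"
).fetchall())
print(totals)  # totals == {'east': 170.0, 'west': 110.0}
```

The same idea scales out in products like Pentaho's: the in-memory store is distributed across a cluster but still presents a standard database interface to the application.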
One of the major side benefits of that approach is that it reduces reliance on the database administrators who were previously needed to model the data in a database. Instead, the raw data to be analyzed is loaded directly into memory.
Better yet, says Fyfe, the number of analytics projects that can be run expands because the cost of the systems, along with the cost of the people needed to manage that data, is sharply reduced.
What all this means is that not only is analytics getting a lot faster, but the cost of ownership associated with those applications is dropping sharply. The implications of that capability for the business can be nothing short of profound, because with each new analytics application, the potential to run the business better increases dramatically.