While there has never been more memory available to speed the performance of applications, it turns out that one of the factors limiting access to all that memory is the design of programming languages such as Java.
Most programming languages were not designed at a time when anyone thought that hundreds of gigabytes of memory would be routinely available on x86-class servers. The assumption was that the application would always need to call out to disk. But now entire applications can theoretically run in memory. The challenge has been finding ways to make that happen.
It's that very issue that makes the folks at Terracotta, a subsidiary of Software AG, say that 2012 is going to be a break-out year for distributed caching technologies. In the past, demand for distributed caching has been limited to a select few high-end distributed applications. But as more IT organizations start to think of memory as "the new disk" for applications, they will bump into the current limitations of Java memory management.
They could, of course, look to other languages to create applications. But Mike Allen, vice president of product management for Terracotta, says that for mainstream commercial business applications, Java remains the standard. What Java developers will need, says Allen, is a distributed caching capability, provided by Terracotta, that not only makes all the memory on a single processor readily available, but also makes the memory distributed across multiple processors look like one logical entity.
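The core idea can be sketched in plain Java. The LogicalCache class and its modulo-based key routing below are illustrative assumptions, a toy model of presenting many nodes' memory as one key/value store, not Terracotta's actual API or implementation:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

/**
 * Toy sketch of a distributed cache: several "nodes" (each standing in
 * for the heap of one process or server) presented to the application
 * as a single logical key/value store. Purely illustrative; real
 * products add replication, eviction, and network transport.
 */
public class LogicalCache {
    // Each map stands in for the memory of one node in the cluster.
    private final List<Map<String, Object>> nodes = new ArrayList<>();

    public LogicalCache(int nodeCount) {
        for (int i = 0; i < nodeCount; i++) {
            nodes.add(new ConcurrentHashMap<>());
        }
    }

    // Route each key to one node so the combined heaps act as one store.
    private Map<String, Object> nodeFor(String key) {
        int idx = Math.floorMod(key.hashCode(), nodes.size());
        return nodes.get(idx);
    }

    public void put(String key, Object value) {
        nodeFor(key).put(key, value);
    }

    public Object get(String key) {
        return nodeFor(key).get(key);
    }
}
```

From the application's point of view, `put` and `get` behave exactly as they would against a single local map; which node actually holds a given entry is hidden behind the key-routing step.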
The implications of all that memory also go well beyond performance. As more of the underlying elements of an application, such as the database, run in memory alongside the business logic, the line between those software technologies starts to blur. That may encourage some developers to tie applications tightly to a given set of middleware, depending on how important they think it is that their applications remain distinct, portable entities that can run on multiple platforms.
No matter how you look at it, Allen says, there is now an impedance mismatch between the amount of memory that is available and what developers today can actually use. That untapped potential is going to lead developers not only to look for new ways to run existing applications much faster, but also to create a new generation of applications that correlate vast amounts of data in real time, which should ultimately lead to a renaissance in application development.