With the development of its High-Performance Analytic Appliance (HANA) platform, SAP is setting out to create a new type of mainstream computing platform built on in-memory computing.
At the SAP TechEd 2011 conference today, SAP is delivering a progress update in the form of two new offerings that illustrate the new classes of real-time applications that in-memory computing makes possible. The first is a smart meter analytics application; the second is an analytics application optimized for financial data.
According to Kijoon Lee, SAP senior director for in-memory computing and SAP HANA solution marketing, the technical impact of HANA comes down to three things. First, in-memory computing eliminates the need to pre-aggregate data, because the compute capacity to work on raw data is readily available. Second, it makes it much easier to embed analytics inside an application, rather than requiring users to invoke a separate analytics application. Finally, it significantly improves performance: as more business logic moves directly into the database, data no longer has to shuttle between tiers, and the line between the application and the database starts to blur.
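The first point is easiest to see in miniature. The sketch below (in generic Python, not any SAP API; the meter data and function name are hypothetical) computes totals directly from raw readings at query time, which is the kind of work that pre-aggregated summary tables and batch jobs traditionally did:

```python
from collections import defaultdict

# Raw, un-aggregated smart meter readings: (meter_id, watt_hours).
# In a pre-aggregation design, a batch job would periodically roll
# these up into a summary table; here we aggregate at query time.
readings = [
    ("meter-1", 1200),
    ("meter-2", 700),
    ("meter-1", 2100),
    ("meter-2", 1900),
]

def total_usage_by_meter(rows):
    """Compute per-meter totals directly from the raw rows."""
    totals = defaultdict(int)
    for meter_id, watt_hours in rows:
        totals[meter_id] += watt_hours
    return dict(totals)

print(total_usage_by_meter(readings))  # {'meter-1': 3300, 'meter-2': 2600}
```

When the raw data fits in memory and the compute capacity is there, this query-time aggregation replaces the pre-computed summaries, so results always reflect the latest readings.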
A big part of the reason for the performance improvement, says Lee, is that HANA is designed to optimize the handoffs between cache and memory, as opposed to other systems that simply optimize the handoffs between disk and memory.
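One common way in-memory systems exploit the cache is a column-oriented layout, where each column is stored as one contiguous buffer so that scanning a single column touches sequential memory. The sketch below is a generic illustration of that idea, not a description of HANA's internals:

```python
from array import array

# Row-oriented: each record is a separate Python object, scattered in memory.
rows = [{"id": i, "amount": i * 10} for i in range(5)]

# Column-oriented: each column is one contiguous buffer of 64-bit ints,
# so a scan over one column reads sequential memory and stays cache-friendly.
ids = array("q", (r["id"] for r in rows))
amounts = array("q", (r["amount"] for r in rows))

def sum_amounts_columnar(col):
    """Scan one contiguous column; only the bytes we need pass through cache."""
    return sum(col)

print(sum_amounts_columnar(amounts))  # 0 + 10 + 20 + 30 + 40 = 100
```

The behavioral payoff of this layout shows up in compiled code operating on real hardware caches; the Python version only makes the memory-layout contrast concrete.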
But the real value of HANA, adds Ken Tsai, SAP senior director of platform marketing, will only manifest once SAP and third-party developers start building a raft of real-time applications that were never feasible to produce before. These, Tsai says, will give rise to new classes of applications that significantly expand the critical role software plays in driving a multitude of business processes, because applications will soon routinely be able to sense actual patterns and then respond automatically.