Rather than asking customers to switch database platforms to converge transaction processing and analytics, IBM is making the case for a new, faster version of DB2 with BLU Acceleration that not only runs some queries up to a thousand times faster, but also now fully supports SAP applications.
Nancy Kopp-Hensley, director of strategy and marketing for IBM database systems, says the latest iteration of DB2 with BLU Acceleration, codenamed Cancun, makes extensive use of “shadow tables” to combine real-time analytics and transaction processing far more efficiently, without introducing additional application performance overhead or requiring a switch to an entirely new database architecture.
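The article does not detail how shadow tables work internally, but the general idea can be sketched as follows: transactional writes go to a row-organized table, while a column-organized copy of the same data is maintained asynchronously and used to answer analytic queries. The class and method names below are purely illustrative, not DB2 syntax or any DB2 API.

```python
# Toy sketch of the shadow-table idea (illustrative only, not DB2):
# a row store handles OLTP inserts, and a columnar shadow copy is
# refreshed periodically to serve analytic scans.

class ShadowedTable:
    def __init__(self, columns):
        self.columns = columns
        self.rows = []                            # row store: OLTP writes land here
        self.shadow = {c: [] for c in columns}    # column-organized shadow copy
        self.synced = 0                           # rows already replicated to the shadow

    def insert(self, row):
        """Transactional write path: touches only the row store."""
        self.rows.append(dict(zip(self.columns, row)))

    def refresh_shadow(self):
        """Asynchronous maintenance: replicate new rows column by column."""
        for r in self.rows[self.synced:]:
            for c in self.columns:
                self.shadow[c].append(r[c])
        self.synced = len(self.rows)

    def sum(self, column):
        """Analytic read path: scans a single column of the shadow copy."""
        return sum(self.shadow[column])

t = ShadowedTable(["order_id", "amount"])
t.insert((1, 100))
t.insert((2, 250))
t.refresh_shadow()
print(t.sum("amount"))  # → 350
```

The point of the design is that the analytic scan never touches the row store, so heavy queries do not slow the transactional write path; the trade-off, as the sketch makes visible, is that the shadow copy lags until the next refresh.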
For the foreseeable future, Kopp-Hensley notes, many organizations will need to run both traditional data warehouses and real-time analytics on the same database platform. As such, DB2 provides a platform that supports both models without constraining the database to the amount of memory available. That flexibility is critical, says Kopp-Hensley, at a time when not every database fits neatly in available memory, and when working out of the processor’s Level 2 and Level 3 caches can deliver better overall throughput than reaching out to main memory.
Finally, IBM is making instances of DB2 with BLU Acceleration available on smaller clusters, intended to make the platform accessible to a broader range of customers. In addition, IBM is adding support for Ethernet configurations, so IT organizations no longer need to invest in Fibre Channel to run DB2 with BLU Acceleration.
All in all, as the software that essentially manages access to both memory and disk, DB2 with BLU Acceleration is designed to abstract away much of the complexity of managing I/O performance without requiring IT organizations to go to the expense of replacing a row-oriented architecture with a columnar database such as the in-memory SAP HANA platform. IBM contends that the end result is the ability to converge analytics and transaction processing in memory with as little disruption as possible to the existing IT environment.