Few would dispute that we’ve entered the age of “Big Data.” Companies in virtually every industry are witnessing an exponential increase in data creation, with data taking many forms: analytical, transactional, operational, structured or unstructured.
Concurrently, concerns about where Big Data sits in the hype cycle have recently surfaced. However, the CIOs and business unit leaders I speak with daily universally agree that implementing Big Data strategies and solutions is not a question of whether they go down this road, but when and how they best navigate the journey. They grapple with questions such as these: How does my organization tap the tremendous potential of Big Data? How much do I have to invest in new infrastructure? Can I leverage the tools, technologies, systems and software that we’ve only recently deployed?
To that point, today’s most popular business intelligence (BI) systems — like SAP NetWeaver Business Warehouse (SAP NetWeaver BW) — remain in their prime, despite experiencing record levels of data growth from terabytes to petabytes and beyond. At the same time, newer tools such as SAP HANA promise to help transform this mounting data into a powerful organizational asset.
Perhaps the most significant Big Data challenge facing IT organizations and their end users is figuring out how to maximize the opportunity for real-time business intelligence on a scale never before possible, while minimizing the impact of exploding data volumes on productivity and total cost of ownership (TCO).
These organizations need to understand that implementing an efficient data management program is an important step that can help build a bridge from their current BI and data warehouse environments to SAP HANA. Creating this bridge can be accelerated with the adoption of two key strategies:
The promise of Big Data applications and SAP HANA’s seemingly endless performance capabilities do not negate the reality that turning massive amounts of data into insightful real-time intelligence requires a sound data management infrastructure. As more data accumulates in BW and ERP (SAP ECC) systems, one major threat to data access is the I/O performance bottleneck caused by traditional storage technology. SAP HANA eliminates this performance bottleneck by keeping data in memory; however, the cost of adding enough in-memory capacity to hold everything can be excessive.
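The cost argument behind tiering can be sketched with some back-of-the-envelope arithmetic. The per-terabyte figures below are purely hypothetical placeholders (not vendor pricing), chosen only to show how the economics shift when static data moves out of memory:

```python
# Back-of-the-envelope comparison: keeping all data in memory versus
# tiering the static share to nearline storage (NLS).
# The $/TB figures are HYPOTHETICAL assumptions, not vendor pricing.

COST_PER_TB_IN_MEMORY = 20_000   # assumed cost per TB of in-memory capacity
COST_PER_TB_NEARLINE = 500       # assumed cost per TB of NLS capacity


def tiered_cost(total_tb: float, static_fraction: float) -> float:
    """Capacity cost when the static share of the data lives on NLS."""
    hot_tb = total_tb * (1 - static_fraction)   # stays in memory
    cold_tb = total_tb * static_fraction        # moved to nearline
    return hot_tb * COST_PER_TB_IN_MEMORY + cold_tb * COST_PER_TB_NEARLINE


all_in_memory = 100 * COST_PER_TB_IN_MEMORY          # 100 TB held entirely in memory
with_nls = tiered_cost(100, static_fraction=0.7)     # 70% of the data archived to NLS

print(f"All in memory:   ${all_in_memory:,.0f}")
print(f"With NLS tiering: ${with_nls:,.0f}")
```

Under these illustrative assumptions, archiving the 70 percent of data that is rarely touched cuts the capacity bill by more than two-thirds; the exact ratio depends entirely on real pricing and on how much of a given warehouse is truly static.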
An innovative and cost-effective approach to solving this challenge, and a foundational step in paving the road to SAP HANA, is to move large amounts of static data to a low-cost, high-performance nearline storage (NLS) environment. New technologies, like the SAP HANA platform, provide a column-oriented, in-memory data appliance for fast, ad hoc data analytics. An efficient NLS system can complement the in-memory SAP HANA architecture, and because a company’s data archiving strategy continues after a system is moved to the SAP HANA platform, NLS remains an essential solution for managing data growth and maintaining optimal performance in the production system.
Whether companies continue to run their BW environments on existing platforms with a relational database management system and a BW accelerator, or plan to migrate to SAP HANA, it is critical to segregate frequently used, “high-value” information from data that provides lower business value.
By implementing an NLS solution to address Big Data challenges, companies can also prepare themselves to introduce SAP HANA into their environment. Before implementing SAP HANA, customers should shrink the amount of data they put into the in-memory system to ensure they are not bloating SAP HANA with old or useless data. This starts with putting “value” on data and creating a plan to archive older, seldom-used data that’s not critical to ongoing, real-time operations.
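In practice, that valuation step often reduces to a simple rule over access recency. The sketch below is one minimal, hypothetical way to express such a plan; the record fields and the 24-month threshold are assumptions for illustration, not a prescribed SAP configuration:

```python
# Illustrative data-valuation pass: records untouched for longer than a
# retention threshold are flagged for nearline archiving.
# Field names and the ~24-month threshold are assumptions for illustration.
from datetime import date, timedelta

ARCHIVE_AFTER = timedelta(days=730)  # assume ~24 months defines "seldom-used"


def plan_archiving(records, today):
    """Split records into those kept in the active system and
    those flagged for the nearline storage tier."""
    keep, archive = [], []
    for rec in records:
        if today - rec["last_accessed"] > ARCHIVE_AFTER:
            archive.append(rec)
        else:
            keep.append(rec)
    return keep, archive


records = [
    {"id": 1, "last_accessed": date(2012, 11, 1)},  # recently used: stays active
    {"id": 2, "last_accessed": date(2009, 3, 15)},  # stale: flagged for NLS
]
keep, archive = plan_archiving(records, today=date(2013, 1, 1))
```

Real archiving plans layer business rules on top of recency (legal retention periods, document type, module), but the principle is the same: decide value first, then size the in-memory system around only the data that earns its place there.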
Meanwhile, data archiving continues to be an essential part of this data management process, and an NLS solution ensures the right balance between performance and storage costs. These solutions not only enhance current SAP NetWeaver BW deployments, but will do the same as companies move to SAP HANA in the future. They do so by moving data from the active database to NLS, substantially compressing the data and keeping it accessible without having to reload it back into SAP NetWeaver BW.
Archiving can boost performance and reduce costs for the active system without sacrificing transparent access to the data, which is critical for real-time BI applications that sit on top of the highly scalable, column-based database.
The column-based nearline environment provides the foundation for a reliable and inexpensive information management strategy for BW systems. It can also serve as an analytics and indexing engine for archiving in transaction-oriented systems, such as SAP ERP, CRM and SRM. In this way, an NLS solution delivers the necessary throughput to effectively archive high transaction volumes in these systems.
Dr. Werner Hopf is responsible for setting the company’s strategic corporate direction and is the Archiving Principal at Dolphin. With more than 20 years of experience in the information technology industry, 14 of them focused on SAP, Dr. Hopf specializes in SAP Information Lifecycle Management initiatives, including data and document archiving, SAP ArchiveLink storage solutions and business process solutions. His experience spans both large and mid-sized companies across all major SAP modules. Having worked on SAP projects across North America and Europe, he has extensive experience in global markets and is well known for his expertise. Dr. Hopf earned a master’s degree in computer science and a PhD in business administration from Regensburg University, Germany.