We all know that the amount of data that needs to be managed is spiraling out of control. Between the applications being deployed to comply with increased regulations and everything else the business collects, the amount of data available for analysis is phenomenal. And just to make things interesting, business users want the ability to analyze it in real time.
From an IT perspective, this creates a perfect storm when it comes to data management, so many IT organizations are looking to see what vendors intend to do to help them solve the problem. Of course, the core problem for many of them is that they never really managed the data in the first place; they just found places to store it.
Now they want tools to sort out the most valuable data to be handled in real time, while archiving most of the rest as inexpensively as possible. Those issues are exactly what is driving an alliance announced this week between Teradata and DataFlux, a unit of the SAS Institute. The basic idea is to use the DataFlux data-management tools to identify the data that needs to be most urgently processed using Teradata database appliances.
According to DataFlux CEO Tony Fisher, companies have finally come to realize that there are billions of dollars of lost opportunities hiding in their data, so many of them are finally taking a strategic approach to analyzing it.
Rob Berman, vice president of Teradata, adds that while companies are expanding the spectrum of data-management technologies they use, the need for near real-time analysis of all that data is driving demand for high-speed database servers.
With the rise of new approaches to managing data such as Apache Hadoop, along with new ways of processing and distributing that information, there's more diversity in data management than ever. But it's becoming obvious that the need for speed has never been greater.