Big Data is turning into a big driver of enterprise infrastructure deployment, but this raises a question: Since so little is known about Big Data and how it can be used, can we make any firm decisions about how to support it with existing technology?
According to a recent study by market tracking firm SteelBrick, 72 percent of high-tech providers are reporting increased sales volumes due to Big Data, and more than 40 percent report accelerating sales cycles, in some cases from more than a year to as low as three months. This means that not only is more product moving off the shelf, but also buyers are upgrading legacy systems at a faster pace. The results spanned virtually the entire enterprise data spectrum, from basic infrastructure to cloud computing and software-as-a-service. If these trends continue, expect demand to soon outstrip supply, says SteelBrick CEO Godard Abel, which inevitably leads to product shortages and rising costs.
The opportunities that Big Data presents are certainly not lost on traditional IT vendors, who have been struggling to cope with declining sales of legacy platforms in the era of virtualization and the cloud. HP and Intel recently partnered to develop high performance computing (HPC) solutions that will help the average enterprise deal with Big Data. The venture will focus on integrating the HP Apollo platform with Intel’s Scalable System Framework in ways that will enable HPC infrastructure to scale in tandem across compute, networking, memory and software. The integration will also foster greater democratization of access, allowing increased self-service capabilities for the knowledge workforce.
Big Data infrastructure is not likely to be cheap or easy to manage, however, which is why start-ups are already looking to leverage the cloud in support of Big Data as a Service (BDaaS). A company called Cazena is currently gathering financing for its BDaaS solution that allows enterprises to offload the processing and management of Big Data, although apparently not the data store itself. The service provides workload intelligence for guaranteed service levels across Hadoop, Spark and other solutions, as well as end-to-end automation for both cloud and enterprise-based tools and data sources. The platform also offers integrated encryption to maintain security, governance and compliance.
This plethora of new systems is no doubt welcome relief for enterprises struggling to cope with Big Data, but there is still a danger of merely throwing technology at a problem and hoping it all works out, says Rachel Wolfson of IT service assurance firm Moogsoft. On a more practical level, the enterprise will have to make changes to its warehousing and ETL procedures if it hopes to leverage Big Data to its full potential. This is the primary reason why Dell, Intel and Cloudera have brought a company called Syncsort into their combined Big Data solution. Syncsort’s DMX-h software enables even heavy Hadoop workloads to be shifted across underlying platforms without changes to application or workflow parameters. In this way, organizations gain a high level of consistency across data collection, processing, distribution and other functions, plus a more streamlined systems upgrade path as new technologies and management architectures become available.
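The principle behind this kind of portability is worth spelling out: keep the workload definition separate from the engine that executes it. The following Python sketch illustrates that idea in miniature; it is a hypothetical illustration of the pattern, not Syncsort's actual API (the `EtlJob` and `LocalRunner` names are invented for this example).

```python
# Sketch of the "design once, deploy anywhere" idea: the job definition
# is plain data, and each execution engine is a pluggable backend, so
# moving a workload between platforms means swapping the runner,
# not rewriting the job itself.

from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class EtlJob:
    """Engine-neutral description of an ETL workflow."""
    source: str
    transforms: List[Callable[[Dict], Dict]]
    sink: str


class LocalRunner:
    """Toy backend that executes the job in-process.

    A second backend (say, one that submits the same transforms to a
    Hadoop or Spark cluster) would expose the same run() interface,
    so the EtlJob definition never has to change.
    """
    def run(self, job: EtlJob, records: List[Dict]) -> List[Dict]:
        out = []
        for rec in records:
            for transform in job.transforms:
                rec = transform(rec)
            out.append(rec)
        return out


# The job is declared once, independent of any execution platform.
job = EtlJob(
    source="orders",
    transforms=[lambda r: {**r, "total": r["qty"] * r["price"]}],
    sink="warehouse",
)

result = LocalRunner().run(job, [{"qty": 3, "price": 2.0}])
print(result)  # → [{'qty': 3, 'price': 2.0, 'total': 6.0}]
```

Because the runner, not the job, owns the platform-specific details, an upgrade to a new processing engine becomes a swap of one component rather than a rewrite of every workflow.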
It’s been said many times that Big Data is both a challenge and an opportunity. The only caveat is that you’ll have to deal with many challenges before the opportunities arise.
It is unlikely the enterprise will be able to craft an effective Big Data solution on its own – there are simply too many moving parts. But by taking a hard look at both the options available now and the ways in which they will impact business processes in the future, organizations and vendors/integrators should at least be able to establish a solid footing on which to build advanced, reconfigurable architectures over the coming decade.