Big, Complex, Highly Valuable Data

Arthur Cole

Big Data is an appropriate name for the massive volumes that modern enterprises are dealing with, but it is by no means a complete description. A more apt moniker would be Big, Complex Yet Highly Valuable Data But Only For Those Who Know How to Use It.

Of course, the key issue in all decisions regarding Big Data isn’t so much the amount of data on hand, but the manner in which it is to be parsed and analyzed. And while enterprises of all sizes have been quick to implement analysis and management software capable of handling Big Data, it seems that infrastructure has not been given due consideration.

According to a recent study conducted by Vanson Bourne at the request of Hitachi Data Systems, nearly three quarters of UK enterprises are investing in Big Data analytics. However, more than 60 percent of that group admit they lack the infrastructure needed to conduct effective analyses in a timely fashion. With data flowing in every second these days, there is a risk that valuable pieces of information will go untapped, only to be discovered later, when their value has diminished. Indeed, more than half of those surveyed admit that they could be basing critical decisions on outdated data.


The good news, however, is that as the enterprise gains more experience with Big Data, new guidelines are emerging to help leverage it. According to Interxion’s Patrick Lastennet, Big Data should be managed according to the three V’s: Volume, Velocity and Variety. Again, infrastructure plays a key role here, particularly if the goal is to foster real- or near-real-time analysis. Cloud-based services, data center colocation and open source software platforms are all valuable tools in the Big Data era, but first enterprises must wean themselves off silo-based legacy infrastructure.

Another crucial factor in the Big Data equation is that not all of it has value. To hear some tell it, in fact, very little is of any real benefit. But as volumes increase, so does the cost of the storage and other infrastructure needed to support them. All the more reason for enterprise managers to take a more critical view of what to keep and what to discard. According to Data Center Journal’s Jeff Clark, this downsizing can be accomplished in a number of ways, such as self-expiring data, storage restriction policies, targeted backups and limitations on system-generated data flow. Each of these approaches will require some soul-searching when it comes to identifying what data to keep and what to toss, and above all, IT will have to resist the temptation to preserve any and all data on the hunch that it may someday prove useful.
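Clark’s “self-expiring data” idea, for instance, reduces to a simple retention policy. The Python sketch below is a minimal illustration of one way such a policy might look; the data categories, retention windows and record layout are assumptions made for the example, not anything prescribed by Clark or this article.

from datetime import datetime, timedelta, timezone

# Hypothetical retention rules: the categories and windows below are
# illustrative placeholders, not values from the article.
RETENTION = {
    "transaction": timedelta(days=365 * 7),  # long-lived, kept for compliance
    "clickstream": timedelta(days=90),       # analytic value decays quickly
    "debug_log": timedelta(days=14),         # system-generated noise
}

def is_expired(category, created_at, now=None):
    """Return True if a record has outlived its retention window.

    Records are assumed to carry timezone-aware UTC timestamps.
    """
    now = now or datetime.now(timezone.utc)
    window = RETENTION.get(category)
    if window is None:
        # Unclassified data gets flagged for review instead of being
        # silently hoarded "just in case."
        raise ValueError("no retention rule for category %r" % category)
    return now - created_at > window

def sweep(records):
    """Yield only the records still worth storing."""
    for rec in records:
        if not is_expired(rec["category"], rec["created_at"]):
            yield rec

The mechanism itself is trivial; the soul-searching comes in agreeing on the categories and the retention windows in the first place.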

The most worrisome thing about unchecked Big Data accumulation is that it inevitably leads to Big Budgets, particularly on the capital expenditure side. Retooling infrastructure to meet both the capacity and analytic needs of Big Data is necessary to bring the enterprise in line with 21st-century realities, but it need not be a frantic, haphazard process.

The tools to accomplish this transition are already at the enterprise’s disposal. All that is needed is a solid plan for how to manage it, preferably one based on clear notions of how Big Data is to be properly utilized.



Comments
Jun 17, 2013 12:26 AM, Sunitha says:
Nice article. Attend Big Data Bootcamp on July 12, 13 & 14 at the Santa Clara Convention Center. Secure your spot now and avail the $250 savings using the discount code EARLYBIRD; for details visit http://bit.ly/11MHKvs
Jun 17, 2013 3:30 PM, DataH says:
Arthur, we are seeing an increase in businesses seeking specialized skills to help address the challenges that arrived with the era of Big Data. The HPCC Systems platform from LexisNexis helps fill this gap by allowing data analysts themselves to own the complete data lifecycle. Designed by data scientists, ECL is a declarative programming language used to express data algorithms across the entire HPCC platform. The platform’s built-in analytics libraries for machine learning and BI integration provide a complete integrated solution, from data ingestion and processing to data delivery. More at http://hpccsystems.com
