
    Big, Complex, Highly Valuable Data

    Big Data is an appropriate name for the massive volumes that modern enterprises are dealing with, but it is by no means a complete description. A more apt moniker would be Big, Complex Yet Highly Valuable Data But Only For Those Who Know How to Use It.


    Of course, the key issue in all decisions regarding Big Data isn’t so much the amount of data on hand, but the manner in which it is to be parsed and analyzed. And while enterprises of all sizes have been quick to implement analysis and management software capable of handling Big Data, it seems that infrastructure has not been given due consideration.

    According to a recent study conducted by Vanson Bourne at the request of Hitachi Data Systems, nearly three quarters of UK enterprises are investing in Big Data analytics. However, more than 60 percent of that group admit they do not have the infrastructure needed to conduct effective analyses in a timely fashion. With data flowing in every second, there is a risk that valuable pieces of information will go untapped, only to be discovered later when their value has diminished. Indeed, more than half of those surveyed admit that they could end up basing critical decisions on old data.

    The good news, however, is that as the enterprise gains more experience with Big Data, new guidelines are coming out to help leverage it. According to Interxion’s Patrick Lastennet, Big Data should be managed according to the three V’s: Volume, Velocity and Variety. Again, infrastructure plays a key role here, particularly if the goal is to foster real- or near-real-time analysis. Cloud-based services, data center co-location and open source software platforms are all valuable tools in the Big Data era, but first enterprises must wean themselves from silo-based legacy infrastructure.

    Another crucial factor in the Big Data equation is that not all of that data has value. To hear some tell it, in fact, very little is of any real benefit. But as volumes increase, so does the cost of storage and other infrastructure needed to support them. All the more reason for enterprise managers to take a more critical view of what to keep and what to discard. According to Data Center Journal’s Jeff Clark, this downsizing can be accomplished in a number of ways, such as self-expiring data, storage restriction policies, targeted backups and limitations on system-generated data flow. Each of these approaches will require some soul-searching when it comes to identifying what data to keep and what to toss, and above all, IT will have to resist the temptation to preserve any and all data on the hunch that someday it may be useful.
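    To make the idea of self-expiring data a bit more concrete, here is a minimal illustrative sketch in Python of an age-based retention sweep. It is not drawn from any product Clark describes; the Record type, the purge_expired function and the 90-day window are hypothetical stand-ins for whatever retention rules an enterprise actually adopts.

        # Illustrative sketch only: an age-based retention policy, assuming
        # each record carries a creation timestamp. All names here are
        # hypothetical, not taken from any particular storage product.
        from dataclasses import dataclass
        from datetime import datetime, timedelta, timezone

        @dataclass
        class Record:
            key: str
            created_at: datetime
            payload: bytes

        def purge_expired(records: list[Record], max_age_days: int = 90) -> list[Record]:
            """Return only the records younger than the retention window."""
            cutoff = datetime.now(timezone.utc) - timedelta(days=max_age_days)
            return [r for r in records if r.created_at >= cutoff]

        if __name__ == "__main__":
            now = datetime.now(timezone.utc)
            sample = [
                Record("fresh", now - timedelta(days=10), b"keep"),
                Record("stale", now - timedelta(days=400), b"discard"),
            ]
            kept = purge_expired(sample, max_age_days=90)
            print([r.key for r in kept])  # ['fresh']

    In practice the same decision is usually expressed as a storage-tier or lifecycle policy rather than application code, but the hard part is the one Clark points to: deciding which data is allowed to expire at all.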

    The most worrisome thing about unchecked, unfettered Big Data accumulation is that it will inevitably lead to Big Budgets, particularly on the capital expenditure side. Retooling infrastructure to meet both the capacity and analytic needs of Big Data is necessary to bring the enterprise in line with 21st-century realities, but it need not be a frantic, haphazard process.

    The tools to accomplish this transition are already at the enterprise’s disposal. All that is needed is a solid plan for how to manage it, preferably one based on clear notions of how Big Data is to be properly utilized.

    Arthur Cole
    With more than 20 years of experience in technology journalism, Arthur has written on the rise of everything from the first digital video editing platforms to virtualization, advanced cloud architectures and the Internet of Things. He is a regular contributor to IT Business Edge and Enterprise Networking Planet and provides blog posts and other web content to numerous company web sites in the high-tech and data communications industries.
