
    The Big Data Challenge: Becoming Both Big and Smart


    The enterprise is poised to embark on a number of data and infrastructure initiatives in the coming years, almost all of which are focused on the capture and analysis of Big Data.

    While the term “Big Data” aptly describes the scale of the challenge ahead, it leaves the impression that the solution is simply to deploy more resources to accommodate larger workloads. As many early adopters are finding out, however, Big Data is not just big; it is also complex and nuanced, and that spells trouble for anyone who thinks they can simply throw resources at it and make it work.

    As MarkLogic’s Jon Bakke points out, Big Data can encompass everything from large text and database files to audio/video and real-time data streams tracking changes to complex systems and environments. To handle this, the enterprise will need a multi-pronged approach that takes in not just advanced database systems and emerging infrastructure technologies, but legacy systems as well. A key strategy for squaring this circle is the logical data warehouse (LDW), in which two or more physical database platforms are united under a common access and control mechanism. In this way, the enterprise can keep leveraging existing investments like the RDBMS while applying state-of-the-art technologies to the specific functions that need them.
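    To make the LDW idea concrete, here is a minimal sketch, in Python, of how a single access layer might sit over two physical stores, one relational and one document-style. The stores, schema and query routing below are illustrative assumptions standing in for whatever RDBMS and NoSQL platforms an enterprise actually runs; they are not drawn from any particular LDW product.

        # Minimal sketch of a logical data warehouse: two physical stores
        # behind one common access function. Everything here is simulated
        # with standard-library pieces for illustration only.
        import sqlite3

        # Physical store 1: a legacy RDBMS table (in-memory SQLite stands in).
        rdbms = sqlite3.connect(":memory:")
        rdbms.execute("CREATE TABLE orders (id INTEGER, customer TEXT, total REAL)")
        rdbms.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                          [(1, "acme", 120.0), (2, "globex", 75.5)])

        # Physical store 2: a document-style store (a plain dict stands in).
        doc_store = {
            "acme":   {"segment": "manufacturing", "region": "EMEA"},
            "globex": {"segment": "retail",        "region": "APAC"},
        }

        def query_customer(customer: str) -> dict:
            """Common access layer: join relational facts with document attributes."""
            rows = rdbms.execute(
                "SELECT id, total FROM orders WHERE customer = ?", (customer,)
            ).fetchall()
            return {
                "customer": customer,
                "orders": [{"id": r[0], "total": r[1]} for r in rows],
                "profile": doc_store.get(customer, {}),
            }

        if __name__ == "__main__":
            print(query_customer("acme"))

    The point is not the toy data but the shape of the design: callers see one query interface, while each workload lands on the store best suited to it.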

    There is also a tendency among many IT practitioners to view Big Data as a singular phenomenon that will arrive at the data center door one day. But the fact is that Big Data continues to evolve, says tech writer Drew Robb, and the workloads hitting the enterprise now are not likely to be the same in a few years. The Internet of Things will likely represent the biggest change, since it involves data streams from literally millions of sensors spread across complex structures, but it is not the only one. Indeed, the access and portability of data sets are likely to change radically as data stores shift from the province of data specialists to the wider knowledge community in order to foster broader analysis and interpretation.

    It could also emerge that Big Data itself is mostly a useless construct that leads only to confusion and indecision, says IT strategist David Lavenda, and that what is really needed is a way to turn Big Data into small data, or better still, small, smart data. In his conversations with IDC’s Mike Fauscette and Constellation Group’s Andy Mulholland, the consensus was that adding context, relevance and focus to Big Data is key; without them, it is difficult to deliver the right data to the right person and maximize its value. This is just as difficult as it sounds, particularly since context and relevance can vary widely from person to person, but it nevertheless conforms to the time-honored management tradition of breaking big jobs into smaller, more manageable ones.
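    As a rough illustration of the small, smart data idea, the Python sketch below tags each record with context (a region and the roles it is relevant to) and returns only the slice that matters to a given person. The tags, roles and filtering rule are hypothetical; real relevance models would be far richer, but the principle of shrinking the feed before it reaches the user is the same.

        # Hypothetical example: deliver a small, role- and region-relevant
        # slice of a larger data set instead of the whole feed.
        RECORDS = [
            {"metric": "churn_rate",  "region": "EMEA", "roles": {"marketing"}},
            {"metric": "uptime",      "region": "EMEA", "roles": {"operations"}},
            {"metric": "ad_spend",    "region": "APAC", "roles": {"marketing"}},
            {"metric": "latency_p99", "region": "APAC", "roles": {"operations"}},
        ]

        def small_smart_slice(records, role, region, limit=2):
            """Keep only the records that match the user's role and region."""
            relevant = [r for r in records
                        if role in r["roles"] and r["region"] == region]
            return relevant[:limit]

        if __name__ == "__main__":
            # An EMEA marketing analyst sees one focused record, not the full feed.
            print(small_smart_slice(RECORDS, role="marketing", region="EMEA"))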


    Again, though, most of this activity is going to take place in the enterprise data warehouse (EDW), so it would be wise for all data-facing organizations to assess their capabilities now in order to prepare for the very near future. But as SAP’s Brian Wood told B-Eye Network, the focus should not be on building or rebuilding systems or infrastructure, but on how Big Data and the EDW can work together to achieve desirable goals. Ideally, this should encompass everything from the data itself and the databases at hand to security, authorization, lifecycle management, orchestration, and management/monitoring. This is quite different from the resource/load equation that guides most enterprises today, but the danger of being overwhelmed by Big Data goes beyond whether there is adequate processing or application support.

    The purveyors of Big Data solutions have done a good job of spreading Big Data fear throughout the IT industry: “Prepare today or get clobbered tomorrow!” And clearly, it would be unwise for the CIO to sit on his or her hands when it comes to Big Data.

    But the fact is that the same uncertainties holding you back are probably plaguing your competitors as well; if they aren’t, they run the risk of building a Big Data solution that could prove inadequate within a few short years.

    The details surrounding Big Data will emerge as the market matures. The thing to do now is strike the right posture that accommodates both immediate and longer-term goals.

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata, Carpathia and NetMagic.

