    Big Data/IoT: Start with the Basics

    You can say one thing about Big Data and the Internet of Things: they have given the enterprise a goal to strive for.

    Figuring out how to achieve that goal is a decidedly different matter, however, particularly given the uncertainty around the technology and the high degree of customization involved in turning legacy processes into automated, analytics-driven ones.

    As with any major project, though, the best way to approach Big Data/IoT is to break it into smaller, more manageable components. As long as each phase is tied to the bigger architectural picture, the enterprise should end up with a reasonably functional data environment, albeit one that will require continuous optimization going forward.

    None of this will work without a solid foundation, says Dell’s Shawn Rogers, so enterprises just starting the Big Data journey should make sure the fundamentals are sound before the real number-crunching begins. The core technologies behind Big Data processes include a conventional relational database, a versatile data lake (most likely based on Hadoop) and a sophisticated analytics platform. Equally important, however, are strong data governance and security policies that keep intruders at bay without inhibiting legitimate user access. This can be a tricky dance because it requires careful coordination among data, metadata, user profiles and related elements, but fortunately you will have a crack analytics engine on hand to keep tabs on things.
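
    To make that layering concrete, the minimal Python sketch below stands up each tier with illustrative stand-ins: sqlite3 for the relational database, a local directory for the Hadoop-based data lake, and a metadata “sidecar” file carrying the governance attributes (owner, classification, retention) that a security policy would enforce. All names and fields here are assumptions for the example, not a prescribed design.

        import json
        import sqlite3
        import hashlib
        from pathlib import Path
        from datetime import datetime, timezone

        # Relational tier (sqlite3 stands in for a production RDBMS).
        db = sqlite3.connect(":memory:")
        db.execute("CREATE TABLE sales (id INTEGER, region TEXT, amount REAL)")
        db.executemany("INSERT INTO sales VALUES (?, ?, ?)",
                       [(1, "east", 1200.0), (2, "west", 880.5)])

        # Data lake tier (a local directory stands in for HDFS).
        LAKE = Path("datalake/raw/sales")
        LAKE.mkdir(parents=True, exist_ok=True)

        rows = [dict(zip(("id", "region", "amount"), r))
                for r in db.execute("SELECT id, region, amount FROM sales")]
        payload = json.dumps(rows).encode("utf-8")
        name = hashlib.sha1(payload).hexdigest()[:12]
        (LAKE / f"{name}.json").write_bytes(payload)

        # Governance sidecar: the metadata that access and retention
        # policies key off, kept next to the data it describes.
        metadata = {
            "source": "sales_db.sales",
            "owner": "data-engineering",
            "classification": "internal",
            "retention_days": 365,
            "landed_at": datetime.now(timezone.utc).isoformat(),
        }
        (LAKE / f"{name}.meta.json").write_text(json.dumps(metadata, indent=2))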

    No matter how the system is architected, the driving operational goal should be speed, not capacity. In the age of the cloud, compute, storage and networking resources are always available somewhere, so the challenge isn’t so much finding a place to keep all this data or a core to process it, but moving and analyzing it at the pace of modern business. Fortunately, this message seems to be getting across, says Enterprise Apps Today’s Ann All. A recent survey by OpsClarity found that 90 percent of organizations plan to increase investment in high-speed data infrastructure. The goal is to replace today’s cumbersome batch processing with real-time streaming data functionality, giving both human decision-makers and, increasingly, automated systems the ability to respond to opportunities and threats immediately.
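
    The batch-to-streaming shift can be seen in miniature below: rather than scoring a day’s worth of data after the fact, each event is evaluated against a sliding window the moment it arrives, so an alert (or an automated response) can fire immediately. The readings and threshold are invented for illustration.

        from collections import deque
        from statistics import mean

        WINDOW = 5

        def stream_monitor(events, multiplier=2.0):
            """Yield an alert the instant an event exceeds the recent baseline."""
            window = deque(maxlen=WINDOW)
            for event in events:                       # events arrive one at a time
                if len(window) == WINDOW:
                    baseline = mean(window)
                    if event > baseline * multiplier:  # react now, not at end of day
                        yield ("ALERT", event, baseline)
                window.append(event)

        readings = [10, 11, 9, 10, 12, 45, 10, 11]     # 45 is the anomaly
        for alert in stream_monitor(readings):
            print(alert)                               # ('ALERT', 45, 10.4)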

    To a company like Cisco, this all centers on networking – again, not in terms of capacity but speed and throughput across increasingly complex distributed fabric architectures. The company’s new Digital Network Architecture (DNA) is intended to provide the base for security, collaboration, analytics and a host of other functions, says ZDNet’s Corinne Reichert. The platform was designed with digital transformation of the business model in mind, supporting functions like real-time policy deployment, automated network programmability and continuous threat monitoring. Using API-based management, the platform supports the dynamic configuration needs of increasingly software-defined infrastructure.
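
    The API-based management pattern reduces a policy deployment to an authenticated REST call. The sketch below shows the general shape only; the controller URL, endpoint path and payload schema are invented for the example and are not Cisco DNA’s actual interface.

        import json
        import urllib.request

        # Hypothetical controller and token; both are placeholders.
        CONTROLLER = "https://controller.example.com/api/v1"
        TOKEN = "example-token"

        policy = {
            "name": "quarantine-iot-segment",  # illustrative policy
            "match": {"vlan": 42},
            "action": "isolate",
        }

        def build_policy_request(policy: dict) -> urllib.request.Request:
            # Build (but do not send) the POST that would deploy the policy.
            return urllib.request.Request(
                f"{CONTROLLER}/policies",
                data=json.dumps(policy).encode("utf-8"),
                headers={"Content-Type": "application/json",
                         "Authorization": f"Bearer {TOKEN}"},
                method="POST",
            )

        req = build_policy_request(policy)
        print(req.get_method(), req.full_url)  # POST .../api/v1/policies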

    At some point, however, infrastructure has to give way to actual processes, so it helps to understand what you are going for as you put the pieces of your Big Data environment in place. According to Muhi Majzoub, executive VP of OpenText, the three key processes for the digital age are discovery, reporting/visualization and analysis. All of these currently reside on existing Business Intelligence platforms, but if your goal is to convert Big Data – both structured and unstructured – into valuable knowledge, they will need to be kicked up a few notches. Discovery, for example, should encompass the entire data lifecycle from creation to governance to archiving to disposal, while analysis will need high-speed classification mechanisms and greater contextual capabilities in order to gauge both present and future realities.
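
    One way to picture lifecycle-wide discovery is as a small state machine that advances each data asset from creation through governance and archiving to disposal according to a retention policy. The stage names and thresholds below are assumptions for illustration, not OpenText’s model.

        from enum import Enum, auto
        from datetime import date

        class Stage(Enum):
            CREATED = auto()    # landed, not yet classified
            GOVERNED = auto()   # classified, access policy attached
            ARCHIVED = auto()   # past active retention, moved to cold storage
            DISPOSED = auto()   # past full retention, deleted

        RETENTION = {"archive_after_days": 180, "dispose_after_days": 365}

        def current_stage(created: date, classified: bool, today: date) -> Stage:
            age = (today - created).days
            if age >= RETENTION["dispose_after_days"]:
                return Stage.DISPOSED
            if age >= RETENTION["archive_after_days"]:
                return Stage.ARCHIVED
            return Stage.GOVERNED if classified else Stage.CREATED

        today = date(2016, 6, 1)
        assets = [("q1_sensor_feed.parquet", date(2016, 4, 1), True),
                  ("legacy_export.csv", date(2015, 1, 15), True)]
        for name, created, classified in assets:
            print(name, current_stage(created, classified, today).name)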

    The process of digital transformation has only just begun at many organizations, so it is difficult, if not impossible, to assess the myriad requirements that will be placed on Big Data and IoT infrastructure once things really get rolling. But service-based, data-driven business models are already starting to wreak havoc in many established industries, so today’s enterprise does not have a whole lot of time to ponder all the implications of current deployment decisions.

    You can dip a toe in the lake to see how nice the water is, but you won’t learn how to swim until you jump right in.

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.
