No Digital Transformation Without Big Data

    Big Data is likely to be a key element in the digital transformation taking hold in the enterprise, but there are still questions as to how it will influence future business processes and what steps to take now to ensure it provides optimal support in a fast-changing economy.

    According to a recent report by Verizon and the Harvard Business Review, most organizations are looking forward to dramatic improvements in services and business models through Big Data, but few are implementing technology and infrastructure as part of a strategic approach to transformation. The study found that 52 percent expect Big Data to lead to new services for the Internet of Things (IoT), while 44 percent say it will transform their business models. However, upwards of 78 percent are currently leveraging only limited amounts of IoT data or none at all, while an equal percentage say they need new networking technologies to fully implement Big Data operations. In fact, most Big Data initiatives today are being carried out on an ad hoc basis, not as part of a strategic imperative.

    With the amount of data in play, it is easy to see why most enterprises are overwhelmed at this point. But as Pure Storage CTO Alex McMullan noted on IT Pro Portal recently, data volumes are only one aspect of the challenge. The other is managing the sheer number of streams coming in from countless sensors and devices, all of which have to be gathered, sorted, secured and pre-analyzed before they reach the main analytics engines that are supposed to isolate the valuable information from the junk. This will require a new approach to enterprise infrastructure incorporating high-speed compute, storage and networking, as well as cloud-scale resources, cognitive computing, and a host of other advances.
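    The gather-sort-secure-pre-analyze sequence McMullan describes can be sketched in miniature. The field names, readings and valid-range threshold below are illustrative assumptions, not details from the article:

```python
# Minimal sketch of a sensor-stream pre-analysis stage: readings from many
# devices are gathered, sorted by timestamp, and junk is dropped before the
# batch reaches the main analytics engine. Field names ("ts", "value") and
# the 0-100 valid range are hypothetical, chosen only for illustration.

def pre_analyze(streams):
    """Merge raw per-device streams into one clean, time-ordered batch."""
    gathered = [reading for stream in streams for reading in stream]  # gather
    gathered.sort(key=lambda r: r["ts"])                              # sort
    # Pre-analysis: filter out malformed or out-of-range readings (the "junk")
    return [r for r in gathered
            if "value" in r and 0.0 <= r["value"] <= 100.0]

# Two hypothetical device streams, one containing a junk reading
stream_a = [{"ts": 2, "value": 41.5}, {"ts": 1, "value": 39.9}]
stream_b = [{"ts": 3, "value": -12.0}, {"ts": 4, "value": 40.2}]

clean = pre_analyze([stream_a, stream_b])
print([r["ts"] for r in clean])  # time-ordered, junk removed: [1, 2, 4]
```

    In a real deployment this stage would run continuously on streaming infrastructure close to the data sources, rather than on in-memory lists.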

    One of the key habits of the past that no longer applies to the world of Big Data is the idea that more necessarily equals better, says Coho Data CTO Andrew Warfield. In traditional settings, increased data loads were met with more servers, more storage and more network pathways, but in the emerging world organizations will have to look into the underlying relationships between these elements in order to drive greater efficiency in data handling. A case in point is the need in traditional enterprise infrastructure to bulk-copy data between systems for HDFS jobs; direct, protocol-based access to HDFS data would go a long way toward federating data across multiple disparate systems. And to really kick things into high gear, the enterprise should start thinking about a unified storage architecture that can natively serve file, block and HDFS data, preferably with support for containerized workloads.
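    Warfield's contrast between bulk copying and protocol-based federation can be illustrated with a toy access layer. Nothing below is a real Coho Data or Hadoop API; the backend classes, URI schemes and paths are hypothetical stand-ins for file, block and HDFS access behind one interface:

```python
# Toy sketch of a unified storage layer: one read interface federated over
# multiple backends, so consumers address data where it lives instead of
# bulk-copying it between silos. All classes here are illustrative.

class FileBackend:
    """Stand-in for a local file store."""
    def __init__(self, store): self.store = store
    def read(self, path): return self.store[path]

class HDFSBackend:
    """Stand-in for protocol-based HDFS access (no real cluster involved)."""
    def __init__(self, store): self.store = store
    def read(self, path): return self.store[path]

class UnifiedStorage:
    """Routes reads by URI scheme, presenting disparate systems as one."""
    def __init__(self):
        self.backends = {}
    def mount(self, scheme, backend):
        self.backends[scheme] = backend
    def read(self, uri):
        scheme, _, path = uri.partition("://")
        return self.backends[scheme].read(path)

storage = UnifiedStorage()
storage.mount("file", FileBackend({"logs/app.log": b"local bytes"}))
storage.mount("hdfs", HDFSBackend({"warehouse/events": b"cluster bytes"}))

print(storage.read("hdfs://warehouse/events"))  # same call path as file://
```

    The point of the sketch is the single `read()` call path: analytics code sees one namespace, and no job has to stage a copy of the data first.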

    Some may wonder whether leading High-Performance Computing (HPC) platforms might offer the keys to Big Data nirvana. While traditionally, HPC platforms have focused on modeling and simulation as opposed to machine learning and data correlation analysis, there is enough commonality between the two that Tabor Communications recently combined its Big Data and HPC events to see if there are any ways to approach convergence. According to Enterprise Tech, there are: technologies like GPU-based deep learning are proving beneficial to both fields as they strive toward real-time performance across highly diverse data sets. The Tabor event in Jacksonville, Fla., last month drew representatives from a diverse range of firms, including Dell-EMC.

    There is always a lag between what the leading vendors say is happening in the IT market vs. what is actually happening. The fact is that few organizations are leveraging Big Data and the IoT in significant ways because the technologies supporting them have only just started to trickle into the channel. And in this case, the transition will be doubly difficult because we’re not talking about new hardware and software alone but a reimagining of the entire business structure.

    But that doesn’t mean the enterprise has time to dawdle. New businesses are sprouting up every day providing a wealth of services on brand new scale-out, analytics-optimized infrastructure, with none of the technological or cultural baggage that weighs down the traditional enterprise.

    To compete, today’s business needs not only new tools at its disposal, but a new way of carrying out its core functions.

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.


