Data Acceleration: No Longer a Luxury


    One of the biggest misconceptions about Big Data and the Internet of Things is that the sheer data volumes they are expected to generate will be the primary challenge for enterprise IT.

    While it is true that organizations will have to increase their storage capacity, processing power and network bandwidth to handle all of this data, the real challenge will be doing so in a way that not only maintains but enhances the speed at which data can be ingested, analyzed and converted into actionable intelligence.

    This is why many platform developers have turned their attention to data acceleration of late. A faster server or a wider network pathway may help with data turnaround somewhat, but the real benefit will come from highly targeted solutions dedicated to making silicon-level processing as fast and efficient as possible.

    The latest move in this direction comes from IBM and Google, which have formed a new consortium aimed at fostering an open version of the Power processor’s Coherent Accelerator Processor Interface (OpenCAPI) as an industry standard. The spec is intended to boost server performance at least 10-fold and would seemingly rival interconnect technologies from Intel, according to Reuters’ Rick Wilking. The consortium also includes AMD, Dell, HPE, and Mellanox, as well as emerging platform providers like Micron, Nvidia and Xilinx, indicating that the group aims to foster the system for advanced analytics, artificial intelligence and other functions targeting the IoT.

    At the same time, however, many of these same companies (but again, not Intel) are involved with the Gen-Z Consortium, which is looking to develop a new scalable interconnect protocol to support high-performance fabric architectures. The goal is to bridge the divide between fast, volatile memory systems and slower but persistent and more reliable storage arrays. The need for such a system is growing, says The Inquirer’s Graeme Burton, because solid state technologies are quickly overtaking disk storage in the enterprise, leading to more data-centric applications that require scalable pools of memory for real-time analytics. The group is focusing on a solution that provides throughput on the order of several hundred Gbps with latency of less than 100 nanoseconds.
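    The memory/storage divide that Gen-Z targets can be illustrated with a toy software analogy, assuming nothing about the actual protocol: hot data is served from a fast but volatile tier, while a slower persistent tier guarantees durability. The `TieredStore` class and its methods below are entirely hypothetical:

```python
import os
import shelve
import tempfile

class TieredStore:
    """Toy analogy of the memory/storage divide: reads are served from a
    fast in-memory (volatile) tier when possible, falling back to a slow
    but persistent tier; writes go to both so data survives a 'crash'."""

    def __init__(self, path):
        self.cache = {}                # volatile tier (like RAM)
        self.disk = shelve.open(path)  # persistent tier (like a disk array)

    def put(self, key, value):
        self.cache[key] = value
        self.disk[key] = value         # write-through for durability

    def get(self, key):
        if key in self.cache:          # fast path: volatile tier
            return self.cache[key]
        value = self.disk[key]         # slow path: persistent tier
        self.cache[key] = value        # promote into the fast tier
        return value

    def crash(self):
        self.cache.clear()             # volatile contents are lost

path = os.path.join(tempfile.mkdtemp(), "store")
store = TieredStore(path)
store.put("sensor-42", 3.14)
store.crash()                          # RAM wiped; disk survives
print(store.get("sensor-42"))          # 3.14, recovered from the slow tier
```

    A fabric like the one Gen-Z proposes would, in effect, shrink the cost of that slow path until the two tiers behave as one scalable pool.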

    Meanwhile, the Hadoop development community is seeing new vendor platforms aimed specifically at accelerating Big Data workloads within the analytics engine. BI developer Jethro recently came out with a new version of its index-based SQL acceleration system that features dynamic aggregation of “micro-cubes” (small multidimensional databases) to boost query performance across a range of applications and workloads. These “auto-cubes” are generated based on intelligent analysis of user activity, providing transparent support for shifting data loads and interactive response times to speed up virtually any BI scenario, according to company CTO Boaz Raufman.
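    The general technique behind this kind of acceleration — precomputing aggregates over dimension combinations so that queries become lookups rather than row scans — can be sketched as follows. This is a minimal illustration of pre-aggregated cubes in general, not Jethro’s implementation; the `MicroCube` class and field names are invented for the example:

```python
from collections import defaultdict

class MicroCube:
    """Illustrative pre-aggregated cube: sums a measure over every
    combination of the chosen dimension values at build time, so later
    queries are lookups over the aggregates instead of raw-row scans."""

    def __init__(self, rows, dimensions, measure):
        self.dimensions = dimensions
        self.totals = defaultdict(float)
        for row in rows:
            key = tuple(row[d] for d in dimensions)
            self.totals[key] += row[measure]

    def query(self, **filters):
        """Sum the measure for all rows matching the given dimension values."""
        return sum(
            total for key, total in self.totals.items()
            if all(key[self.dimensions.index(d)] == v
                   for d, v in filters.items())
        )

rows = [
    {"region": "east", "product": "a", "sales": 100.0},
    {"region": "east", "product": "b", "sales": 50.0},
    {"region": "west", "product": "a", "sales": 75.0},
]
cube = MicroCube(rows, dimensions=["region", "product"], measure="sales")
print(cube.query(region="east"))   # 150.0
print(cube.query(product="a"))     # 175.0
```

    The payoff is that the cube’s size tracks the number of distinct dimension combinations, not the number of raw rows, which is why interactive response times survive growing data loads.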

    The enterprise can best take advantage of these and other technologies by applying advanced supply chain processes to its data, according to Accenture Labs. This includes improving data inventory, identifying inefficient processes and data silos, expanding access and prioritizing the various value chains. With a robust acceleration platform in place, organizations can improve data movement by extracting data from multiple sources without loss or compromise of its contextual reference, while at the same time bolstering both the pre-process scrubbing of incoming data and the actual processing required for calculation, statistical comparison and simulation. Such a platform also fosters improved interactivity across disparate platforms.
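    Two of those supply chain steps — extraction that preserves contextual reference, and pre-process scrubbing — can be sketched in a few lines. The function names and record fields below are illustrative assumptions, not part of any Accenture framework:

```python
import time

def ingest(source_name, records):
    """Illustrative ingestion step in a 'data supply chain': each record is
    wrapped with its source and ingest time, so downstream processing never
    loses track of where the data came from (its contextual reference)."""
    stamp = time.time()
    return [
        {"source": source_name, "ingested_at": stamp, "payload": record}
        for record in records
    ]

def scrub(wrapped):
    """Pre-process scrubbing step: drop records with missing payload values
    before they reach calculation or statistical comparison."""
    return [w for w in wrapped
            if all(v is not None for v in w["payload"].values())]

# Records extracted from two disparate sources keep their context intact.
crm = ingest("crm", [{"customer": "acme", "spend": 1200},
                     {"customer": "globex", "spend": None}])
sensors = ingest("fleet", [{"device": "t-7", "temp_c": 21.5}])

clean = scrub(crm + sensors)
print(len(clean))                    # 2 records survive the scrub
print({w["source"] for w in clean})  # both sources still identifiable
```

    Because the provenance wrapper travels with each record, the same clean stream can feed calculation, comparison or simulation without the source context being lost along the way.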

    Modern data users – particularly the younger, mobile-facing generation – are pushing the concept of instant gratification to new extremes. Enterprises that do not deliver services on-demand will quickly find out that customer loyalty is nothing like what it was in the old days.

    Building Big Data and IoT infrastructure is only the first step in meeting the demands of this new digital economy. Ultimately, success or failure will depend on how fast and flexible the enterprise becomes.

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.

