GPUs Ready to Take on the Next-Generation Database

    If the enterprise wants to get ahead with Big Data and artificial intelligence, it will have to embrace new forms of infrastructure all the way to the silicon layer.

    The fact is that with data loads climbing as projected and CPUs running up against their physical limits, the future likely belongs to GPUs and perhaps even more advanced architectures. And this change is likely to reverberate up and down the IT stack as new processing capabilities drive new classes of applications and services, which in turn will fuel even further development of chip architectures, and so on.

    According to Ami Gal, CEO of SQream Technologies, 2016 was the tipping point for GPUs, the year the enterprise finally woke up to the fact that these devices can do a whole lot more than just graphics. With GPUs already handling data at faster rates than CPUs, and still improving at about 40 percent per year, they are quickly becoming the go-to solution for the heavy-duty analytics engines that will power the IoT and emerging cognitive technologies. A GPU also holds a massive number of cores, a better fit for the gargantuan databases that will soon populate the data center, delivering faster results to more users simultaneously.
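    To see why core count matters for analytics, consider the kind of columnar filter-and-aggregate query a GPU database spreads across thousands of cores, each core scanning its own slice of rows. The sketch below is purely illustrative (plain NumPy standing in for a device kernel; the column names and query are our invention, not any vendor's engine):

```python
import numpy as np

# Illustrative columnar scan: the sort of filter-and-aggregate work a
# GPU database maps onto thousands of cores, one slice of rows apiece.
# NumPy's vectorized ops stand in here for the device kernel.

rng = np.random.default_rng(42)
prices = rng.uniform(1.0, 500.0, size=1_000_000)   # one column, 1M rows
regions = rng.integers(0, 4, size=1_000_000)       # a second column

# Roughly: SELECT SUM(price) WHERE region = 2, as a data-parallel op.
mask = regions == 2            # every "core" tests its rows at once
total = prices[mask].sum()     # parallel reduction over the survivors

print(round(total, 2))
```

    The point is the shape of the computation: every row can be tested and summed independently, which is exactly what thousands of small cores exploit.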

    Indeed, say Kinetica CEO Amit Vij and Vice President Joseph Lee, as the consumption of digital services grows, businesses, particularly those engaged in ecommerce, will have no choice but to embrace GPU-powered databases for a wide range of applications. With the basic law of “survival of the fittest” entering its next phase, keeping on top of marketplace conditions, consumer trends, social sentiment and logistical efficiency will require a higher level of performance throughout the data chain. By their estimate, GPUs can already deliver analytical insights 1,000 times faster than CPUs at one-tenth the cost.

    Still, the enterprise should be wary of simply throwing a GPU-accelerated database at analytics problems, says Richard Heyns, CEO of database developer Brytlyt. For one thing, the platform should integrate well with legacy architectures, particularly existing software – unless, of course, you are OK with upgrading your entire technology footprint at once. At the same time, however, it should be quick and easy to onboard new technologies, preferably in ways that avoid employee retraining. And, of course, it will need to scale to meet the ebb and flow of the business cycle. All of this will help ensure that you spend less time wrestling with your powerful new database and more time drawing insights from it.

    When it comes to applying intelligence to modern data operations, however, the GPU may not be the right solution. Jeff Dorsch, tech editor at Semi Engineering, notes that when it comes to supporting dynamic, autonomous workloads, the field-programmable gate array (FPGA) has some advantages over the GPU. For one thing, FPGAs are reprogrammable at the hardware level, so they can accommodate a wider range of processing demands as intelligent systems alter their functionality for new environments. They also offer a high degree of parallelism and rely on fixed-point rather than floating-point calculation, an approach well suited to functions like machine learning. The FPGA market is still rather small, however, and the architectures are still evolving, so it will probably be a little while before they make a significant impact on enterprise data systems.
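    The fixed-point arithmetic Dorsch alludes to can be sketched in a few lines: values are scaled to integers, so the hardware needs only integer adders and multipliers rather than floating-point units. The Q8.8 format below (8 fractional bits) is our illustration, not an FPGA standard:

```python
# Minimal fixed-point sketch: Q8.8 means 8 fractional bits, so one
# unit is split into 256 steps and all math stays in integers.
# The format and helper names here are illustrative assumptions.

FRAC_BITS = 8
SCALE = 1 << FRAC_BITS           # 256 steps per unit

def to_fixed(x: float) -> int:
    return round(x * SCALE)      # quantize to the nearest step

def fixed_mul(a: int, b: int) -> int:
    return (a * b) >> FRAC_BITS  # integer multiply, then rescale

def to_float(q: int) -> float:
    return q / SCALE

w, x = to_fixed(0.75), to_fixed(1.5)   # e.g. a weight and an input
y = to_float(fixed_mul(w, x))          # one multiply of the kind ML repeats billions of times

print(y)   # 1.125, matching 0.75 * 1.5
```

    The trade-off is precision for silicon: each quantized value is off by at most half a step, but integer units are far cheaper to lay down in an FPGA's fabric than floating-point ones.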

    While many enterprises will no doubt strive to build their own GPU architectures, deployment will most likely be faster on the cloud. With economies of scale in their favor and competitive pressures fueling a constant desire to innovate, cloud providers are in the best position to provide the most advanced capabilities at a rapid pace.

    After all, the enterprise is no longer interested in being the first to own a new technology, but the first to benefit from the results it delivers.

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.
