Early Signs of a Future Hardware-Defined Infrastructure


    The enterprise desires a high degree of flexibility in the data environment, but it also needs to balance broad scalability with low capital and operating costs in order to accommodate Big Data and other heavy workloads without blowing the IT budget.

    This is why commodity hardware is looking so attractive: It provides a cost-effective means to build scale while still supporting the flexibility needed for modern business processes via the emerging class of software-defined architectures.

    The danger here is that the more layers of software you place between data, applications and the underlying hardware, the less performance you get from the overall stack. You can compensate by provisioning more resources, either on-premises or in the cloud, but this steadily erodes the ROI of your infrastructure even as data initiatives are met.

    But what if commodity hardware could be reprogrammed to provide higher levels of support to end users and applications? Even better, what if it could adopt a fair degree of intelligence so that it actually learns what is expected of it post-deployment and could adapt to its environment?

    Field programmable silicon is nothing new. Research and Markets has the field programmable gate array (FPGA) sector growing at an 8 percent annual clip for the remainder of the decade, driven largely by telecommunications, automotive, aerospace and other verticals. But FPGAs are starting to make their way into enterprise settings as well, particularly as cloud and hosted service providers look for low-cost ways to distinguish themselves from the competition.
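    To see what "field programmable" means in practice: an FPGA is built largely from small lookup tables (LUTs), and loading a new configuration bitstream simply fills those tables with different values, turning the same silicon into different logic. The toy model below (illustrative only, not any vendor's tooling) shows a single k-input LUT being reprogrammed from one gate into another.

```python
class LUT:
    """A k-input lookup table: truth_table[i] holds the output for input pattern i."""

    def __init__(self, k):
        self.k = k
        self.truth_table = [0] * (2 ** k)  # unprogrammed: constant 0

    def program(self, func):
        # "Programming" the device just means loading a new truth table,
        # here built by evaluating func on every input combination.
        self.truth_table = [
            func(*(((i >> bit) & 1) for bit in range(self.k)))
            for i in range(2 ** self.k)
        ]

    def evaluate(self, *inputs):
        index = sum(bit << pos for pos, bit in enumerate(inputs))
        return self.truth_table[index]

lut = LUT(2)
lut.program(lambda a, b: a & b)   # the LUT now behaves as an AND gate
print(lut.evaluate(1, 1))         # prints 1
lut.program(lambda a, b: a ^ b)   # reprogrammed in the field: now an XOR gate
print(lut.evaluate(1, 1))         # prints 0
```

    Real devices wire thousands of such LUTs together through a configurable interconnect, which is why the same chip can implement an I/O protocol one day and an encryption block the next.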

    Like their standard brethren, the newest FPGAs are stressing higher densities and lower power consumption in order to meet the demands of burgeoning workloads. Flex Logix's new EFLX platform features a unique hierarchical design that packs more switchable connections into a smaller area, which lowers manufacturing costs and makes the chips adaptable to a wider range of operating environments. Once deployed, the platform can be customized to suit various I/O protocols, encryption algorithms and even search functions, giving systems integrators and end users wide latitude in how the devices are employed.

    Microsoft has been putting FPGAs to good use in its cloud infrastructure, most notably in its convolutional neural network (CNN) accelerator, built on the Altera Stratix V device and aimed at highly demanding applications like language processing and image classification and recognition. The setup is part of the larger Catapult project, in which an FPGA fabric connects more than 1,600 of the servers that power the Bing search engine. The intent is for the CNN accelerator to apply reprogrammable logic to a broader spectrum of applications while improving overall throughput.
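    The workhorse operation such an accelerator offloads is the 2-D convolution: sliding a small filter across an image and computing a weighted sum at each position. A minimal sketch in plain Python (for clarity only; a real FPGA design pipelines these multiply-accumulates across the fabric, and nothing here reflects Catapult's actual interfaces):

```python
def conv2d(image, kernel):
    """Valid-mode 2-D convolution (cross-correlation, as CNNs conventionally use)."""
    ih, iw = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for y in range(ih - kh + 1):          # every position where the kernel fits
        row = []
        for x in range(iw - kw + 1):
            acc = 0
            for dy in range(kh):          # weighted sum over the kernel window
                for dx in range(kw):
                    acc += image[y + dy][x + dx] * kernel[dy][dx]
            row.append(acc)
        out.append(row)
    return out

# A 3x3 Laplacian-style kernel over a 4x4 image yields a 2x2 feature map.
image = [[1, 2, 3, 0],
         [4, 5, 6, 1],
         [7, 8, 9, 2],
         [1, 1, 1, 1]]
kernel = [[0,  1, 0],
          [1, -4, 1],
          [0,  1, 0]]
print(conv2d(image, kernel))  # prints [[0, -6], [-10, -19]]
```

    Because the inner loops are nothing but independent multiply-accumulates, the computation maps naturally onto an FPGA's parallel logic, which is what makes CNN inference such an attractive target for reconfigurable hardware.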


    And in storage infrastructure, Violin Memory is leveraging FPGAs in its new Flash Storage Platform (FSP), primarily in the controller, where they provide more flexibility than standard volume ASICs. The controller, in turn, powers the Violin Intelligent Memory Module (VIMM), which can be customized for specific workloads and, combined with Violin's advanced solid-state design, can ultimately provide RAID 3 performance without the latency and sync problems that afflict standard Flash architectures.

    At the moment, most FPGAs in the enterprise are aimed at networking applications, primarily because this is where the greatest cost-benefit ratios lie. It is unclear at this point whether FPGAs have the chops to outclass ASICs, or ARM-based processors, in the server farm, although Intel is said to be working on a Xeon/FPGA hybrid that would integrate programmability and processing power on a single platform.

    It is conceivable, then, that before too long the most advanced layers of the data infrastructure will consist of a mixture of both hardware- and software-defined elements.

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata, Carpathia and NetMagic.
