The Role of Servers in an Increasingly Cloudy Universe

    Data center infrastructure will undergo dramatic change across the board in the coming year, but while much of the focus will be on software-defined architectures and cloud computing, bare-metal changes are on tap as well.

    This is a heady time for servers in particular: the pressure to revamp data-handling capabilities is mounting as the enterprise struggles to meet the challenges of mobility, Big Data, collaboration and other macro forces.

    For InterWorx’s Graeme Caldwell, the rise of high-volume, small-packet data traffic will lead directly to the ARM architecture finally breaking the “x86 monoculture” that has gripped the enterprise for so long. ARM chips thrive in the chaotic universe of mobile data, so enterprises that wish to scale resources up and down to suit ever-changing load volumes would be better off with legions of low-power ARM units at their disposal than with highly virtualized x86 machines. And while Intel currently holds a slight edge with its 64-bit Avoton SoC, the coming year will see 64-bit ARM parts from Calxeda, Applied Micro and others.

    Servers are also front and center in the converged infrastructure movement. Many enterprises are already working with Flash cache and in-memory solutions as a way to increase performance for Big Data loads, while fully integrated, modular infrastructure is quickly becoming the solution of choice for cloud-facing hyperscale facilities. HP recently unveiled its ConvergedSystem portfolio to capitalize on this trend, joining IBM, Cisco, Dell and a host of smaller firms vying to convince the enterprise that a new, streamlined architecture is at hand.

    Modularity, in fact, is a key component in the rise of purpose-built server infrastructure among top hardware consumers like Google and Facebook. Both firms utilize servers and other components designed in-house and manufactured by specialty shops in the Pacific Rim, thus bypassing both the traditional IT manufacturing community and the distribution channels that support it. Facebook has even taken the step of releasing its designs under the Open Compute Project, giving the industry at large an opportunity to deploy bare-bones hardware infrastructure in support of scale-out cloud environments.

    And under a kind of “News of the Weird” scenario, a group of British researchers say they have devised the outlines of a completely server-free environment in which devices are empowered to communicate and exchange data directly, with no middleman required. Built for the EU-funded Pursuit project, the platform uses file-sharing and other techniques to enable direct connectivity and to fragment data widely across multiple users. With mobile devices now offering more storage than PCs of just a few years ago, such a system could prove both highly scalable and highly versatile, and would give snoops at the NSA real fits.

    Future paradigms aside, the enterprise server is becoming smaller and more adept at handling highly dynamic data loads, but that doesn’t mean today’s high-powered systems are headed for the scrap heap. Plenty of work still lies ahead in Big Data analytics, database processing, graphics design and a host of other jobs that warrant a powerful device, even a big iron mainframe.

    The choice of hardware will, in fact, matter a great deal in the coming year.

    Arthur Cole
    With more than 20 years of experience in technology journalism, Arthur has written on the rise of everything from the first digital video editing platforms to virtualization, advanced cloud architectures and the Internet of Things. He is a regular contributor to IT Business Edge and Enterprise Networking Planet and provides blog posts and other web content to numerous company web sites in the high-tech and data communications industries.