
    Reinventing the Server

    As the price of memory continues to fall and the number of processors on a chip increases, the way the IT industry thinks about building servers is slowly being transformed. Instead of relying on hard disks as primary storage for I/O, the servers of the near future are going to rely on memory to service I/O requests that used to be handled by primary storage systems.

    One of the best examples of this new type of server architecture is the High-Performance Analytic Appliance (HANA) that SAP developed in conjunction with Dell, IBM and Hewlett-Packard. Originally designed to augment existing servers when it came time to run analytics applications, HANA is now seen by many IT industry executives across the spectrum as a forerunner of next-generation server architectures.

    Andy Lark, head of global marketing for large enterprises at Dell, says next-generation servers are going to rely on in-memory processing to handle workloads in parallel. When an IT organization wants to scale up an application, it will simply add more in-memory server capacity, scaling out the application using concepts typically associated with high-performance computing (HPC) environments. Thanks to major drops in the cost of memory, Lark says this style of computing will become especially popular in cloud computing scenarios that need to support elastic data requirements.
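
    As an illustrative sketch of that scale-out idea, and purely an assumption on my part rather than how Dell or SAP HANA actually implement it: data can be hash-partitioned across a set of in-memory "nodes," so adding nodes adds capacity. The node count and record names below are hypothetical.

    from collections import defaultdict

    NUM_NODES = 4  # hypothetical: raising this is the "add more in-memory capacity" step

    def node_for(key: str) -> int:
        """Pick which in-memory node owns a given key."""
        return hash(key) % NUM_NODES

    # Each "node" is modeled as a plain dict; in a real cluster each would be a
    # separate server holding its partition of the data entirely in RAM.
    nodes = [defaultdict(float) for _ in range(NUM_NODES)]

    def load(key: str, amount: float) -> None:
        nodes[node_for(key)][key] += amount

    def total() -> float:
        # In a real system each node would aggregate its own partition in
        # parallel; here the partial sums are simply combined.
        return sum(sum(partition.values()) for partition in nodes)

    load("order-1001", 250.0)
    load("order-1002", 99.5)
    print(total())  # 349.5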

    Ultimately, SAP CIO Oliver Bussmann says, much of our existing IT infrastructure will flatten because there will be no need for separate systems to run data warehousing and analytics applications. Those applications will access memory directly on the same in-memory server that is running the production applications.

    And Amit Sinha, SAP vice president of technology and innovation marketing, adds that not only will the IT infrastructure flatten out, but we'll also begin to see new classes of real-time applications that were previously prohibitively expensive to develop and run. Without having to call out to disk for I/O, building applications that instantly reflect the status of any process becomes far more feasible.
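
    A minimal sketch of what "instantly reflecting the status of any process" could look like, assuming a plain in-memory dictionary as the live store; the event fields and order IDs are invented for illustration, not drawn from any SAP product.

    from datetime import datetime, timezone

    status_by_order: dict[str, dict] = {}  # the entire working state lives in RAM

    def record_event(order_id: str, new_status: str) -> None:
        """Update the live view as each event arrives, instead of writing to
        disk and rebuilding the view later in a batch job."""
        status_by_order[order_id] = {
            "status": new_status,
            "updated_at": datetime.now(timezone.utc),
        }

    def current_status(order_id: str) -> dict | None:
        """Answer directly from memory; no I/O request to primary storage."""
        return status_by_order.get(order_id)

    record_event("PO-7731", "shipped")
    print(current_status("PO-7731"))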

    In fact, Raj Nathan, executive vice president and chief marketing officer for Sybase, a unit of SAP, says there may soon come a time when the line between an application and its underlying database begins to blur, since both will be running in memory.
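
    One rough way to picture that blurring, using Python's built-in sqlite3 module as a stand-in rather than any Sybase or SAP technology: when the database engine runs in memory inside the application process, querying it is effectively just another function call.

    import sqlite3

    # The database lives entirely in the application's own RAM.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id TEXT PRIMARY KEY, amount REAL)")
    conn.executemany(
        "INSERT INTO orders VALUES (?, ?)",
        [("PO-7731", 250.0), ("PO-7732", 99.5)],
    )

    # Application logic and query execution share the same process and memory.
    total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
    print(total)  # 349.5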

    Clearly, server vendors are in a race to provide these next-generation in-memory computing servers. It may take a while for software developers to catch up with what amounts to a new approach to developing software. But whoever gets there first stands to have a tremendous advantage over any software vendor that is overly dependent on what will soon be legacy server infrastructure.
     

    Mike Vizard
    Michael Vizard is a seasoned IT journalist, with nearly 30 years of experience writing and editing about enterprise IT issues. He is a contributor to publications including Programmableweb, IT Business Edge, CIOinsight and UBM Tech. He formerly was editorial director for Ziff-Davis Enterprise, where he launched the company’s custom content division, and has also served as editor in chief for CRN and InfoWorld. He also has held editorial positions at PC Week, Computerworld and Digital Review.
