Reinventing the Server

Michael Vizard

As the price of memory continues to fall and the number of processors on a chip increases, the way the IT industry thinks about building servers is slowly being transformed. Instead of relying on hard disks as primary storage for I/O, the servers of the near future are going to rely on memory to service I/O requests that used to be handled by primary storage systems.

One of the best examples of this new type of server architecture is the High-Performance Analytic Appliance (HANA) that SAP developed in conjunction with Dell, IBM and Hewlett-Packard. Originally designed to augment existing servers when it came time to run an analytics application, HANA is now seen by many IT industry executives from across the spectrum as a forerunner of next-generation server architectures.

Andy Lark, head of global marketing for large enterprises for Dell, says next-generation servers are going to rely on in-memory processing to handle workloads in parallel. When an IT organization wants to scale up an application, it will simply add more in-memory server capacity in a way that scales out applications using concepts typically associated with high-performance computing (HPC) environments. Thanks to major drops in the cost of memory, Lark says this style of computing will become especially popular in cloud computing scenarios that need to support elastic data requirements.

Ultimately, SAP CIO Oliver Bussmann says that much of our existing IT infrastructure will flatten because there won't be a need for separate systems to run data warehousing and analytics applications. Those applications will access memory directly on the same in-memory server that is running the production applications.

And Amit Sinha, SAP vice president of technology and innovation marketing, adds that not only will the IT infrastructure flatten out, but we'll also begin to see new classes of real-time applications that were previously prohibitively expensive to develop and run. Without having to call out to disks to process I/O, creating applications that instantly reflect the status of any process becomes a lot more feasible.
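To make the idea concrete, here is a minimal sketch of the difference Sinha is describing: a status lookup served from memory involves no I/O call at all, while a disk-backed lookup pays for a round trip to the file system on every read. The function names and the order/status data below are purely illustrative, not part of any SAP product or API.

```python
import os
import tempfile

# Illustrative in-memory store: the application state lives in RAM,
# so a status lookup is just a dictionary access.
status_in_memory = {"order_42": "shipped"}

def status_from_memory(key):
    # No I/O request ever leaves the process.
    return status_in_memory[key]

def status_from_disk(path):
    # Every lookup triggers a file-system I/O round trip.
    with open(path) as f:
        return f.read().strip()

# Stand-in for disk-based primary storage: the same status written to a file.
path = os.path.join(tempfile.mkdtemp(), "order_42")
with open(path, "w") as f:
    f.write("shipped")

# Both paths return the same answer; only the in-memory one avoids disk I/O.
assert status_from_memory("order_42") == "shipped"
assert status_from_disk(path) == "shipped"
```

The point of the sketch is architectural rather than about raw speed: when the authoritative copy of the data already sits in memory, the "call out to disk" step disappears from the application's critical path entirely.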

In fact, Raj Nathan, executive vice president and chief marketing officer for Sybase, a unit of SAP, says there may soon come a time when the line between the application and the underlying database begins to blur when both are running in-memory.

Clearly, server vendors are in a race to provide these next-generation in-memory computing servers. It may take a while for software developers to catch up with what amounts to a new approach to developing software. But whoever gets there first stands to have a tremendous advantage over any software vendor that is overly dependent on what will soon be legacy server infrastructure.
