Moving to provide a platform designed for applications that make broader use of parallelization, Hewlett-Packard Enterprise today unveiled an HPE Apollo 6500 server platform that, starting in the third quarter, can be configured with up to eight graphics processing units (GPUs) from NVIDIA in a 4U form factor.
Scott Misage, general manager for High Performance Computing at HPE, says servers based on GPUs are starting to find favor with IT organizations looking to optimize analytics applications as well as applications that make extensive use of deep learning algorithms. By definition, those applications typically need to process multiple events simultaneously to drive some form of cognitive computing, says Misage.

HPE also unveiled today an Apollo 4520 system, a two-node system based on the latest Xeon E5-2600 v4 processors that will ship later this month. To support the I/O requirements of high-performance computing (HPC) applications running on the Apollo 4520, HPE is also giving IT organizations the option of configuring these systems with the open source Lustre parallel file system, using either the edition supported by the project community or the enterprise edition supported by Intel.
As part of an effort to help create those applications, HPE also announced today that it is releasing a new edition of Vertica SQL for Hadoop that is optimized for HPE Apollo and ProLiant platforms. Next up, HPE will be making available reference architectures for deploying analytics applications based on the columnar Vertica database and Hadoop.
Given demand for advanced analytics and machine learning algorithms, the high end of the server market is going through something akin to a renaissance. How that ultimately affects the share of the overall server market that Intel commands remains to be seen. But it is clear that as servers based on GPUs become more affordable, servers in the data center are about to become more diverse.