
On Track to a Hyperscale Universe


Written By Arthur Cole
Jan 14, 2015
Slide Show: Six Trends Shaping the Data Center in 2015

Everybody wants scale these days. Whether you are talking about expanding internal infrastructure or pushing workloads onto the cloud, one of the first questions people ask is, “How well does it scale?”

This is understandable given the size of current workloads and the looming specters of Big Data and the Internet of Things. But scale doesn’t happen by itself – it must be carefully planned and executed or else you are left with the same dysfunctional, silo-based architecture that you have now, just more of it.

According to Gartner, IT spending is starting to show a distinct tilt toward hyperscale, with a crucial tipping point expected within the next three years, when the traditional data center will no longer be able to meet the demands of the digital business model. Worldwide IT spending is on pace to hit $3.8 trillion this year, a 2.5 percent gain over 2014, although due to currency fluctuations and other factors this actually represents a slight decrease in the rate of growth. In the data center, sales of servers, storage and networking systems remain flat, but services and application support are booming on the cloud, an indication that much of the enterprise workload is shifting to hyperscale.

This is backed up by further research from Technology Business Research, which estimates that the market for hyperscale infrastructure will top $1.7 billion over the next 12 months. Of course, there will be some trickle-down into the enterprise market as large organizations seek to build internal cloud infrastructure that can be integrated with third-party services when needed but still provides a home-grown environment for critical data and applications. This is where platforms like HP’s Moonshot and IBM’s NeXtScale stand the greatest chance of success.

Still, building hyperscale infrastructure is one thing; architecting it for the appropriate business outcome is quite another. Many hyperscale cloud providers employ containers of their own design to allocate resources more efficiently than standard virtualization allows. For the typical enterprise, however, the only real container solution is Docker, which, as Gartner recently noted, lacks many of the key security and governance features needed to support full production environments. Right now, Docker relies on the security and access features of its Linux host, which must be carefully orchestrated for each container in order to provide anything close to enterprise-class protection.
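To make that orchestration concrete, here is a minimal sketch of the kind of per-container hardening involved, written against the Docker SDK for Python; the image, user and resource limits are illustrative assumptions, not a vetted security baseline.

    import docker  # Docker SDK for Python (pip install docker)

    client = docker.from_env()

    # Illustrative hardening profile: drop all Linux capabilities, run as an
    # unprivileged user, mount the root filesystem read-only and block
    # privilege escalation; none of these are Docker defaults.
    container = client.containers.run(
        "alpine:3",                          # placeholder image
        command="sleep 3600",
        detach=True,
        user="nobody",                       # do not run as root inside the container
        cap_drop=["ALL"],                    # start with zero Linux capabilities
        read_only=True,                      # immutable root filesystem
        security_opt=["no-new-privileges"],  # forbid setuid privilege escalation
        mem_limit="256m",                    # cap resource consumption
    )
    print(container.short_id)

Every one of these settings has to be chosen per workload, which is exactly the kind of hand tuning that keeps Docker short of enterprise-class governance on its own.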


And of course, enterprises that lack the hardware to implement hyperscale will naturally want to acquire it at the lowest possible price point. But as Storage Switzerland notes, this can cause some organizations to cut corners, producing long-term consequences that are very difficult to correct. A case in point is storage, which in hyperscale deployments is bound to be flash. Some organizations may be tempted to use consumer-grade SSDs rather than enterprise-class products, but this can lead to a whole host of problems: inconsistent performance, poor reliability, complex maintenance requirements and, ultimately, higher costs as the need to swap out these drives becomes apparent. The firm is holding a live event on January 29 to discuss the issue.

Like virtually every other technology initiative that has hit the enterprise, then, hyperscale is both a challenge and an opportunity. The decision to deploy hyperscale at home or access it through the cloud is an easy one for large and small enterprises, but those in the middle will have to do a fair bit of analysis to find the right path.

While the desire to keep critical applications and data close to home is understandable, a time will come when the infrastructure needed to support the volumes under management becomes so vast that local resources simply cannot keep up without massive cash infusions. It is at this point that the enterprise will finally be able to assess the value of its data in real terms, and how much trust it is willing to place in others to support its most valued assets.

Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata, Carpathia and NetMagic.
