
Still Some Life in Traditional Data Infrastructure


Written By
Arthur Cole
Jul 7, 2014

Study Finds Network Admins Juggling Multiple Initiatives

Enterprise executives are under intense pressure these days to deliver a wide range of new and expanding data services amid shrinking IT budgets. It seems that the front office has taken to heart the mantra “do more with less” and is buoyed by the notion that the cloud will come to the rescue when confronting all manner of data challenges.

This is part of the reason why top research firms like Gartner are starting to pull back on their IT spending predictions. As I noted last week, the company recently trimmed its 2014 growth outlook by about a third, from a 3.2 percent expansion to 2.1 percent, even though that still represents a substantial $3.75 trillion market. A deeper dive into the numbers, however, shows a data center hardware market flat-lining for the next year at about $140 billion, while an oversupply of SAN and NAS systems is likely to drive prices down even further. IT services, meanwhile, are looking at about 3.8 percent growth this year, representing nearly $1 trillion in revenue.

But is it really that bad? Are we on a one-way street to a utility-style, all-cloud data center? Hardly, at least not yet. The fact remains that there are plenty of compelling reasons for enterprises of all stripes to build and maintain local data infrastructure, both as stand-alone IT environments and as hybrid systems tied to third-party resources.

Plexxi’s Marten Terpstra, for example, notes that while cloud services do solve a number of enterprise provisioning and operational issues, they cannot do everything. The company specializes in advanced networking infrastructure, and the testing and development its new applications require can only be done on non-cloud infrastructure. Access performance is a constant issue in the cloud and in many cases can diminish productivity, so for the time being at least, many enterprises are likely to maintain both data center and cloud environments to ensure that all applications and services receive the highest level of support.

Even large multinationals like Intel are firm in their commitment to internal data environments, although this doesn’t mean they can’t streamline hardware architectures as much as possible. CIO Kim Stevenson explained to PC Magazine recently that simply shifting from two-socket servers to one-socket machines provides both a performance boost and a drop in licensing costs due to the lower number of cores. Of course, being Intel, the company has ready access to the latest and greatest in processing technology, which at the moment is the new ‘Haswell’ version of the Xeon E3 deployed under the company’s own Hyperscale Design Compute Grid architecture.

Indeed, some in the venture capital industry are already looking forward to a rebound in local data center deployments in 2015 and beyond. Pacific Crest Securities, for one, says catalysts like the new Grantley server platform from Intel and the end of support for Windows Server 2003 will likely produce a spike in hardware upgrades, which should benefit everyone from Intel and HP to Western Digital, Seagate and A10 Networks. The key question, though, is whether this will be a normal, cyclical rebound or just a bump on an otherwise steady deterioration of the data center market.

For traditional data center vendors, the answer to that question could not be more crucial. Many are already transitioning away from standard hardware platforms toward more software and services, but it remains to be seen whether this new approach will provide the same revenues and profit margins as the old.

The divide at this point seems to be between those who say they can deliver full enterprise functionality on a pure software/virtual footing using commodity, white-box hardware, and those who argue that close ties between hardware and software are the best way to ensure optimal, and optimally efficient, performance.

And on that point, unfortunately, the jury is still out.
