
Big Computing Emerges in Many Forms

Written By
Arthur Cole
May 23, 2017

In the era of Big Data, the enterprise will have to embrace Big Computing in one form or another. But exactly how this is to be done at a reasonable price point and with reasonable simplicity is still an open question, as is whether the knowledge to be gleaned from all this number crunching will prove beneficial in any meaningful way.

When it comes to actual Big Computing platforms, the latest advancements are emerging from familiar names: Hewlett-Packard Enterprise and IBM.

HPE recently took the wraps off the latest iteration of The Machine, which does away with much of the advanced technology of earlier generations, such as memristors, in favor of more conventional tools kicked into overdrive. The system uses a pool of memory shared across nodes over a high-performance fabric, with photonics handling the interconnect between components. Processing is done on Cavium’s ThunderX2 ARM SoC running plain old Linux, not the Linux++ planned for the earlier Machine. The company says it can share upwards of 160 TB of memory across 40 nodes, with full-scale capacity estimated at more than 4,000 yottabytes, or about 250,000 times the entire digital universe today.
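
A quick back-of-the-envelope check puts those numbers in perspective (a minimal sketch; the even split across nodes and the 4,096-yottabyte ceiling are assumptions drawn from HPE's stated figures):

# Rough check of the shared-memory figures cited above, assuming the
# 160 TB pool is spread evenly across the 40 nodes.
TB = 10**12   # terabyte in bytes (decimal, as vendors count it)
YB = 10**24   # yottabyte in bytes

pool_bytes = 160 * TB
nodes = 40
print(f"Memory per node: {pool_bytes / nodes / TB:.0f} TB")   # -> 4 TB

ceiling = 4096 * YB   # assumed full-scale architectural ceiling
print(f"Ceiling vs. demo pool: {ceiling / pool_bytes:.1e}x")  # -> ~2.6e13x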

IBM, meanwhile, recently put a pair of quantum computing processors through its test bed, technology the company says could affect everything from business processes to advanced chemical and materials engineering. For development and research, the company is lining up a 16-qubit (quantum bit) device that more than triples the qubit count of its current 5-qubit system. The company is also testing a 17-qubit prototype as part of a roadmap it expects to reach 50 qubits by the end of the decade. The prototype chip is expected to be available later this year to commercial partners in key industry verticals, and will likely comprise the bulk of quantum capacity on the IBM Cloud, where it will provide acceleration for traditional enterprise workloads.
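
The reason a jump from 5 to 16 qubits matters more than the raw count suggests is that the state space a quantum processor can represent grows exponentially with the number of qubits. A minimal sketch of that standard 2^n relationship, using the qubit counts cited above:

# A register of n qubits spans a state space of 2**n complex amplitudes,
# so each added qubit doubles what the processor can represent.
for qubits in (5, 16, 17, 50):
    print(f"{qubits:>2} qubits -> {2**qubits:,} amplitudes")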

Of course, the enterprise does not need to wait for esoteric forms of computing to hit the mainstream before it wades into Big Data. Technologies like serverless computing are available here and now and provide the means to support data at massive scale at a reasonable cost, says Unit4’s Ton Dobbe. Serverless offers three key attributes that will help guide the enterprise to a digital services model. First, it delivers optimal performance even in highly demanding environments while reducing the management burden on internal IT. Second, it enables rapid, continuous application development and support without the costly overhead of idle resources. Third, it provides a customer experience more in tune with the demands of an increasingly mobile society. And we can expect performance to improve as services like AWS Lambda and Azure Functions incorporate automation and artificial intelligence.
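
To make the model concrete, here is a minimal sketch of a serverless function as it might run on AWS Lambda (the handler name, event shape, and greeting payload are illustrative assumptions, not tied to Unit4 or any particular deployment); the provider allocates compute only when the function is invoked, which is where the savings on idle resources come from:

import json

# Minimal AWS Lambda-style handler: the platform runs this on demand,
# scales it per request, and bills only for execution time, so there is
# no idle server for internal IT to provision or patch.
def handler(event, context):
    # 'event' carries the request payload (for example, from API Gateway);
    # 'context' exposes runtime metadata such as remaining execution time.
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"Hello, {name}"}),
    }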

Still, the transition to a Big Data services model doesn’t end with bigger and better computing infrastructure, says ZDNet’s Mark Samuels; it also requires changes in processes and relationships up and down the IT stack. For instance, IT can easily provision new, more advanced resources, but if those resources don’t directly support the needs of colleagues across the business, the technology, and the money behind it, goes to waste. New systems must therefore be paired with more detailed governance, clear development objectives, API management, and a host of other factors to ensure the organization ends up with the right capabilities, not just the right infrastructure.

More powerful computing is always welcome in enterprise circles, and in decades past it was all that was needed to gain a competitive edge. But in today’s world, even the smallest firm has access to massive processing and storage when the need arises, so organizations of all stripes need to pay more attention to how they use technology, not how much of it they can corral.

In all likelihood, this will be for the betterment of humanity because it means that going forward, it will be the best ideas that rise to the top, not the ones that have the most financial backing.

Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.
