
Google Moves to Make Public Cloud Platform More Flexible


Written By Mike Vizard
Jun 1, 2017

As part of an ongoing effort to allow IT organizations to consume only the exact amount of public cloud computing resources they need, Google this week announced it has removed the memory caps that previously applied to its virtual machines. In addition, Google claims it has become the first public cloud provider to make the latest generation of Intel Xeon processors, codenamed Skylake, generally available via the Google Cloud Platform (GCP).

Paul Nash, group product manager for GCP, says Google is taking pains to let IT organizations consume virtual machines without committing to specific sizes or even hourly blocks of time, commitments that ultimately force them to pay for unused resources.

A virtual machine running on a Skylake-class Intel processor can now be configured with up to 455GB of RAM. Rather than being held to preset memory limits, Nash says, IT organizations can now decide how much memory they want to allocate to a virtual CPU instance. That approach is intended to be especially appealing to IT organizations aiming to deploy, for example, in-memory databases on a public cloud.

“We’re starting to see more deployments of applications such as SAP HANA databases or analytics applications by enterprise customers,” says Nash.
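As a rough illustration of how that configurability is exposed, the sketch below uses the Compute Engine API's custom machine-type naming scheme, in which a machine type is written as custom-{vCPUs}-{memoryMB} and an "-ext" suffix requests memory beyond the standard per-vCPU cap. The project ID, zone, image and sizes are hypothetical examples rather than values from Google's announcement, and the snippet assumes the google-api-python-client library with default credentials configured.

# Minimal sketch: create a Compute Engine VM with an extended-memory custom machine type.
# Assumes google-api-python-client and application-default credentials; all names are hypothetical.
from googleapiclient import discovery

compute = discovery.build("compute", "v1")

project = "example-project"   # hypothetical project ID
zone = "us-central1-b"        # hypothetical zone

# 8 vCPUs with 128GB of RAM (expressed in MB); the "-ext" suffix marks an
# extended-memory custom machine type, i.e. more memory than the default per-vCPU limit.
machine_type = "zones/{}/machineTypes/custom-8-131072-ext".format(zone)

body = {
    "name": "inmemory-db-vm",
    "machineType": machine_type,
    "disks": [{
        "boot": True,
        "autoDelete": True,
        "initializeParams": {
            "sourceImage": "projects/debian-cloud/global/images/family/debian-9"
        },
    }],
    "networkInterfaces": [{"network": "global/networks/default"}],
}

operation = compute.instances().insert(project=project, zone=zone, body=body).execute()
print("Instance insert started:", operation["name"])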

At the same time, via a new Minimum CPU Platform feature, Google now allows IT organizations to select a specific CPU platform for VMs in any given zone. GCP will always schedule the virtual machine to run on that CPU family or a newer one.
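Carrying the same hypothetical request body forward, that setting corresponds to a minCpuPlatform field on the instance; asking for Skylake or a newer CPU family might look like this (again a hedged sketch, not Google's documented example):

# Pin the hypothetical instance above to Skylake or a newer CPU family in the chosen zone.
body["minCpuPlatform"] = "Intel Skylake"
compute.instances().insert(project=project, zone=zone, body=body).execute()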

It’s clear that Google is now spending a lot more time and energy courting enterprise customers. While public clouds have been around for 10 years, most enterprise IT organizations are just now making public clouds a standard deployment option for their applications. That doesn’t mean everything will be moving into a public cloud. But it does mean that before making any substantial commitments, many enterprise IT organizations are likely to be very particular about the terms and conditions offered by a public cloud service provider.


Michael Vizard is a seasoned IT journalist, with nearly 30 years of experience writing and editing about enterprise IT issues. He is a contributor to publications including Programmableweb, IT Business Edge, CIOinsight and UBM Tech. He formerly was editorial director for Ziff-Davis Enterprise, where he launched the company’s custom content division, and has also served as editor in chief for CRN and InfoWorld. He also has held editorial positions at PC Week, Computerworld and Digital Review.
