I’m a big fan of the NVIDIA GRID effort, largely because it places the performance where I think it always should have been: where it can be better managed and protected. I started out in the mainframe age, and while there were huge parts of that world I truly hated, what we lost when we went to the PC age were better security, better reliability, and the fact that someone else kept it all running for me. I’ve never really been a fan of the idea that a user needed to be an IT guy in training. The thin client efforts were made either by PC guys who didn’t understand servers or by server guys who didn’t understand PCs, and the result sucked for the rest of us, but the promise was always there.
Then there was GRID, which became a collaboration between PC guys and server guys. Suddenly, we had products that worked, backed by vendors like Dell and Amazon, who understood that the solution requirements came from both camps. This week, GRID 2.0 launched, and the progress is rather compelling. The new systems have double the capacity and double the performance, they’ve added blade servers, and they’ve added Linux support alongside Windows.
Let’s talk about where this is going.
Being Device Agnostic
At the heart of this effort is the concept of being device agnostic: truly being able to run the same applications on any device that you can connect to the service. Because the applications are running on virtual machines in the cloud, the typical performance limitations of your device fall away. It doesn’t matter whether you have a cell phone, a workstation, or a tablet, or are using the connected display in your car -- the same application is running.
Now there are practical limitations like screen size and network speed still to consider, but we have been steadily reducing latency and increasing bandwidth to the point where more and more locations can use these remote services successfully. Thanks to the web, developers are also learning how to better sense screen size, scale the application, and adapt the user interface to the device accessing it. As a result, more users are finding the move to a hosted virtualized service not only acceptable but actually preferable to using a dedicated PC.
These recent improvements bring up the performance and capacity, and provide greater variety in server configuration and OS support, but we are still just at the start of this process.
As we look forward, the massive improvements in network capacity needed to handle 4K content distribution play to this model, and many of these advantages are paid for by the ads that support the media flowing through the networks. Not only are we seeing a rapid expansion of fiber capacity, we are on the threshold of multi-user MIMO, which provides gigabit-class Wi-Fi performance, and of 5G, which will increase WAN performance significantly as well, though not yet to gigabit levels.
In addition, GRID is driven by NVIDIA, which is used to a GPU cadence for performance increases. That means that, for the foreseeable future, each iteration of this technology will exceed the prior generation by at least 50 percent. Right now, it is operating at a 100 percent increase generation over generation, and while that will likely slow eventually as the bottlenecks are tuned out of the system, it should always exceed the far slower processor cadence in the market.
We’ll also likely see better application switching: when disconnected or connected to a slow network, you’ll be able to shift on the fly to a reduced-performance local application in order to maintain productivity, while still retaining state. This last may take a generation or two.
Finally, we’ll see more and more people use their cell phone or tablet as their preferred client device, accessorizing them with keyboards, mice and monitors as they finally realize they don’t need a full-on PC on their desktop anymore.
Wrapping Up
The second generation of NVIDIA GRID is an impressive improvement, but we’ll likely see the biggest advancements come in the future, when we finally realize that where the computing is done isn’t as important anymore. Ironically, in a way, this is a return to the past, but without the ugly baggage, and it blends the concepts that founded the computing industry with the concepts that will drive it through this century. The only question, really, is how long it will take for both users and IT administrators to see the change for what it truly is: a way to massively improve security and to get users out of the hardware/software update and maintenance business and focused back on the jobs they were hired, and are paid, to do. There has been a massive amount of advancement between GRID 1.0 and GRID 2.0, but wait until you see 3.0. The best is still yet to come.
Rob Enderle is President and Principal Analyst of the Enderle Group, a forward-looking emerging technology advisory firm. With over 30 years’ experience in emerging technologies, he has provided regional and global companies with guidance in how to better target customer needs; create new business opportunities; anticipate technology changes; select vendors and products; and present their products in the best possible light. Rob covers the technology industry broadly. Before founding the Enderle Group, Rob was the Senior Research Fellow for Forrester Research and the Giga Information Group, and held senior positions at IBM and ROLM. Follow Rob on Twitter @enderle, on Facebook and on Google+