NComputing and Windows: Rethinking Multicore, Thin Clients and Operating Systems


NComputing takes a standard PC and turns it into a thin-client host, allowing one machine to serve the needs of many. Our Arthur Cole wrote about this back in August 2007, and the product has improved significantly since then.


I met with folks at the company this week and they reported that they are fast closing in on 1 million seats using this technology, largely in education and the third world, indicating that the concept has traction and some solid economic benefits. But part of the reason it is so successful is that Microsoft provides special licensing discounts to third-world countries and education markets, making the result more cost-effective.


We have a number of interesting vectors starting to converge: cell phones that are becoming more like laptop computers connected to thin client-like applications, PC blades, thin clients (including NComputing), and the overall move to the "cloud." It feels as though we are at the forefront of another event like the one that created Microsoft. But, other than Google, which seems ill-suited to a platform battle, no other platform vendor is well positioned for this wave yet.


Multicore: Who Cares?


The big problem with multicore is that after two cores, the benefits drop off a cliff; software, as it currently exists, is still largely not written to take advantage of more than two cores at a time. I run an eight-core machine myself, and while I can, from time to time, light up three cores, the only time all eight light up is when I run a tool designed specifically to do so.
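The diminishing return past two cores is exactly what Amdahl's law predicts: if only part of a program runs in parallel, the serial remainder caps the speedup no matter how many cores you add. A quick sketch (the 50 percent parallel fraction is an illustrative assumption, not a measurement of any particular application):

```python
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Amdahl's law: overall speedup when only part of the work parallelizes."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# A program that is 50% parallel barely benefits beyond two cores.
for cores in (1, 2, 4, 8):
    print(f"{cores} cores -> {amdahl_speedup(0.5, cores):.2f}x speedup")
```

With half the work serial, eight cores buy you less than a 1.8x speedup over one, and most of that gain arrives by the second core.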


But in a shared usage model, a multicore solution shines. It can assign cores to each connected user dynamically based on need, creating a much more efficient model in terms of price/performance. This is the core value NComputing sells on, and it seems to hold up in practice, based on its nearly 1 million seats.
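NComputing doesn't publish its scheduler internals, but the idea of demand-based core assignment can be sketched in a few lines (all names and demand figures here are hypothetical; a real host would measure per-session load):

```python
def allocate_cores(total_cores: int, session_demand: dict) -> dict:
    """Toy sketch: split a shared host's cores among user sessions
    in proportion to each session's current relative demand."""
    total_demand = sum(session_demand.values()) or 1
    alloc = {s: total_cores * d // total_demand for s, d in session_demand.items()}
    # Hand out cores lost to integer rounding, busiest sessions first.
    leftover = total_cores - sum(alloc.values())
    for s in sorted(session_demand, key=session_demand.get, reverse=True)[:leftover]:
        alloc[s] += 1
    return alloc

# One heavy user and one light user sharing an eight-core host:
print(allocate_cores(8, {"alice": 3, "bob": 1}))  # {'alice': 6, 'bob': 2}
```

The point of the sketch is the efficiency argument: idle cores that a single user's software could never exploit get reassigned to whoever needs them at that moment.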


Traditional Thin Computing and PC Blades


The problem with traditional thin computing solutions revolves around server hardware that doesn't scale well, largely because it wasn't designed for desktop-type loads. Plus, you tend to need expensive servers, and video performance is rarely designed into any of them, which is problematic given that desktop users expect good video performance.


PC blades provide a better performance match, but the lack of standards at the blade level significantly increases both the cost and the risk associated with the purchase, effectively offsetting much of the advantage for many potential buyers.


In both cases, mobile solutions are rare. HP is one of the few firms that has been able to provide a limited mobile solution. Limited, because the wireless data services available to it all have shortcomings, and the hardware is usable only when it can connect to a network.


Combined, these problems have prevented what otherwise would have been the rapid rise of a much more reliable, much more secure platform, with a lower cost of ownership.


The OEMs Want DOS Back


After a series of OEM meetings, that was my takeaway: OEMs want DOS back, or at least a vastly less feature-rich OS that they could better configure for different target markets with their own software. That said, what is needed here is a Windows OS and licensing program designed for multi-user machines. That may be coming in Windows 7.


With the market now starting to aggressively explore shared processing and graphics, I think we are close to something really creative. Splashtop has a number of folks suddenly looking with interest at a slimmed-down OS front end that would continue to work without a network, paired with a full-featured, hosted capability that would take over when the system was connected. That combination addresses the mobile side of this, or a laptop with an ultra-low-power option, and it takes us back to the DOS concept of a light OS on the client.


Wrapping Up


I like NComputing because it gets around the standards problem of PC blades and the graphics problem of traditional thin clients, which rely on traditionally poor server-based graphics. Couple it with a light client OS like Splashtop, running hosted Windows and connecting to cloud-based applications and storage when a network is available, and you get the greatest breadth of in-market capabilities: what should be the best match of security, performance, reliability, and cost for the next generation of desktop hardware.


The question is: Who is going to figure this out first?