Rethinking the Desktop, Part 1: Is It Time for a Bigger Change Than Vista, MacOS, or Linux?

Rob Enderle

An audio download recently hit the Web of a talk Bill Gates gave nearly 20 years ago that, with some exceptions, outlined what actually happened in the subsequent two decades.


Twenty years is a long time for anyone to be right, and now that Bill has largely moved on from the role that allowed him to build a company defining much of the last two decades in software and hardware, we should probably be looking at what comes next. I'm thinking that, while Windows has had a good run, it may be time for a different path.


Getting Around the Religious Rhetoric


One of the problems in having a discussion on this topic is getting around the "mine is better" rhetoric. All of the solutions currently on the market are largely based on thinking about the world as it existed in the '80s. Apple still thinks folks want a closed hardware/software platform like the one IBM had at the start of that decade; Linux is UNIX done better; and Windows is Apple made hardware-independent -- the Bill Gates idea that made him the richest guy in the world. Last time I checked, we weren't living in the '80s...


The Apple guys do the little dance where they say their stuff looks better and is less prone to viruses, overlooking the fact that they have only 2.5 percent of the market and that most of the software out there will run on a Mac only when it is running Windows -- and when it is running Windows, it is just as vulnerable as any other Windows box. I just reviewed a 500-unit Apple trial that failed, as usual, due to support and cost issues (the machines were running Windows, interestingly enough) -- and the Mac still does far better in its trials than Linux does. Addressing the related problems appears to be core to the rumored delay of Leopard until October (and remember, it has already been delayed several times now).


The Linux guys will boast that they are better still: they can actually look at the source code, they have fewer crashes, and some can even get their girlfriends (real or imagined) and parents to run it. With less than 1 percent of the market, what is surprising is how vocal these folks actually are. Many of the machines sold with Linux in the Third World actually run pirated copies of Windows in production, which means the shipment numbers significantly overstate what is actually in use. Seriously, how many CIOs do you know who want a platform where users can muck with the source code? They have enough problems with users screwing around with applications as it is.


On the Windows side, the industry has been asking for changes for some time, and Vista just hasn't been the market driver it should have been. Most CIOs seem to be saying XP is fine, thank you very much. But when you have decades of software designed to run on a particular platform, it is really hard for the entrenched vendor to make changes, and over time you end up being killed by the amount of backward compatibility you have to build into the offering. In short, at some point you may have to start over, if only to get out from under a massive amount of aging code.


But Microsoft has been hearing the "start over" thing for some time, and not only is it impractical, it's probably one of the quickest ways to lose the company's attention on a topic.


Each of the competitors should know that trying to catch Windows from behind is largely pointless and, even if incredibly successful, is unlikely to drive the kind of amazing changes that Windows did when it matured. In Apple's case, an OS is unlikely to ever come close to the market power the company has with the iPod.


For Microsoft, if it can't drive excitement back into the company, it will increasingly look and feel like IBM: a good place to have worked, but not a great place to make your mark on the world -- in other words, an aging company continually eclipsed by younger players (read: Google). And Google, much as Microsoft once did to IBM, is executing a strategy that could be a game-changer.


Looking at What Has Changed


When we started down the personal computer path, these things weren't connected, weren't portable, and generally cost, fully configured, in excess of $3,000 in 1980 dollars (around $6K today), which was why, for much of the '80s, we used PCs from Atari and Commodore for personal use rather than machines from Apple, IBM, or HP. The '90s brought a lot of change: the appliance PC died, Apple largely stalled in the market (and almost failed), and Linux on the desktop wasn't really an option.
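
For the curious, that inflation conversion is just compounding. A minimal back-of-envelope sketch in Python (the 2.6 percent average annual rate is an illustrative assumption chosen to show how the conversion works, not an official CPI figure):

# Back-of-envelope inflation adjustment: compound a 1980 price forward.
# The 2.6% average annual rate is an illustrative assumption, not CPI data.
price_1980 = 3000.0
avg_annual_inflation = 0.026  # assumed average rate
years = 2007 - 1980

price_today = price_1980 * (1 + avg_annual_inflation) ** years
print(f"${price_1980:,.0f} in 1980 dollars is roughly ${price_today:,.0f} today")
# -> roughly $6,000, in line with the estimate above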


Desktop PC costs dropped by two-thirds, but laptops cropped up at closer to half the cost of the original desktops -- and effectively more than that once you factored in their higher support costs and shorter service lives (we often forget that the first PCs remained in service for up to nine years).


As we move solidly into the second half of the 2000s, the appliance PC is back, but now we call it a set-top box, game console, or smart phone -- and, in the form of smart phones, it has made its way into business. We also have the rebirth of thin clients and a new class of machine called blade PCs.


Laptops now cost around $1,000, and desktops closer to $800 with large flat-screen monitors (which have service lives in excess of five years). Averaged over the three-year life of the machine, support costs for PCs have come down sharply and are now estimated at around $1K per year, with desktop computers (a category in sharp decline) dropping well below that mark in well-managed shops.
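
To put those numbers together, here is a rough three-year cost-of-ownership sketch; the hardware prices and the roughly $1K-per-year support estimate come from the figures above, and the arithmetic is purely illustrative:

# Rough three-year total cost of ownership using the figures in the text.
# Hardware prices and the ~$1K/year support estimate come from the article;
# the totals are illustrative arithmetic, not vendor or analyst data.
service_life_years = 3
support_per_year = 1000.0

for machine, hardware_cost in (("laptop", 1000.0), ("desktop", 800.0)):
    total = hardware_cost + support_per_year * service_life_years
    per_year = total / service_life_years
    print(f"{machine}: ${total:,.0f} over {service_life_years} years, "
          f"or about ${per_year:,.0f} per year all in")

Note that support, not hardware, dominates the three-year total -- which is why the extra support costs mentioned next weigh so heavily against smaller players.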


This creates a huge barrier to entry for Apple and Linux, both of which lack the economies of scale needed to compete at these price levels and both of which incur extra support costs simply because they are different.


Next, we'll talk about what the key vendors (IBM/Google/HP) are doing to change this.
