In Cloud Computing vs. Desktop, 'It's the Data, Stupid!'

Dennis Byron

Do you want your compute power on your desktop? Or in the cloud? The project that began this debate started 40 years ago next month. Some might find it ironic if the anniversary turns out to mark the end of the line for the Sun workstation company - one of the major players in the debate - as a separate entity.


It was April 1969 when Bell Labs withdrew from the MIT-based Multics project, which was designed to develop a computing utility. Before the split-up of the partnership, which also included GE, debates about compute platforms were primarily about whether to provide compute power in batches or in time slices. The new Bell Labs project that resulted from the split added workstation/desktop computing as another alternative.


Multics is the granddaddy of many distributed operating systems, including VAX VMS (and therefore indirectly Windows Server), and featured the first use of many file-management, security, programming, systems-administration, wide-area networking, user-interface and other features of computing systems considered common today. The split-off Bell Labs project - which some claim was originally called the Un-Multics - used some of the Multics concepts that its inventors had worked on with GE and MIT. But it took off in a direction that enabled the personal workstation (not to be confused with the personal computer). The inventors, by the way, do not corroborate the Un-Multics story, but simply say that "the name 'Unix' [is] a somewhat treacherous pun on 'Multics'."


Within a few years, engineers, programmers and others whose jobs depended on unlimited access to intensive compute power (e.g., Wall St. whiz kids) no longer needed either batch processing or access to sliced computer time. They had more power on their desktops than they knew what to do with, at least until the next graphics-based program came out on the market.


So how come the concept of utility computing is making a comeback under the buzzword "cloud computing"? The answer is that now the idea is to provide unlimited access to intensive compute power to everyone in the world, not just to engineers and programmers and other power users (e.g., the guys who are going to straighten out Wall St.).


Does this move back to the 1960s mark the end of workstation/desktop computing - and its weak-kneed little brother, personal computing? I do not think so, because desktop computing has another advantage that computer scientists never seem to talk about in their rush to provide more and more compute power to more and more people. Many of us don't like to put our data anywhere but on our desktop (or appliance or surface or TBD), where we are not slaves to the cloud. I understand that there is no need to store 8 million ring-tone choices on your iPhone or to replicate the Internal Revenue Service tax code in TurboTax, but do you want to trust stuff that really matters to Verizon or AT&T or Comcast?
