Pretty good overview, but a couple minor nits...

On Sun, Feb 11, 2007 at 12:28:51AM -0600, T.J. Duchene wrote:
> Stable is the version most commonly used.  It's tested and true.  Great
> for servers, very bad for desktop machines.  The reason being is that
> the software versions are older and not up to par with what people
> expect.

I disagree that this is "very bad" for desktop machines.  I've run a
medium-scale corporate network with Debian stable for all new installs
and, aside from a little initial "it's not Windows!" resistance from
new hires in their first week or two, had no complaints or issues with it.

Based on what I've seen, most of the stuff that's "not up to par with what
people expect" is just eye candy anyhow and adds nothing to functionality
for the average user.  Bleeding-edge, anti-aliased, 3D-accelerated,
composited alpha-channel gewgaws may impress your friends, but there's no
real need for them.  (Even if Vista is attempting to make them ubiquitous.)

> Ubuntu, on the other hand does things differently.  They use a snapshot
> of the Debian unstable tree as a starting point.  Then they divide it up
> into what they will offer support for and what they won't (unlike Debian
> who supports every single package they maintain).  In other words,
> Ubuntu is in effect, a subset of Debian, but since Debian is over 16,000
> packages, you can't blame them. 

Debian is over 16,000 packages on 11 different CPU architectures.
As near as I can tell, Ubuntu, in addition to cutting back to around
3,000 packages, also supports only three architectures (x86, x86_64,
and PPC).  But, again, you can't really blame them for focusing on the
CPUs that the vast majority of systems run on.

-- 
Windows Vista must be the first OS in history to have error codes for things
like "display quality too high"
  - Peter Gutmann, "A Cost Analysis of Windows Vista Content Protection"
    http://www.cs.auckland.ac.nz/~pgut001/pubs/vista_cost.html