josh at joshwelch.com wrote:

>Quoting Sam MacDonald <smac at visi.com>:
>
><corrected top posting>
>
>>Mike Miller wrote:
>>
>>>On Mon, 9 May 2005, Ken Fuchs wrote:
>>>
>>>>Please support Free and Open Source Software (FOSS) BIOSes such as 
>>>>LinuxBIOS, before it's too late.  Future BIOSes will execute only 
>>>>authenticated executables.  Will GNU/Linux be one of the 
>>>>authenticated operating systems?
>>>
>>>Wow.  That is a disturbing suggestion.  Is that really in the works? 
>>>I'll tell you one thing for sure -- I won't buy any motherboards that 
>>>cannot run Linux.
>>>
>>>Mike
>>>
>>I would think that would be configurable and an option for servers, not
>>workstations/home computers.
>>
>>Sam.
>
>What's being referred to here, in a slightly inflammatory manner, is trusted
>computing. The concern was raised by Microsoft's attempts to develop their
>trusted computing platform, Palladium, in conjunction with the Intel-backed
>Trusted Computing Platform Alliance (TCPA). Microsoft's trusted computing
>implementation would likely be very restrictive and should not be considered
>a good thing. However, Microsoft, not surprisingly, is considerably behind on
>its development of Palladium. Hardware companies are already rolling out
>various components of trusted computing, with support provided by third-party
>software that the hardware companies supply.
>
>It is my opinion that trusted computing is going to be driven by the hardware
>manufacturers, not the software manufacturers. As with all other hardware, as
>long as specs for interoperating with trusted computing features are made
>available, or the community is able to reverse engineer them, this will be a
>non-issue.
>
>Trusted computing in and of itself is not a bad thing and may be the only way
>that computing can be made available to the general public in a truly secure
>fashion.
>
>Josh
>
If you look at the requirements of 'trusted computing', there is no way
that it can be good for OSS. For it to be effective, it has to be
enforced by both hardware and software from the moment the machine is
turned on. The BIOS has to verify that it has not itself been modified,
then verify that the boot loader has been signed by someone the BIOS
trusts and hasn't been modified; the boot loader in turn has to verify
that the OS being loaded is from a trusted source, and so on. The bottom
line is that you can't run untrusted software on a trusted system and
still have a trusted system.
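
To make that chain concrete, here's a toy sketch in plain Python (not
any real BIOS, TPM or signing API -- the key and helper names below are
made up for illustration) of what each stage roughly does: hash the
next stage and refuse to hand off control unless the signature over
that hash checks out against a key it already trusts.

    import hashlib, hmac

    # Toy stand-in for a vendor key baked into the BIOS.  Real firmware
    # would use public-key signatures, not a shared secret like this.
    PLATFORM_KEY = b"vendor-root-key"

    def sign(blob):
        # "Sign" a blob: HMAC over its SHA-256 hash (illustration only).
        digest = hashlib.sha256(blob).digest()
        return hmac.new(PLATFORM_KEY, digest, hashlib.sha256).hexdigest()

    def verify_stage(name, blob, signature):
        # Refuse to hand off control unless the signature checks out.
        if not hmac.compare_digest(sign(blob), signature):
            raise SystemExit(name + " failed verification -- halting boot")
        print(name + " verified, handing off")

    bootloader = b"boot loader image bytes"
    kernel = b"kernel image bytes"

    # Each stage verifies the next before jumping to it.
    verify_stage("boot loader", bootloader, sign(bootloader))  # BIOS -> loader
    verify_stage("kernel", kernel, sign(kernel))               # loader -> OS

Flip one byte of either image and that stage refuses to continue, which
is exactly why an unsigned kernel never gets control on such a system.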

The obvious way to do that is to digitally sign everything in sight
using certificates that trace back to some 'god' certificate for the
planet :-) For this to be effective, you aren't going to just hand these
signing certificates out to everyone who wants to write an OSS
application. Any software that gets signed is also going to have to go
through an approval process that verifies that it enforces all of the
trust requirements and is secure enough that it can't be compromised.
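
A rough sketch of the 'everything traces back to one root' idea, again
in throwaway Python rather than real X.509 machinery (the certificate
names are invented): each certificate records who signed it, and a
piece of software is only accepted if its chain of signers ends at the
single trusted root.

    # Toy certificate-chain check: accept software only if its signing
    # cert chains up to the one trusted root ("god") certificate.
    TRUSTED_ROOT = "PlanetaryRootCA"

    # Map: certificate name -> who signed it (None means the root).
    ISSUED_BY = {
        "PlanetaryRootCA": None,
        "OSVendorCA":      "PlanetaryRootCA",
        "AppSigningCA":    "OSVendorCA",
        "SomeAppCert":     "AppSigningCA",
        "HomebrewCert":    "HomebrewCert",   # self-signed, never approved
    }

    def chains_to_root(cert, seen=()):
        # Walk issuer links until we hit the root, a dead end, or a loop.
        if cert == TRUSTED_ROOT:
            return True
        issuer = ISSUED_BY.get(cert)
        if issuer is None or cert in seen:
            return False
        return chains_to_root(issuer, seen + (cert,))

    print(chains_to_root("SomeAppCert"))   # True  -- allowed to run
    print(chains_to_root("HomebrewCert"))  # False -- refused

Getting onto that issuer map at all is the approval process described
above, and that gatekeeping is the part that squeezes out OSS.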

The natural weakness, and strength, of OSS is its lack of organizational
structure. There's no one on our end to sign licenses or NDAs for things
like device drivers, CSS routines, etc. Who is going to manage all of
the OSS submissions to get applications signed? Just think of the delay
in getting new versions and patches verified, tested, approved and
signed :-(  Say hello to the return of the monolithic kernel.

The usual argument is that you can just turn off the 'trust' and run
Linux as usual, but the ultimate goal is to build a trusted net where
trusted systems will only talk to trusted systems. That certainly kills
spam, viruses, worms, etc., but it also leaves untrusted systems only
able to talk to untrusted systems. So if you are running an untrusted
version of Linux, you can get the latest kernel patch, but you can't order
the latest O'Reilly book from Amazon. It doesn't sound like fun to me.

I think the commercial and logistical issues involved will keep trusted
computing from happening anytime soon. I'm not sure that vendors will be
willing to turn off access to customers running older and untrusted
systems. Windows only uses a small part of the capabilities of its
security model because most developers aren't capable of tracing through
which privileges they need on which resources to figure out the minimal
access required to run their apps. Can you imagine these same folks
operating in an environment where *every* release has to be thoroughly
tested, verified and signed before release?

And of course my biggest fear is that someone will forget to renew the
'god' certificate and every trusted system on the planet will refuse to
boot :-)

--rick