[TriLUG] Green Hills calls Linux 'insecure' for defense
Aaron S. Joyner
aaron at joyner.ws
Tue Apr 13 07:29:44 EDT 2004
Tanner Lovelace wrote:
> Rick DeNatale said the following on 4/12/04 6:54 PM:
>
>> Regardless of the access to and vetting of source, those of us old
>> enough to remember Ken Thompson's Turing Award Lecture "Reflections on
>> Trusting Trust" from around 30 years ago will realize that backdoors and
>> malicious code can be hidden in an open source environment as well as in
>> a closed source one.
>
>
> But, yeah, KT's hack was quite an elegant one. :-) I still maintain
> that Open Source is generally more secure than closed source and
> it's much *harder* (although not impossible) to insert a back door
> in it.
>
Note, Ken Thompson was the original author of the first versions of UNIX
at Bell Labs, and was directly involved in / responsible for every version
up through Version 7 (from which all modern UNIXes descend). The lecture
was published in August of 1984, and referenced its time-frame as "In
college, before video games" - i.e. presumably a good bit before 1984.
For those who lack a good understanding of the time frame:
1991 - Linus announced the release of Linux
1983 - September, RMS announced the start of the GNU project
1978 - Dennis Ritchie (C's author) co-wrote the book on C
1975 - the first installation of a UNIX at Berkeley (Version 6, installed
by KT)
1967 - Dennis Ritchie graduated from Harvard, joined Bell Labs
1966 - Ken Thompson graduated from Berkeley, joined Bell Labs
You can draw two conclusions from this, depending on which side of the
coin you like to see. For the conspiracy-minded among you: the
technique of such a Trojan Horse has been around _well_ long enough
for it to have been incorporated into some of the earliest UNIX
environments, and thus certainly long enough to be accessible in
the general literature by the time most of the Linux fundamentals were
originated. It is thus entirely possible that Linux contains a clever,
yet subtle, Trojan Horse. On the other hand, a closer analysis of the
people and situations involved would lead you to realize that the
community is based upon reputation and producing good code. No one who
had tried to pull the wool over everyone's eyes with such a scheme
would have been able to succeed in the first open environments, and the
very distributed nature of Linux (particularly in its origins) means
it's amazingly unlikely that any two current distributions of Linux
share a binary compiler ancestry (i.e. were all compiled by the same
ancient binary - which is what's required to perpetuate this, as others
have pointed out).
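For the curious, here's a toy sketch of how the two-pattern trick works
and why a shared binary ancestry is what keeps it alive. This is Python
with invented names, nothing like KT's actual C - a "compiler" here is
just a function mapping source text to a tagged "binary" string:

```python
# Toy model of Thompson's "Trusting Trust" trojan. All names
# (clean_compile, "program login", allow_user) are made up for
# illustration; a real trojan lives in compiler machine code.

def clean_compile(source):
    """An honest compiler: the 'binary' is simply the source, tagged."""
    return "BIN:" + source

def make_trojaned_compiler():
    """Build an infected compiler *binary* containing both of KT's patterns."""
    def trojaned_compile(source):
        if "program compiler" in source:
            # Pattern 2: when asked to compile the compiler, re-insert the
            # whole trojan - even though the compiler's *source* is clean.
            return make_trojaned_compiler()
        if "program login" in source:
            # Pattern 1: plant a backdoor in login.
            source += "\nallow_user('kt', any_password=True)"
        return "BIN:" + source
    return trojaned_compile

# The infected binary is in circulation; the source everyone audits is clean.
cc = make_trojaned_compiler()
new_cc = cc("program compiler")     # rebuild the compiler from clean source
login = new_cc("program login")     # ...and the backdoor is still there
print("allow_user" in login)        # True

# A build that never descends from the infected binary is unaffected:
print("allow_user" in clean_compile("program login"))  # False
```

The last two lines are the whole point of the "binary ancestry" argument
above: auditing the source catches nothing, but any build chain that
bootstraps from an independent honest binary escapes the trojan entirely.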
Another corollary to these ideas is that this clearly can be
perpetuated in a closed source environment - KT's own example proves
it. It's far more likely that in the compiler department at IBM (or
SGI, or HP, etc.), one determined programmer could slip in a binary
trojan of this nature, and then it's _very_ reasonable that the binary
trojan could perpetuate itself in the internal build tools, even across
major OS versions, perhaps for years after the individual has left the
company. The trojan would need to be sufficiently sophisticated to
recognize potential changes in the login program and still produce
valid output, but in my experience changes to core programs like
login(1) are rare in mature UNIX environments. Closed source
environments and vendors, or commercial vendors in general (including
our dear RH and SuSE), are significantly more likely to be susceptible
to this problem. Of course, commercial support is supposed to come with
some level of assurance that the company has "vetted" its employees in
some tangible way, although KT comes right out and says at the end of
his article not to trust "code from companies that employ people like
me". I'd hazard a guess that any manager would look at KT's resume and
offer him whatever he wanted, on the spot. :) By the same token, so
would any modern open source project. But that doesn't necessarily mean
the project would accept binary submissions from him, whereas a company
would be much more likely to grant him access to assist in making the
binaries that make it onto a CD.
KT also cautions that not only must care be taken at the compiler
level, but the risk is even greater at the assembler, loader, or even
microcode level. If we can't trust Intel and AMD not to cruft up our
bits, who can we trust? Granted, the complexity of a login-level hack
at the microcode level makes my head swim, but in 1984 the concept of a
compiler-level trojan made almost everyone's head swim, too. :)
Aaron S. Joyner