Can you trust 'NSA-proof' TrueCrypt? Cough up some dough and find out

Security researchers are raising funds to conduct an independent audit of TrueCrypt, the popular disk encryption utility. TrueCrypt is widely used as a tool to strongly encrypt and decrypt entire drives, partitions or files in a virtual disk. It can also hide volumes of data and is said to be easy to use. The source code for the …

COMMENTS

This topic is closed for new posts.


Bronze badge

Honey pot

In these days of cynicism and paranoia, and in view of recent revelations, the thought that something like Truecrypt could have originated from one or other of the world's security agencies seems quite valid.

10
1
Big Brother

Re: Honey Pot

Chris G opined: "the thought that something like Truecrypt could have originated from one or other of the world's security agencies seems quite valid."

Indeed! What better way to spy on the world than to offer it free "open source" encryption which (a) cannot be compiled from its open source and (b) contains unexplained encrypted code in its downloadable binary, but is nonetheless "guaranteed" to have no back doors?

"Shake for me girl! I wanna be your backdoor man!"

3
0
Black Helicopters

Re: Honey Pot

Perhaps they might also pretend to do a very public "audit" of the code - with the added bonus that the members of the public are conned into paying for it....

2
0
Linux

Re: Honey Pot

@Noel and Jim:

(i) it's the Windows version that is very tricky to compile from source - and there are both benign and suspicious explanations for this;

(ii) Matthew Green has a good reputation technically and ethically - it doesn't seem likely that he would be involved in some game to pretend to audit TrueCrypt (unless you take the completely unworkable "trust no-one" tin-foil hat position).

1
0

Re: Honey pot

In a few years the next revelation will be how Intel was providing the NSA a backdoor at a level no one imagined, i.e. closest to the metal, right inside their CPUs. They might be mixing known information in with the encrypted information; since known information can be figured out from the encrypted data, the key can then be derived. Can anyone measure the traffic between the NSA and Intel head-office servers?!

1
0
Bronze badge
WTF?

Unnamed qualified professionals vs amateurs?

In the decades I've been in the software industry, I have met so many amateur qualified professionals, it's just not funny. "Oh, look, I have this fancy degree but I can't actually do anything that my degree supposedly certifies me as qualified to do."

Not funny.

Kenn White and Matthew Green need to say who it is that will do the audit, and why they've been chosen. I have personally worked on a project where the source came to me in a 20MB Zip file, compiled across multiple paths, and was just a pure mess of 2/3 C and 1/3 8086 assembly, and the compiler company had gone out of business. I got that entire product line compiling again. So, yeah, I'm asking the question, "who audits this?"

Sure, I'm all for an audit, but so far this isn't something that I would support with money.

6
4
Silver badge

Re: Unnamed qualified professionals vs amateurs?

They haven't said WHO they're hiring because they're still in the process of contacting prospects. IOW, they don't know yet. Once the funding builds up and they get some contracts, then they'll be able to list names.

2
0
Silver badge
Meh

Re: Unnamed qualified professionals vs amateurs? @Brian

Whilst I understand your view, I can't help but point out that it's flawed. Unless they pick someone you personally know and trust, you are still going to be in the same boat.

Even if they pick someone you approve of, the vast majority of people aren't going to be able to share or verify that view, so as a criterion for funding the effort or not it's largely meaningless.

1
0

Re: Unnamed qualified professionals vs amateurs?

This is a very good point. I think everyone agrees that open source is now the only way one can potentially gain any assurance of no backdoors. But you still need to look very closely at the code and how it behaves - and, of course, you also need confidence in the audit process itself.

So a program to publicly audit key pieces of FOSS for security weaknesses looks like a good way to go and Truecrypt is certainly a good test case. But I think the real work that needs doing next is on the auditing procedure.

How do you produce a public audit process that is itself secure against possible attempts to infiltrate it and overlook security weaknesses? I suggest you probably need at least two independent and well-known (and trusted) experts, probably with support, to produce independent and public reports. Then you may need a separate independent committee to review those reports and draw attention to (and investigate) any discrepancies.

I see the involvement of many people as being essential in building a web of trust that can't be easily subverted. We should perhaps start to see support for auditing security software as being just as important as supporting the writing of the code. If we had as many people doing the former as the latter, we wouldn't be in this mess.

At the same time, we'll no doubt continue to rely on penetration testing by individual security researchers, as we know that regularly turns up obscure ways to defeat security. The idea of a bug bounty is a good one here, I think.

Just some random ideas, really, but I think this is a key area of trust that urgently needs attention.

2
0
Devil

Almost unsolvable problem.

Even if you have all the code, and reviewed all the code, you are still compiling it with a compiler in binary form.

So then you would need to compile the compiler again with another compiler, but you have that compiler in binary form too, so you can't trust that one either. It's a chicken-and-egg problem.

This problem is probably solvable by writing your bootstrap compiler and linker in assembler (have fun doing that, see ya in a few years), using that to compile the GNU C compiler, and then using that compiler to compile TrueCrypt.

But then again, you are still using I/O functions from your development OS... so maybe you need to recompile that first as well. Sigh...

I have an easier solution, if you don't want people to know some things, don't tell em, don't write it down, and certainly don't store it on a computer.

5
10
Silver badge

Re: Almost unsolvable problem.

"I have an easier solution, if you don't want people to know some things, don't tell em, don't write it down, and certainly don't store it on a computer."

And if you have a bad memory or the information's not easy to memorize (like random data--poor fit for Memory Theater), you're basically hosed?

Anyway, it is possible to set up some chain of trust. You just need to hand-assemble something that can process a few bits of assembler code, use that to create a means to do more of it, and build up from there. Or you can hand-disassemble one of the low-level steps, verify it, then use the verified tool. Then you can take on a compiler with assembler code and build on up. And you can do all this from a bare-bones OS or from a setup where direct access is used, bypassing the OS. Just saying there are ways that don't have to take years. Weeks, a month or two, maybe, but not necessarily years.

4
0
Silver badge

Re: Almost unsolvable problem.

I agree with Charles, the solution lies in the disassembly of the compiled code.

The encryption algorithm is known, so it should be possible to create a "theoretical" byte-code projection of the few functions that exist; these in turn should be compared to the compiled byte code.

After disassembling the publicly distributed binary, each individual assembly function should be compared to the logic of the original assembler or C code. The logic must match: no extra routines or unnecessary comparisons should be seen.

The main elements:

How many decrypting functions exist? More than one: red flag.

Which functions make calls to the decrypting function? Any more than one or two, start asking questions.

Which functions handle the secret key, and how do they handle it? Leading-space stripping, upper- or lower-casing, reversal, byte-code comparisons, individual byte comparisons, etc.: red flag.

What are the parameter types sent to the decryption function?

etc

etc

etc

I would agree, though, that it would be painstaking work, but it is definitely possible, especially when you have the original code. (A rough sketch of scripting one of these checks follows below.)

I would also put out a small mention of +ORC and his Martini-Wodkas...
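As a crude illustration of the "who calls the decrypting function" check, something like the following can be run over a disassembly listing. It is only a sketch: it assumes GNU objdump -d output piped in on stdin, and the symbol name decrypt_block is an invented placeholder, not an identifier from the TrueCrypt sources.

/* count_callers.c - rough first pass: pipe `objdump -d some_binary`
 * into this and it reports which functions contain call instructions
 * targeting a symbol of interest. The symbol "decrypt_block" is an
 * invented placeholder, not an identifier from the TrueCrypt sources. */
#include <stdio.h>
#include <string.h>

int main(void)
{
    const char *target = "<decrypt_block>";      /* assumed symbol name */
    char line[512];
    char current_func[256] = "?";
    int hits = 0;

    while (fgets(line, sizeof line, stdin)) {
        char name[256];
        /* Function headers look like: 0000000000401100 <foo>: */
        if (sscanf(line, "%*x <%255[^>]>", name) == 1) {
            strcpy(current_func, name);
            continue;
        }
        /* Call sites look like: ... call 401100 <decrypt_block> */
        if (strstr(line, "\tcall") && strstr(line, target)) {
            printf("call from %s\n", current_func);
            hits++;
        }
    }
    printf("%d call site(s) to %s\n", hits, target);
    return 0;
}

Piping objdump -d over a binary into that and eyeballing the list of callers is obviously no substitute for actually reading the disassembly, but it narrows down where to look.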

0
0
Silver badge
Big Brother

Re: Almost unsolvable problem.

Even if you have all the code, and reviewed all the code, you are still compiling it with a compiler in binary form.

In the real world, this is not actually seriously a problem. It can be defeated in theory....

Countering "Trusting Trust"

...and I actually don't think a compiler exists that has enough Swiss Army knife functionality to look out for a few dozen programs just to put backdoors into the crypto parts unseen. Ken Thompson's initial idea was to finagle the lowly "login" program, which sounds feasible. Finagling GPG etc. via that method sounds like it needs an AI module in the package.

4
0
Silver badge

Re: Almost unsolvable problem.

What makes you so confident your own memory is secure? There are plenty of ways to get people to talk. Bribery, blackmail, drugs, and let us not forget simple old-fashioned pain.

3
0
Silver badge

@Majid

"Even if you have all the code, and reviewed all the code, you are still compiling it with a compiler in binary form.

So then you would need to compile the compiler again with another compiler, but you have that compiler in binary form too, so you can't trust that one either. It's a chicken-and-egg problem."

Although not solely related to security, this approach is actually being used in FreeBSD (I only learned this pretty recently myself).

If you compile a FreeBSD kernel (which is part of the source code for the OS) or the entire OS itself, the first thing done is to compile the base components required for building. These get placed in /usr/obj, and from that moment on everything else is built using that new set of tools.

As mentioned this is mainly done for optimization and not so much security, but I suppose one could argue that this could provide a little(?) extra where trust in the build tools is concerned.

Still, in the end a bit off-topic, considering that FreeBSD doesn't use TrueCrypt but relies on gbde (GEOM Based Disk Encryption) and the geli cryptographic subsystems (though TrueCrypt is available as a separate program as well).

0
0
Bronze badge
Go

poor memory

If I am CORRECT you can get on your high HORSE and use BATTERY to STAPLE it into your memory

http://correcthorsebatterystaple.net/

(By the way I got this as an answer:

Bare-Miss-Puzzle-Pile-3

It sounds distinctly like the coder had a sense of humour)

Obligatory http://xkcd.com/936/
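To be fair, you don't need a third-party web site for this at all. The xkcd 936 scheme is just "pick a handful of words uniformly at random from a big list", which gives roughly N x log2(list size) bits of entropy. A rough sketch follows; the eight-word list is only there to keep the example short (3 bits per word), so a real list needs to run to thousands of words.

/* passphrase.c - xkcd-936 style passphrase: four words chosen
 * uniformly at random from a word list. The eight-word list below is
 * only a placeholder to keep the example short (3 bits per word);
 * a real list should contain thousands of words. */
#include <stdio.h>

int main(void)
{
    static const char *words[] = {
        "correct", "horse", "battery", "staple",
        "bare", "miss", "puzzle", "pile"
    };
    const unsigned nwords = sizeof words / sizeof words[0];

    FILE *rnd = fopen("/dev/urandom", "rb");
    if (!rnd) {
        perror("/dev/urandom");
        return 1;
    }

    for (int i = 0; i < 4; i++) {
        unsigned char b;
        if (fread(&b, 1, 1, rnd) != 1)
            return 1;
        /* 256 is divisible by 8, so b % nwords introduces no bias here. */
        printf("%s%s", words[b % nwords], i < 3 ? "-" : "\n");
    }
    fclose(rnd);
    return 0;
}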

2
0

Re: Almost unsolvable problem.

If you want a horror story, look at what Ken Thompson did inside Bell Labs, and this is probably child's play compared to what is possible now, 30+ years later.

An early Unix C compiler contained code that would recognise when the login command was being recompiled and insert some code recognising a specific password, allowing him in.

He also made the compiler recognise when it was compiling a version of itself, and automatically insert the code to do all this again. Having done this once, he was then able to recompile the compiler from the original sources; the hack perpetuated itself invisibly, leaving the back door in place and active but with no trace in any of the source code.

Details are published in "Reflections on Trusting Trust", Communications of the ACM 27, 8 (August 1984), pp. 761-763 (text available at http://www.acm.org/classics/).
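For anyone who hasn't read the paper, the trick can be sketched as a toy. The following is not the real compiler, just an illustration of the shape of the attack: a "compiler" that is really only a source-to-source copier, and the names (check_password, the magic string, toy_cc) are invented for the example.

/* toy_cc.c - a deliberately evil "compiler" pass, sketched after Ken
 * Thompson's "Reflections on Trusting Trust". It is only a
 * source-to-source copier, not a real compiler, and the names
 * (check_password, the magic string, toy_cc) are invented for the
 * illustration. */
#include <stdio.h>
#include <string.h>

/* Trap 1: the line smuggled into the login program. */
static const char *login_backdoor =
    "    if (strcmp(pw, \"kt-magic\") == 0) return 1; /* injected */\n";

/* Trap 2: the line smuggled into the compiler itself. In the real
 * attack this is a quine-like blob that reproduces the whole injector,
 * so the backdoor survives a rebuild from perfectly clean sources. */
static const char *self_backdoor =
    "    /* injected: re-insert both backdoors when recompiling */\n";

int main(int argc, char **argv)
{
    if (argc != 2) {
        fprintf(stderr, "usage: %s file.c\n", argv[0]);
        return 1;
    }
    FILE *in = fopen(argv[1], "r");
    if (!in) {
        perror(argv[1]);
        return 1;
    }

    char line[1024];
    while (fgets(line, sizeof line, in)) {
        fputs(line, stdout);                   /* normal "compilation" */
        if (strstr(line, "int check_password("))
            fputs(login_backdoor, stdout);     /* compiling login? */
        if (strstr(argv[1], "toy_cc") && strstr(line, "char line[1024];"))
            fputs(self_backdoor, stdout);      /* compiling itself? */
    }
    fclose(in);
    return 0;
}

The nasty part, which this sketch only gestures at with a comment, is that the self-injection blob is written as a quine, so the compiler binary keeps re-infecting every future compiler built from pristine sources, with nothing to see in any of them.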

1
0
Silver badge

Re: Almost unsolvable problem.

I was about to comment on the same thing myself. It hardly seems paranoid, when it has already been publicly described, does it?

http://scienceblogs.com/goodmath/2007/04/15/strange-loops-dennis-ritchie-a/

0
0
Anonymous Coward

Let me save you some time and money.

If TrueCrypt is an American company, look elsewhere.

2
5
Silver badge

Re: Let me save you some time and money.

Certain countries on the Mediterranean shall be included in this set, then.

4
0
Anonymous Coward

Re: Let me save you some time and money.

Actually that doesn't save time at all, because nobody knows who the heck they are. This could either be a good thing or a bad one.

0
0
Silver badge

Disband the NSA

One way to make things NSA-proof is to disband the NSA.

We are funding a campaign of terror against our own citizens. It will be sure to stop (at least from the NSA) if it no longer exists.

6
2
Anonymous Coward

Re: Disband the NSA

"It will be sure to stop (at least from the NSA) if it no longer exists."

So?

Same people funding, same people spying, same game, different name.

On the other hand if the people at the top were to get locked up for their crimes (after due process of course, e.g. 5 years in Guantanamo), that *might* encourage others not to start up in the same game.

5
1
Bronze badge

Re: Disband the NSA

btrower: "We are funding a campaign of terror against our own citizens."

And you're freely funding a campaign of paranoia. Have you stopped to think that maybe you're working in the same direction as the NSA?

1
7
Silver badge

Re: Disband the NSA

I have to say it's kind of sweet that many Americans seemed to have so much faith that their spy agencies weren't spying on them all along. I can see why recent revelations rattled so many of you, and it must have been genuinely quite nice to believe the assurances and platitudes for so long.

I've always assumed that traffic was routinely monitored not so much through paranoia (by all means watch my traffic, really don't care) but more... well they would, wouldn't they? They exist to spy, they are accountable to get best value for money, this way is cheap and effective compared with building up country-wide informant networks, probably nicer for the population too by and large. Pay people to spy, that's what they will do.

5
0
Silver badge
Bronze badge

Re: Or we could give up and just OBEY?

"who am I to disbelieve them?"

Well, hopefully someone who read the paper he edited. Why would you assume that the security services were any more trustworthy or competent than any other government department?

1
0
Silver badge

wait a minute

I thought part of the NSA's mission is supposedly to be protecting Americans. How is it protecting Americans by weakening their privacy and protection from adversaries by weakening the tools they use for protection? OK, fine, I get it: their only mission is protecting the US government from the people, but even then I bet some of their subterfuge has hurt some of the other government departments as well. Just the fact that an ex-general is running the program pretty much explains everything.

1
1
Gold badge
Unhappy

Re: wait a minute

"I thought part of the NSA's mission is supposedly to be protecting Americans. How is it protecting Americans by weakening their privacy and protection from adversaries by weakening the tools they use for protection? "

Easy.

"We had to destroy their privacy and sense of safety in order to keep their information more private and their lives more safe."

But IRL "Why should we give a s**t how the American people feel? We didn't bother asking them when we took their privacy in the first place. "

1
1
Bronze badge

Re: wait a minute

"Just the fact an ex general is running the program pretty much explains everything."

The NSA is part of the Department of Defense, which is why a general is running it. It was a spin-off of the various spy agencies created in WW2.

They aren't specifically afraid of American citizens, but they don't trust them either. There have been a handful of terrorists who were American citizens, so now the NSA et al. are operating under the assumption that anyone could be a terrorist, and thus don't trust anyone unless they have a security clearance that has been verified.

2
1
Anonymous Coward

Re: wait a minute

The NSA's mission is to protect Americans from Americans.

Rather like the way their Constitution allows Guns to do the same.

2
1
Bronze badge
Black Helicopters

Re: wait a minute

I thought part of the NSA's mission is supposedly to be protecting Americans. How is it protecting Americans by weakening their privacy and protection from adversaries by weakening the tools they use for protection?

"We had to destroy that village in order to save it."

3
0
Anonymous Coward

Very odd...

The Windows version appends 64K of encrypted, supposedly 'random' data to the end of a TrueCrypt volume, whereas the Linux version pads it with 64K of encrypted zeros (which is presumably verifiable by decrypting them)?

The TrueCrypt team are meant to be encryption specialists... have they never heard of 'nothing up my sleeve' numbers? Or don't they realise that encrypted zeros are indistinguishable from encrypted random numbers?

Definitely something funny going on there. I'd be interested to hear their explanation for the difference in behaviour, as well as why they thought it necessary, and whether they appreciate that unexplained data attached to a volume, data which isn't verifiably information-free, is likely to cause concern in today's environment.
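Eyeballing what that trailing block actually looks like on disk is at least cheap to do. A rough sketch follows; there is nothing TrueCrypt-specific about it, and the 64K figure is simply the number quoted above. Note that it cannot tell encrypted zeros from encrypted random padding, since both should look uniformly random; all it can flag is literal zeros or other obvious structure.

/* tailcheck.c - read the last 64K of a file and report how random it
 * looks: a count of zero bytes plus a crude byte-entropy estimate.
 * Properly encrypted data, whether it encrypts zeros or random
 * padding, should sit very close to 8 bits per byte. */
#include <math.h>
#include <stdio.h>

#define TAIL (64 * 1024)

int main(int argc, char **argv)
{
    if (argc != 2) {
        fprintf(stderr, "usage: %s container\n", argv[0]);
        return 1;
    }
    FILE *f = fopen(argv[1], "rb");
    if (!f || fseek(f, -TAIL, SEEK_END) != 0) {
        perror(argv[1]);
        return 1;
    }

    static unsigned char buf[TAIL];
    if (fread(buf, 1, TAIL, f) != TAIL) {
        fprintf(stderr, "short read\n");
        return 1;
    }
    fclose(f);

    long freq[256] = {0}, zeros = 0;
    for (long i = 0; i < TAIL; i++) {
        freq[buf[i]]++;
        if (buf[i] == 0)
            zeros++;
    }

    double entropy = 0.0;
    for (int b = 0; b < 256; b++) {
        if (freq[b] == 0)
            continue;
        double p = (double)freq[b] / TAIL;
        entropy -= p * log2(p);
    }

    printf("zero bytes:   %ld of %d\n", zeros, TAIL);
    printf("byte entropy: %.3f bits per byte (ciphertext should be ~8)\n", entropy);
    return 0;   /* link with -lm */
}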

11
0

Re: Very odd...

Very worrying

1
0
Bronze badge
Trollface

Re: Very odd...

" have they never heard of 'nothing up my sleeve' numbers?" - you mean like Pi? I don't trust that one either, I've heard the NSA has its hand in geometry too. They say Archimedes was onto them, and they were the ones to get him killed...

0
0
Anonymous Coward

Re: Very odd...

So I'm looking at the format spec and it looks like the last thing in the file should be the backup hidden volume header. If no hidden volume exists, it should be random data. (This is to provide plausible deniability for volumes that DO have one.) But you're saying that on Linux it's zeros instead? I'm not sure what that points to exactly, but it's certainly odd, and sloppy.

0
0
Bronze badge
Windows

Did I miss something?

'The 'problem' with TrueCrypt is the same problem we have with any popular security software in the post-September-5 era'

Sorry to change the subject from all this intelligent chit-chat about cryptography but what happened on September 5th? Sainsbury's delivered my shopping nice and early for a change but I'm sure that can't be it.

1
0
Bronze badge

Re: Did I miss something?

I think maybe that was when Snowden started publishing stuff?

0
0
Anonymous Coward

Re: Did I miss something?

Nothing changed when Snowden started talking. The NSA is what you saw. The shadow government(s) you do not see, because you don't want to. It's funny to see everyone shocked and scrambling. It's laughable and pathetic. You'd rather be ignorant. So when they shut down the NSA, will everything be all rosy again? Perhaps you should be a little less simple-minded and recognise the pattern. It's not going to change any time soon. Get used to it.

2
1
Silver badge

Re: Did I miss something?

" It's not going to change any time soon. Get used to it."

Sent to this forum by your friends in the intelligence community.

1
0
Anonymous Coward

Hows about?

Linux malloc nulls the returned block of memory before returning, whereas the Windoze equivalent doesn't?

What is this block of memory used for?

2
0
Bronze badge

Re: Hows about?

Windows gives out zero'ed blocks of memory, so that can't be it, unless they wrote their own memory allocation function.

It could be a decryption key or it could be a checksum, a disk signature, or a copy of track 0.

0
0
Bronze badge

Re: Hows about?

"Windows gives out zero'ed blocks of memory, "

*Windows* may give out zeroed blocks of memory - I don't recall because it has been a long time since I needed to allocate memory directly from Windows itself(*) - but so what? malloc() is not a system call, especially on Windows. It is part of the C runtime library, and on Windows it normally carves up big blocks of memory allocated from the OS. But, being malloc(), it doesn't do anything special with the contents (except in debug builds, where they are often poisoned with some arbitrary value - Visual C++ uses 0xCD if memory serves, to produce a 32-bit (or 64-bit on Win64) value that can't be used as a valid pointer to anything.(**)). In particular, except in the poisoning case, it doesn't vape the previous contents...

(*) It is even longer since I relied on the contents of a memory block allocated by malloc() being anything but 'uninitialised'. If you want guaranteed values in your allocated memory, use calloc().

(**) OK, when running a Win32 program on Win64, with certain marks on the .EXE, pointers could be from anywhere in the 32-bit address space, but this value was chosen when Win32 was king and there wasn't any Win64 on x86 because there wasn't any x64. At that time, .EXEs marked appropriately could have access to 3GB of address space, and 0xCDCDCDCD is in the fourth GB...
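Coming back to the malloc()/calloc() point: the difference is easy to demonstrate. A small sketch follows; exactly what garbage malloc() hands back is of course unspecified (on a debug MSVC build you would expect the 0xCD fill mentioned above), which is rather the point.

/* allocdemo.c - malloc() makes no promise about contents; calloc()
 * guarantees zeros. What malloc() actually hands back depends on the
 * runtime: fresh pages from the OS are typically zeroed, recycled
 * heap blocks are not. (Strictly, reading the malloc'd bytes below is
 * reading indeterminate values; it is done here purely to illustrate.) */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

static void dump(const char *label, const unsigned char *p, size_t n)
{
    printf("%-14s", label);
    for (size_t i = 0; i < n; i++)
        printf(" %02x", p[i]);
    putchar('\n');
}

int main(void)
{
    /* Dirty a heap block, then free it, so a later malloc() may recycle it. */
    unsigned char *scratch = malloc(64);
    if (scratch) {
        memset(scratch, 0xAA, 64);
        free(scratch);
    }

    unsigned char *m = malloc(64);    /* contents indeterminate */
    unsigned char *c = calloc(64, 1); /* guaranteed all zeros */
    if (!m || !c)
        return 1;

    dump("malloc(64):", m, 16);
    dump("calloc(64,1):", c, 16);

    free(m);
    free(c);
    return 0;
}

Build it with any C compiler and run it a few times; whether the malloc() line shows the 0xAA pattern, heap bookkeeping, or zeros depends entirely on the allocator in use.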

3
0
Bronze badge
Thumb Up

Paying for it.

If it was legit, Truecrypt could easily afford the code review. If they volunteered the 25K, that would go a long way to reassuring me there was nothing to hide.

1
2
Silver badge

Re: Paying for it.

From the billions they receive from giving away free open-source software?

5
1

Re: Paying for it.

Or maybe from the billions they get from the US government for giving away free open-source software?

That sort of review might not end up being very independent, might it?

0
0
Bronze badge

sounds good

I'll donate.

1
0
Big Brother

It only takes one

One NSA-friendly insider is all it takes to break into the vault. Instead of centralized systems that big government loves, we should rely on our own localized systems.

1
0
Black Helicopters

Dependencies

Building TrueCrypt from the tarball TrueCrypt_7.1a_Source.tar.gz comes with several requirements, inter alia:

RSA Security Inc. PKCS #11 Cryptographic Token Interface (Cryptoki) 2.20 header files (available at ftp://ftp.rsasecurity.com/pub/pkcs/pkcs-11/v2-20).

An audit should take this into account. BTW FTP is blocked by my firewall by default.

2
0
Anonymous Coward

TC's silence

This is a no-brainer to me. If the compiled binary TC supply has an extra 64K block of data that neither the source code nor TC can or will explain, then by definition TC cannot be trusted.

TC's silence about this matter proves to me that either:

A) they don't understand why trust and openness are paramount in a security tool, or

B) they have something to hide.

TC have just invalidated themselves through their silence. I don't have to be some sort of security expert or even a programmer to conclude this - it's common sense.

5
0
