
Is old code automatically good code?

The "old dog&" in this case is Marcus Ranum, inventor of the proxy firewall and the man who implemented the first commercial firewall product. He’s now the CSO at Tenable Network Security, the company that produces the Nessus security scanner, and author of the book The Myth of Homeland Security. Marcus Ranum He's also on …

COMMENTS

This topic is closed for new posts.

Choice of language also matters

Many of the problems mentioned in the article (and often cited as weaknesses exploited by malware) are tied to using a language (such as C) that does not guard against exceeding the declared bounds of a data structure, following null pointers, following pointers to data that has been deallocated and then overwritten with new data, and so on.
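
To make the last of those error classes concrete, here is a minimal sketch (mine, not from the article; I use Rust so the dangling pointer can be shown without actually committing undefined behaviour):

```rust
fn main() {
    let dangling: *const i32;
    {
        let local = 42;
        dangling = &local as *const i32; // raw pointer into this stack frame
    } // `local` is deallocated here; `dangling` now points at dead memory

    // In C, dereferencing the equivalent pointer compiles silently and is
    // undefined behaviour. In Rust the dereference can only be written
    // inside an `unsafe` block, which marks the site for reviewers and
    // analysis tools:
    //
    // unsafe { println!("{}", *dangling); } // UB if uncommented

    println!("dangling pointer address: {:p}", dangling); // safe: address only
}
```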

Many such errors can be (and are) caught by tools such as those mentioned, but these tools often give only partial guarantees, or report so many false positives that people stop using them. (I suspect the author didn't consider the false positives reported for his old code a problem because he codes more cleanly than the majority of C coders.)

Hence, I believe you should not write safety-critical software in a language that does not, by default, guard against such errors: ideally at compile time, but failing that, at least at run time. So:

- Pointer types should not, by default, include null pointers, and type checking should verify that you never assign null to such a pointer. You might allow explicit inclusion of null pointers (making it a different kind of pointer type), but the code would then have to perform a null test before following such a pointer or casting it to a non-null pointer type (the sketch after this list shows the idiom).

- Array and string bounds should be checked when they are accessed. Many of these checks can be eliminated at compile time.

- Better yet, libraries and the like should be built around unbounded data structures such as lists and trees instead of fixed-size structures such as arrays. Or arrays should extend automatically when elements are added outside their original bounds.

- When data is deallocated, it should (preferably statically) be verified that no live pointers into it remain. This means, for example, no pointers to variables on stack frames unless those pointers are verified not to escape the lifetime of the frame. Additionally, heap-allocated data should either be deallocated automatically, or manual deallocation should be checked (by region inference or some such).
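
To illustrate all four points at once, here is a minimal sketch in safe Rust (my illustration; Rust grew out of the same line of work as Cyclone, mentioned below), showing non-null-by-default references, checked bounds, a growable array, and a statically rejected dangling reference:

```rust
fn main() {
    // 1. Non-null by default: a plain &i32 can never be null, so it needs
    //    no test. Nullability is opted into with Option, and the type
    //    checker forces the null case to be handled before use.
    let maybe: Option<&i32> = None;
    match maybe {
        Some(p) => println!("value: {}", p),
        None => println!("no value"), // the null case cannot be forgotten
    }

    // 2. Bounds-checked array and string access: indexing panics on an
    //    out-of-range index instead of corrupting memory, and .get() turns
    //    the check into an Option the caller must inspect.
    let xs = [10, 20, 30];
    match xs.get(7) {
        Some(v) => println!("in bounds: {}", v),
        None => println!("index 7 is out of bounds"),
    }

    // 3. Growable structures instead of fixed-size arrays: Vec extends
    //    itself as elements are pushed, so there is no bound to overrun.
    let mut ys = vec![1, 2, 3];
    ys.push(4);
    println!("{:?}", ys);

    // 4. No live pointers into deallocated data, checked statically: the
    //    borrow checker rejects any reference that outlives its target.
    //    Uncommenting this block is a compile-time error, not a crash:
    //
    // let escaped: &i32;
    // {
    //     let local = 5;
    //     escaped = &local; // error[E0597]: `local` does not live long enough
    // }
    // println!("{}", escaped);
}
```

Note that in points 1 and 4 the guarantees are established by the type checker: a plain &i32 never needs a run-time null test, and the dangling reference never makes it past compilation, which is exactly the "ideally at compile-time" case above.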

And so on. Examples of languages on the right track in this respect include most functional languages, but also more "traditional style" languages like Cyclone and Spec#.


Not just language

Well, I rather agree with Torben - I like SPARK (http://www.regdeveloper.co.uk/2006/09/20/high_integrity_software/). But you could argue that the machine/OS architecture should distinguish between code and data - as in the AS/400/iSeries - and keep users isolated from system functions in separate address spaces - as in z/OS mainframes.

But, for whatever reason (cost?), in practice we are currently using less secure architectures and languages for non-safety-critical systems - and we need to get the defects out of this code.

Whether we should also use more inherently secure systems and languages (and I think that Moore's Law says we can probably afford to do so more often these days) is another article....

Don

This is why...

...it's important to re-evaluate active software regularly. Just because "it passed the audit 2 years ago" doesn't mean it doesn't have flaws.
