Neal Ford's Software Development West presentation, 10 Ways to Improve Your Code, was aimed at Java programmers, but Ford's "advanced code hygiene" discussion had wisdom for coders of many stripes. Ford is a senior application architect and "meme wrangler" at ThoughtWorks, an IT consultancy that specializes in development and …
"Antiobjects" is a very stupid name, isn't it? Makes me think I should be programming in "C"...
I think we used to call this "thinking outside the box" and "turning the problem round".
I also get sceptical of a paradigm which has only one example repeated over many, many websites and forums: "how to write Pac-Man". Next time I have to write Pac-Man I'll think about doing it the way these people (this person?) suggest. But then maybe I'll fix a bug in Accounts Receivable code instead.
(Besides that, a pleasant article. I suppose I'm annoyed about the terminology, not the philosophy.)
Grateful to people like you who find time to help others by providing a repository for them.
Thanks and regards,
what about the old favourites?
Hmm, how things have changed. I was always taught to comment my code, write simple, clear and well-structured source, and reuse existing functionality whenever possible.
It looks like these good practices are no longer needed.
I thought Neal Ford raised some good points here, until I read that he calls himself a "Meme Wrangler".
Now I just think he's a pretentious tosser.
The use of FindBugs is good advice though - as I detailed here on RegDeveloper a while ago: http://www.regdeveloper.co.uk/2006/06/15/findbugs_pan/
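For readers who haven't tried FindBugs: below is a minimal sketch of the kind of defect it flags out of the box (its ES_COMPARING_STRINGS_WITH_EQ pattern). The class and method names are mine, purely for illustration.

```java
// The classic reference-vs-value comparison bug that FindBugs
// reports as ES_COMPARING_STRINGS_WITH_EQ.
public class StringCompare {
    // Buggy: == compares object references, not contents.
    static boolean sameBuggy(String a, String b) {
        return a == b;
    }

    // Fixed: equals() compares the actual characters.
    static boolean sameFixed(String a, String b) {
        return a.equals(b);
    }

    public static void main(String[] args) {
        String x = new String("hello"); // forces a distinct object
        System.out.println(sameBuggy(x, "hello")); // false -- the bug
        System.out.println(sameFixed(x, "hello")); // true
    }
}
```

The compiler accepts both versions without complaint, which is exactly why a static analyser earns its keep here.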
maybe, maybe not
"4. Avoid indulging in speculative software development. "The goal should be to build the simplest thing we need right now," he said. The practice increases software entropy, he added, which is a measure of code complexity."
Sometimes when coding a new system it's obvious that it will probably be extended to perform functions far beyond the original scope or requirements.
Planning ahead while developing a version 1 solution often reduces the time spent coding for later versions. I would therefore argue that speculative development often pays dividends when applied with careful forethought. I wouldn't suggest making things horrendously complicated purely on the basis of speculation, but adding a little extra complexity up front to cater for future scenarios can significantly cut coding timescales when developing new versions.
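As one illustration of "a little extra complexity up front": the sketch below leaves a single cheap seam (a one-method interface) so that later versions can swap pricing behaviour without touching the calling code. All names are hypothetical, not from the article.

```java
// Hypothetical version-1 design with one deliberate extension point.
interface PriceRule {
    double apply(double price);
}

public class Checkout {
    private final PriceRule rule;

    Checkout(PriceRule rule) { this.rule = rule; }

    double total(double price) { return rule.apply(price); }

    public static void main(String[] args) {
        // Version 1 needs only a flat rate. The interface costs one
        // extra type now, but version 2 can plug in discounts or
        // tiered pricing without modifying Checkout at all.
        Checkout v1 = new Checkout(p -> p * 1.25);
        System.out.println(v1.total(100.0)); // 125.0
    }
}
```

The point being argued above is that this seam is cheap enough to justify on speculation, whereas a full rules engine would not be.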
Or alternatively hire competent professionals.
It's quite simple, if they don't know what a regexp is, they AREN'T a professional software engineer. Ditto if they're keen to excessively abstract, or if they dive into implementation without properly analysing and planning.
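Since regexps come up as the baseline skill here, a minimal Java sketch using the standard java.util.regex API; the log format and names are invented for illustration.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Bread-and-butter regexp usage: pull a component name out of a
// (made-up) log line format.
public class LogScan {
    private static final Pattern ERROR_LINE =
        Pattern.compile("ERROR\\s+\\[(\\w+)\\]\\s+(.*)");

    // Returns the component name from a log line, or null if no match.
    static String component(String line) {
        Matcher m = ERROR_LINE.matcher(line);
        return m.matches() ? m.group(1) : null;
    }

    public static void main(String[] args) {
        System.out.println(component("ERROR [billing] invoice 42 failed"));
        // -> billing
    }
}
```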
The notion of choosing a platform is ludicrous. There might be some hyper-elegant syntax for doing something in MATLAB or Perl, but if you can't purchase a license, or if management won't have it because there's no-one else to support it, then you're out of luck.
As for the "back alley" of reflection in Java, the feature was introduced specifically so that Java could host its own debugging tools, and other uses are firmly discouraged in the API documentation. Production functionality that depends on reflection will be the first thing to break during the legacy phase, and will be difficult to understand and fix.
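To make that brittleness concrete, here is a sketch of reflection-dependent code: the field name is just a string, so renaming the field still compiles and only fails at runtime. All names are mine, invented for illustration.

```java
import java.lang.reflect.Field;

// Why reflection in production code is fragile: the compiler cannot
// see the "balancePence" string, so a rename breaks this only at runtime.
public class ReflectPeek {
    static class Account {
        private final long balancePence = 1500;
    }

    static long peek(Object target, String fieldName) {
        try {
            Field f = target.getClass().getDeclaredField(fieldName);
            f.setAccessible(true); // bypasses encapsulation entirely
            return f.getLong(target);
        } catch (ReflectiveOperationException e) {
            throw new IllegalStateException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(peek(new Account(), "balancePence")); // 1500
        // peek(new Account(), "balance") compiles just as happily,
        // and throws NoSuchFieldException only when it runs.
    }
}
```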
These guidelines might be good for getting something working now, but they'll contribute to a mass of short-sighted, unmaintainable and ultimately doomed legacy applications. And I for one hope I never have to maintain or port any of this guy's code.
Some other OO metrics are useful
There is other useful information that you can get from static code analysis - looking for large methods and classes, numbers of class references, cyclomatic complexity. There are a number of tools that do this - there used to be one on sourceforge, there are also relatively cheap commercial products that have been around for a while, like Virtual Machinery's JHawk and the Semantic Designs tools.
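As a toy illustration of one such metric: the sketch below approximates cyclomatic complexity as 1 plus the number of branch keywords in a method body. Real tools like JHawk work on the parsed AST; this string count is illustration only, with invented names.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Deliberately crude cyclomatic complexity estimate: one path through
// straight-line code, plus one per branch keyword found in the source.
public class CrudeMetrics {
    private static final Pattern BRANCH =
        Pattern.compile("\\b(if|for|while|case|catch)\\b");

    static int cyclomatic(String methodBody) {
        Matcher m = BRANCH.matcher(methodBody);
        int complexity = 1;
        while (m.find()) complexity++;
        return complexity;
    }

    public static void main(String[] args) {
        String body = "if (x > 0) { for (int i = 0; i < x; i++) total += i; }";
        System.out.println(cyclomatic(body)); // if + for -> 3
    }
}
```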
Either 8 or 9. Pick one.
Or are we supposed to become experts at a wide variety of languages now?
I always understood the wisdom to be "Familiarise yourself with as many languages as you can, and get as proficient as you can with a very small number.". Also, in any multi-lingual system, there's always the risk of a semantic gap between what one language thinks it has said and the other language thinks it has heard.
In other words, 9, not 8. FWIW, I've been hearing "experts" predict the death of monolingualism for as long as I've been programming, but funnily enough the phenomenon is still with us. Perhaps there's a good reason for that, which has been ignored by those who don't have to earn a living from working programs.
Re: pretentious tosser.
Well, yeah, but the politically correct term is expert business consultant.
Maybe I live under a rock, but I have never heard of this guy. Some of his advice isn't terrible, but actually most of it is contradictory, short-sighted, and generally, well, terrible. It seems like the stuff a guy planning a LONG and fruitful life as a per-hour consultant would LOVE management to buy into, while the troglodyte employees just don't get it.
One thing I have to mention as a specific: how do you reconcile exploring the backwaters of the language with writing simple code? It all sounds like that C contest to write an entire program in one line.
Also I just love his let the future mind itself philosophy: "so what if we have to scrap it in 3 months because it is anti-robust sh*te we threw together shortsightedly. Think how fast V1 shipped!"
Yeah, Pretentious Tosser. I can't improve on that!!
"Hmm, how things have changed. I was always taught to comment my code. Write simple clear and well-structured source and to reuse existing functionality whenever possible.
It looks like these good practices are no longer needed."
-- Actually, they're the *same* practices -- they've merely "made the problem hard".
Can I say b*llocks here?
When I read that article, I just thought what a load of.... well you know. It just seemed like a typical sales/management speak list of bullet points that didn't really say much?
Maybe I've had a bad day. But I grimaced muchly!
It didn't interest me and now I've gone and commented about it to waste my time further. What a loser.
Paris - save me....
He should actually read what Fred Brooks writes
Saying we should "simplify essential complexity" shows he either hasn't read what Fred Brooks wrote or completely misunderstood it.
The essential complexity is the complexity of the problem domain and, by definition, cannot be simplified.
The accidental complexity can be addressed, but certainly cannot be "killed". Computer programming is never going to be completely straightforward as long as we have to obey the laws of physics and economics.
Tests first: wisdom of the ancients
"1. Write the tests before writing the code."
Not new, not novel. In fact, old hat.
A variation on this, or perhaps it's a refinement, is that test cases must go hand in hand with the predicted results. None of this "let's do this and see what happens" nonsense. If you don't know what's supposed to happen, then you don't understand the software well enough to be testing it.
A virtue of this refinement is that the results of any test are either what you expected or aren't. But of course "results not what was expected" doesn't necessarily mean there's a bug. It may be that the specification on which test cases are based is ambiguous or incomplete or contradictory or just plain hard to understand. Whichever it is, if the testers can't get it right, you can be pretty sure the programmers had trouble too.
Heart, because I love being retired and not having to do this stuff for a living anymore.
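The "predict the result first" discipline above can be sketched with plain asserts standing in for a real framework like JUnit (run with java -ea to enable them); the VAT example and all names are invented.

```java
// The expected values are written down before the code under test is
// ever run, so each outcome is unambiguously pass or fail.
public class VatTest {
    // Code under test: VAT-inclusive price in whole pence.
    static long withVat(long pence) {
        return pence + pence / 5; // 20% VAT
    }

    public static void main(String[] args) {
        // Predicted results, decided in advance.
        assert withVat(100) == 120 : "expected 120";
        assert withVat(0) == 0 : "expected 0";
        // Spec ambiguity surfaces immediately: should withVat(99)
        // truncate to 118 or round to 119? The spec must say.
        System.out.println("all predictions held");
    }
}
```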
Wot no Cobertura?
No mention of the most important tool in the java toolbox: Cobertura.
If Cobertura is not your most important tool then no amount of the other bullet points will help you.
IT? 'cos there was not a single one of his 1-10 that wasn't wrong, old hat or normal working practice.
ThoughtWonks needs a SLAP.
anti-objects = tools?
Never heard of anti-objects before. What I have found is that it is sometimes better to have classes which act as tools to do a job, manipulating other objects, rather than the more traditional the-object-knows-how-to-do-it-itself approach. But then a tool is a kind of real-world object.
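A sketch of the tool-class style described above: the logic lives in a separate class that manipulates Invoice objects, rather than in a method on Invoice itself. All names are invented for illustration.

```java
import java.util.Arrays;
import java.util.List;

public class ToolStyle {
    // A dumb value object: it holds data and knows nothing else.
    static class Invoice {
        final long pence;
        Invoice(long pence) { this.pence = pence; }
    }

    // The "tool": it knows how to total invoices, keeping that job
    // out of Invoice itself.
    static class InvoiceTotaller {
        long total(List<Invoice> invoices) {
            return invoices.stream().mapToLong(inv -> inv.pence).sum();
        }
    }

    public static void main(String[] args) {
        List<Invoice> batch =
            Arrays.asList(new Invoice(1000), new Invoice(2500));
        System.out.println(new InvoiceTotaller().total(batch)); // 3500
    }
}
```

As the comment says, the tool is itself a kind of object, so the two styles are less opposed than the naming suggests.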
@AC ("Maybe, maybe not")
I think the point is to avoid writing more code than is needed Right Now; the old Einstein-attributed "solutions should be as simple as possible but no simpler" applies just as much as common sense does. "Common sense", as AC indicated, is to have some sense of the entire life cycle of the code you're working on; if you have good reason to believe that Feature X will need to be added at some point but a) it's not needed now, b) taking what will be needed to support Feature X into consideration during the current design is cheaper than refactoring later, and incidentally c) thinking ahead doesn't introduce negatives like dead code or performance penalties into the current system, then go for it. If any of these preconditions doesn't apply, then keep your eyes on the ball that's being pitched at you now and refactor later.
As somebody else noted, 'wisdom of the ancients'. One thing I like about the whole TDD/XP way of doing things is that it reinforces a lot of the habits us OldTimers(TM)(R)(LSMFT) built up back in the day, that whippersnappers with their "Don't Worry, Be Crappy" attitude tried to throw under the wheels.
Maybe you had to be there
I watched this presentation at DevWeek 2008 last week (slightly modified for a .NET audience), and it was worthwhile. Neal talked for an hour and a half, so what you're reading in this article is necessarily a simple précis; he explains and justifies most of his bullet points very well.
Regarding 8 and 9, they're not mutually exclusive: you're going to have a primary programming language for constructing any given system, but it often makes sense to drop into another for certain purposes. I've always used C/C++ for better performance or Win32 API access. I'm going to learn F# because it does functional stuff better than C#, and probably Ruby because it looks right for some of the quick and dirty stuff. A lot of the time, learning other languages gives you some new ways of thinking about your primary language.
Regarding 4, the odds against you actually successfully predicting what you're going to need in 6 months are, as a rule, astronomically high. Obviously nobody would advocate writing code that you know you're going to have to completely rewrite in the next cycle, but it's perfectly sensible to write code that does only exactly what is currently needed, but in such a way that suspected requirements can be added easily by deriving, decorating or altering implementation.
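One common shape for "suspected requirements added by decorating" is the classic decorator pattern; the sketch below uses invented interface and class names and is only an illustration of the point, not code from the talk.

```java
// Version 1 does exactly what is needed now...
interface Notifier {
    String send(String message);
}

class EmailNotifier implements Notifier {
    public String send(String message) { return "email: " + message; }
}

// ...and a later requirement (auditing) arrives as a decorator in the
// next cycle, without EmailNotifier being touched.
class LoggingNotifier implements Notifier {
    private final Notifier inner;
    LoggingNotifier(Notifier inner) { this.inner = inner; }
    public String send(String message) {
        return "[logged] " + inner.send(message);
    }
}

public class DecoratorDemo {
    public static void main(String[] args) {
        Notifier n = new LoggingNotifier(new EmailNotifier());
        System.out.println(n.send("build broke"));
        // -> [logged] email: build broke
    }
}
```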