The Duqu malware that targeted industrial manufacturers around the world may have been spawned by a well-funded team of competent coders, but their command of Linux led to some highly amateur mistakes. According to a report published on Wednesday by researchers from Kaspersky Lab, the unknown attackers attempted a global …
Not necessarily ignorant, but definitely overconfident.
History logging is controlled by environment variables that are set in one of the files bash reads when it starts up. Which files are read depends on how bash is started. That choice of files changed at some point, and the documentation took a while to catch up. Failing to disable bash history on the first attempt when coming in via ssh is an easy mistake to make. The embarrassing mistakes are not checking that history was actually disabled, and getting it wrong on an active machine instead of rehearsing the procedure somewhere no-one would notice.
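For the curious, a minimal sketch of how you'd actually kill history for a session (assuming bash; all three lines matter, since any one alone can be undone by how the shell was started):

```shell
# Stop this session being written to ~/.bash_history.
unset HISTFILE                      # nowhere to write history at exit
HISTSIZE=0                          # keep nothing in memory either
set +o history 2>/dev/null || true  # stop recording at all (no-op on shells without it)

# The step apparently skipped here: verify it worked BEFORE doing
# anything sensitive, rather than assuming it did.
[ -z "${HISTFILE:-}" ] && echo "history disabled"
```

The verification line is the whole point: checking costs one command, and not checking is exactly the mistake the article describes.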
The man page for sshd_config is over 30 pages long. There are some changes I could make to that file from memory, but if I were working on a remote machine, where the version of ssh could differ from my local machine, I would read the manual on the remote machine. A mistake might make it very difficult to log back in and fix.
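The careful habit being described can be sketched as "never install a config you haven't validated". `safe_install` below is a hypothetical helper, not a standard tool; for sshd_config the validator would be `sshd -t -f <file>`, which syntax-checks a config without touching the running daemon:

```shell
# safe_install NEW TARGET VALIDATOR...: replace TARGET with NEW only
# if VALIDATOR approves NEW; otherwise keep the current file.
safe_install() {
    new=$1; target=$2; shift 2
    if "$@" "$new"; then                       # validator approves?
        [ -f "$target" ] && cp "$target" "$target.bak"
        cp "$new" "$target"
        echo "installed $target"
    else
        echo "rejected: keeping current $target"
        return 1
    fi
}
# e.g.: safe_install /tmp/sshd_config.new /etc/ssh/sshd_config sshd -t -f
```

Even then, the old hands keep the existing ssh session open and test a fresh connection before logging out: the live session survives a broken daemon, a new one doesn't.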
Linux has dozens of ftp clients. If none of the ones I am familiar with were installed, I would have to read the manual to do anything unusual. Iptables is for setting up the firewall. The firewall is made up of many kernel modules, and new features arrive as new modules in most kernel releases. The syntax for an experimental module can change, so getting it wrong on the first attempt is almost certain. Getting it wrong on a remote machine is dangerous: you could easily add a rule that blocks ssh and make the machine very hard to fix remotely.
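The standard defence against locking yourself out with a firewall change is a dead man's switch: arrange for an automatic undo first, then make the change, then cancel the undo only after confirming you can still get in. A generic sketch of the pattern (the helper names are made up; for iptables the undo would be something like `iptables -F INPUT`):

```shell
# with_rollback DELAY UNDO...: run UNDO in the background after DELAY
# seconds unless confirm() is called first.
with_rollback() {
    delay=$1; shift
    flag=$(mktemp); rm -f "$flag"            # flag absent = not confirmed
    ( sleep "$delay"; [ -f "$flag" ] || "$@" ) &
    ROLLBACK_FLAG=$flag
}
confirm() {
    touch "$ROLLBACK_FLAG"                   # tell the watcher all is well
}
# e.g.: with_rollback 300 iptables -F INPUT
#       iptables -A INPUT -p tcp --dport 22 -j ACCEPT   # the risky change
#       ...open a FRESH ssh session to prove it works, then: confirm
```

If the new rule blocks ssh, you simply wait out the delay and the box lets you back in; if it works, `confirm` cancels the rollback.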
The easy way to wipe the disk on a local machine is to boot from a live CD and type 'shred /dev/sda'. On a remote machine, you need to copy the files for a minimal operating system into a ramdisk and use pivot_root to run shred from the ramdisk. This is something you have to practise at home until you get it right, because a partial failure will prevent any further remote access.
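An outline of that ramdisk procedure, wrapped in a function so nothing runs by accident. The device, sizes and paths are illustrative, and this is exactly the sort of thing to rehearse in a throwaway VM first, since it is destructive and unforgiving:

```shell
# remote_wipe: move a minimal userland into RAM, switch the running
# system's root onto it, then destroy the disk underneath. The tools
# survive the wipe because they live in the ramdisk, not on /dev/sda.
remote_wipe() {
    mkdir -p /mnt/ram
    mount -t tmpfs -o size=512m tmpfs /mnt/ram || return 1
    cp -a /bin /sbin /lib /mnt/ram/ 2>/dev/null   # minimal userland into RAM
    mkdir -p /mnt/ram/old_root /mnt/ram/dev
    mount --bind /dev /mnt/ram/dev
    cd /mnt/ram || return 1
    pivot_root . old_root            # the shell's root is now the ramdisk
    exec chroot . /bin/sh -c 'shred -v -n 1 /dev/sda'
}
# Deliberately not invoked here.
```

A partial failure anywhere in that sequence (the half-copied userland, the failed pivot) leaves a machine that is neither wiped nor reachable, which is the point about practising at home.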
PS: shred is not a good choice on an SSD. Upgrading the drive firmware should do the trick, but you still need to pivot_root into a ramdisk to ensure the firmware upgrade completes.
Upgrading the firmware wouldn't necessarily help, but TRIM would.
Uhh, so they installed a backdoored 5.8 version of SSH, which used the GSSAPIAuthentication config parameter as an internal flag to toggle an option in their backdoored version?
The plot thickens...
Let's face it, there are three suspects here, as for Stuxnet.
Ladies and Gentlemen, place your bets please.
You forgot one.
My money's on the Illuminati.
Missed one or two
5. Blame anon and live happily ever after.
My bet is on those not on the obvious list.
That would mean:
1/ Iran (revenge)
2/ Russia (nya-nya, we can do better than stuxnet)
There Should Be Much More Evidence!
If they have found parts of .bash_history, there could very well be fragments of /var/run/utmp and /var/log/wtmp left as well. And those could give away the originating IP addresses!
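If such a fragment were carved off the disk, standard tools could decode it directly: utmp records are fixed-size structs, so even a partial file yields whole entries, and the ut_host field is where an originating address would appear. A hypothetical sketch (the fragment path is made up):

```shell
# carved_logins FILE: decode a recovered wtmp fragment two ways.
carved_logins() {
    frag=$1
    last -f "$frag"        # human-readable login records, with remote host
    utmpdump "$frag"       # raw dump of every field (util-linux)
}
# e.g.: carved_logins /tmp/recovered_wtmp
```

Whether anything useful survives depends on how thoroughly the attackers scrubbed, which, given the history-file slip, is an open question.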
Kaspersky themselves are n00bs
Even Unix admins with 30 years' experience will habitually check the manual; if the thing is critical enough (and leaving yourself without sshd on a remote box is fairly critical) they'll double-check what they're about to do anyway. Failing to do this is the sign of an overconfident hack. This is less obvious to Linux admins, never mind "security researchers" from such a strong Windows background, both because the "scene" surrounding both has no truck with that sort of attitude, and because where most unices tend to have good-to-excellent manpage collections, Linux is notoriously not up to snuff.
All in all, the conclusions aren't backed up by the interpretation of the evidence, so this didn't teach us as much about the Duqu bunch as it did about the Kaspersky bunch.
I agree for the most part, but the comment on Linux manuals says more about you than the subject at hand.
Eh? Why wouldn't you just look up what you needed on your local web browser?
"the comment on Linux manuals"
I said "manpage collections". It's a detail, but important here. Even if I had said "manuals" I could provide some interesting examples, but I didn't. Debian is making inroads with the manpages, but only half-heartedly. The rest tends to carry the usual gn00 grub, which starts off with a snooty comment to the tune of having single-handedly declared manpages obsolete and you are to be readink info, comrade backward dinosaur, you. Oftentimes I would then run the recommended info command, only to be presented with the same text, now in a gn00tary info interface, for said info pages turned up absent.
I happen to find the info interface hateful, the attitude arrogant, and the end result utterly useless. In that, yes, it says something about me. But not in the way you're meaning to imply.
Why not the web browser?
You have trouble thinking of reasons why not? Here's a few possible ones:
Workflow. Typing on a terminal here. Much quicker to type "man $topic" than to pick up the mouse, fire up a browser or find an existing instance, open a new tab or window, click on a field, type a bit, wait for results, scroll through them, pick a few, look through them in succession, all the while repeating that shifting back and forth between keyboard and mouse. I'm a touch-typist, I need both hands at the keyboard, so having to use a mouse is a distraction.
And maybe a browser isn't even available. Doing system-y things here, meaning I might be working in a text-only environment and not have anything to run graphics on started/available/whatever. Perhaps I'm trying to fix a broken graphics driver. On certain other systems, that might easily turn the system into a doorstop. Here, it's fixable, but that sort of thing works better when I know how. That's what the manpages are there to help with.
Text-based browser, you say? Why go through that extra trouble when I can also have a standard tool show me a much more nicely rendered text representation for something that's meant to be shown that way?
The network might not be functional. The local host isn't set up yet, or the local network mightn't be. Maybe I'm somewhere without uplink at all. Or there might be proxies and content filters active, which should not but still could rear their ugly head.
You want me to take a copy of the internet with me for easy reference? I only need the manpages, and I already have those on the local system. Why would I bother with getting things from elsewhere?
Don't get me wrong, sometimes googling for an error message is quicker and easier. But it certainly isn't the solve-all that supersedes everything. If it were, then even O'Reilly would've been bust by now.
You don't want to leave a record that can be subpoenaed from Google, showing you looking up the command minutes before it was used, do you?
The best-paid UNIX administrators type very carefully...
> single-handedly declared manpages obsolete and you are to be reading info, comrade backward dinosaur
Have an upvote, sir. I'd happily ram info(1) down someone's throat. Did you know its own documentation is 104KB of HTML, and you can actually order printed copies from the FSF? Comedy gold.
> Why wouldn't you just look up what you needed on your local web browser?
The commands change over time; using man on the system you're actually interested in gives you a good chance of getting the right version of the manual for the command that is actually there.
Using man doesn't suggest a newbie admin to me; I think it quite a reasonable thing to do. What *does* strike me as odd is that the attacker didn't have a near-identical test system on which to run the man command. That would leave our trace with lots of version-number checking, but probably without man lookups.
"Using man doesn't suggest a newbie admin to me"
Nor me. It suggests someone who is being careful not to fuck up a task which could leave the machine inaccessible. I've consulted manpages for commands I consider myself familiar with in situations where a small mistake could cause ruinous consequences.
On the other hand, not knowing that EXT3 leaves traces or that bash by default leaves a history log, those do suggest that they weren't all that. Unless they intentionally left *misleading* traces, that is.
Why not the web browser?
I mean "Why not the web browser *if you want to cover the fact you were using the manpages*", sorry if that wasn't clear.
In normal day-to-day admin of course you'd use the man pages on the system. What they're up to is hardly day-to-day admin, so I'm sure they could go the extra mile and fire up a web browser. It's 2011, and they probably aren't using a Commodore PET in a cave with an acoustic-coupler modem to do all this - so getting access to a browser isn't going to be an issue.
> I mean "Why not the web browser *if you want to cover the fact you were using the manpages*"
Yep. Got that.
And the reasons are as above - it's slower, more effort, leaves more traces, and likely wouldn't have provided the required level of accuracy to get exactly the *right* man pages.
Whoever this guy is, he made a mistake allowing his history to be captured. But it would have been a much bigger mistake to have tried to use a web browser instead of man.
If the breadcrumbs are few and lead to a conclusion about the nature of the entry, it's not beyond the realms of possibility that the trail is deliberate misdirection in order to produce that conclusion. It's quite advanced and security conscious in other respects so basic shell mistakes seem out of place.
Kaspersky researcher Vitaly Kamluk is a n00b
"The reason for this is that Linux constantly reallocates commonly used files to reduce fragmentation."
Must have been released
...or returned from a trip home to see Mum?
He's been back for a while.
Is this suppose to make sense?
Everything amanfromMars says makes sense, to those who are enlightened enough to understand it.
Seek and ye shall find, ask and all is revealed .... well, as much as one needs and is able to know
"Is this suppose to make sense?" ….. Anonymous Coward Posted Thursday 1st December 2011 16:40 GMT
Yes, AC, and it does, and any problem[s] you would have in understanding and using the information shared, and which is at many levels for naturally varying levels of intelligence and/for CyberIntelAIgents, sits between your current position, which is probably a chair, and the computer screen/Global Operating Device displaying this message for wider sharing/broadbandcasting.
However, whenever a SMART Virtual Machine, are any such problem spots easily self actuated to seek to resolve and prevent ignorant difficulties with a tad more research and reading on matters of concern and interest, for rapid development of both the personal and public self ……. for a fuller Being.
Inevitably, and quite naturally so, will increased knowledge supply and greater understanding morph one into a being more than easily able and quite capable of being any number of alternate beings, dependent upon whatever acquired knowledge and intelligence systems it would be using.
Dan Goodin in San Francisco, Venus and her consort Mars and Minerva send their LOVE. Wanna Bit Part in a Great AI Game? El Reg Rules Hosting of course.
Venus always satisfies Mars and Minerva Trips into AIMagical Mystery Turing Terrain
Good behaviour guarantees MILF Seventh Heaven, Peter Simpson 1 ...... and that is a place from which you choose not to be released, nor if you have any great sense, would you wish to be, for its delights are second to none and for those who would service and nurture, develop and refine the environment, are immaculate overwhelming powers with sublime controls, just and fair reward for the exercising, beyond your wildest dreams.
Capiche, Spooksville UK, or are y'all still rooting about in the dark in such matters, and in need of just some simple experienced guidance?
Have we had a change in moderation policy? I don't remember seeing AMFM's witterings removed before...
I'm surprised by this article. Every half-formed opinion on Stuxnet has estimated that the code was developed by some highly organized, well-funded organization, most likely a nation state. If that's true to any degree whatsoever, why would they then throw all that organization and compartmentalization out the window and have the developers deploy the C&C mechanism?
An org like that doesn't let developers actually run its 'production' systems. I wonder why that is?
Even a moment's thought on the subject easily explains the discrepancy, and reinforces the theory of a governmental organization being the source of Stuxnet (should you be one of the tin-foil-hat crew).
Honestly, it's like saying the guys who coded (any) commercial application aren't good at Windows because a log showed that someone using the software clicked on the Help menu.
... mine's the one reaching for the jacket with apples and oranges in the pockets.
It would seem...
...that too many system administrators still do not know what a log server is, and why they should be used in secured environments.
Log servers should be firewalled and totally inaccessible to UNIX admins (and, therefore, hackers) - the security monitoring team should have sole access, via a private network.
That way, you do not have to worry about whether a hacker shreds the disk or manages to erase local logs. There will still be a record - in a place they cannot get access to.
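For instance, with rsyslog the forwarding side is one line and the collection side two (host names and the port below are illustrative; the "monitoring network only" restriction is enforced by the firewall, not by rsyslog itself):

```
# On each monitored host: forward everything to the log host over TCP.
*.*  @@loghost.internal:514

# On the log host: accept TCP syslog.
module(load="imtcp")
input(type="imtcp" port="514")
```

Once records leave the box in real time, shredding the local disk destroys the machine but not the evidence.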
re: It would seem...
you had me up until "secured environment": the systems that were compromised were just random servers strewn across the net, running outdated versions of openssh and such... hardly a secured environment.
If they can't be bothered to keep their main system packages up to date, I'd hardly imagine them setting up the whiz-bangery you describe, or for that matter caring about terms like "secured environment".