A fired computer engineer for Fannie Mae has been arrested and charged with planting a malicious software script designed to permanently destroy millions of dollars worth of data from all 4,000 servers operated by the mortgage giant. Rajendrasinh Babubahai Makwana, 35, of Virginia, concealed the Unix script on Fannie Mae's main …
Long term plan
Sounds like this was something he had planned for quite some time. A script with that specific a task list isn't really something you can cook up in the few hours between being fired and having your access terminated. I guess his termination wasn't really a surprise to him.
He should never have had the chance to do this.
Once he was told that he was being terminated, he should not have had access to any computer on the network. Somebody from management or Security should have gone to his desk and logged him out of every computer there, and another admin should have been instructed to disable all of his accounts before he left the building.
To me, this doesn't sound like something he did on the spur of the moment. It's too well planned, too well written for that. It sounds more like something he'd done long in advance, so that if and when he was let go, he could get revenge. Even so, it took him time at his computer, logged into the network, to put his plan into action; time he never should have had.
I've been terminated from computer work, once or twice. In all cases, I made sure that the manager knew that I was logged in, and asked him to log me out, so that there'd be no question later of me having done anything malicious.
Err.. and the problem is?
Not sure what the problem is here. He simply planned to save them thousands of man hours by resetting their mortgages to their actual value of $0.00. As far as I can see he's done a crackerjack job of accelerating a process that might have taken days or even weeks under the current economic climate.
Missed it by this much
While this would certainly have caused untold damage to a great many people and a metric ton of work, I gotta admire the guy's work: if it weren't for a few careless bits he would have single-handedly brought Fannie Mae to its knees. Of course he won't be able to work his trade anywhere for many years (if ever again), and rightly so. But an excellent attempt, so ten points for style and minus three million for poor planning and lack of common sense. I'm sure he'll be OK after his many years in prison, though; after all, I hear various criminal organizations are always looking for talented IT people with a penchant for malicious activities.
Lessons to learn
So, what lessons can we learn from this event?
1. Get yourself a SECURE way of access, without leaving traces. Difficult, but very possible with full root access.
2. Make sure you don't depend on one single script. Avoid this single point of failure. Spread the payload across several scripts.
3. Take your time. Three months is too short and the link with a sacked sysadmin is easily made.
4. The damage is still recoverable from backups. First the backup procedure has to be corrupted for a considerable amount of time.
5. Afterwards, securely delete and overwrite all scripts and restore corrupted configuration files.
For the rest, excellent hacking, buddy! ;-)
Obviously he should have encrypted all the files and demanded a ransom :)
Lessons to learn
"Get yourself a SECURE way of access, without leaving traces. Difficult but very possible with full root access."
With *root* access? Trivial, surely? Just set up a cron job several months in advance that will create a fresh account long after you are gone. While you remain in employment, kill the job before it runs and reschedule it every couple of months or so. Then when you are sacked, the company deletes your account and you are clean as a whistle.
Oh, and the above is posted as a warning to the good guys, rather than as a suggestion for the bad guys.
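And for the good guys, the countermeasure is routine auditing of scheduled jobs. A minimal sketch of what such a sweep would flag (the crontab content below is entirely invented for illustration):

```shell
# Audit sketch: scan crontab entries for jobs that touch account
# databases or scheduling tools themselves -- the pattern the trick
# above relies on. The sample crontab content is invented.
sample_crontab='
30 2 * * * /usr/local/bin/rotate-logs.sh
15 3 1 */2 * /usr/sbin/useradd -o -u 0 backupsvc
0 4 * * 0 /usr/local/bin/report-disk-usage.sh
'

# Anything that adds accounts, edits passwd/shadow, or reschedules
# other jobs deserves a human look.
suspicious=$(printf '%s\n' "$sample_crontab" | grep -E 'useradd|passwd|shadow|crontab' || true)
printf '%s\n' "$suspicious"
```

In a real sweep you would iterate over every user's crontab and the atd queue, not a sample string.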
He could teach malware authors a thing or two
His method of hiding his "malicious code" is awe-inspiring... You would have thought he would have used his root access to actually hide the payload properly. Deserved to get caught.
@ IT Drones
Yes, I know you were promised some tasty dog treats for being a good little lemming, working long, hard hours as a 'team player' and helping increase the company's market value, thus fattening the CEO's and his team's bonus pay. But in spite of all this, where does it state in your contract that this will GUARANTEE you long-term employment?
Unless stated otherwise as a contractor you are a disposable employee. Out the door with you when it's time to trim the staff to offset losses, especially when you are rubbing someone in the upper ranks the wrong way.
As a contractor, every day you are allowed to work is borrowed time, and that is exactly the demeanor you should bring to work every day. I could have done the same when I was given hours' notice of my termination, but for what? To taint my reputation and ensure that NO EMPLOYER would ever want anything to do with me, in addition to years in the clink?
Are you one of those losers who goes and menaces an ex-girlfriend/wife after you get dumped?
Who is watching the watchers? It's amazing how much power an admin has, especially since they are all too often the only person who actually knows what's going on...
Coat, because I'm leaving the Reg, but I just have some XSS to... never mind </sarcasm>
How do they _know_ (read: prove) he did it?
Anyone with 'root' could have left 'evidence' it was this poor bastard. I'd like to see the 'proof', myself.
Mine's the bright orange one with "MCSO" across the back...
fired for cause, per Info Week
Pretty minimal-sounding cause, but the IW writeup says he was canned for writing a script that changed server settings without approval from his "supervisor."
If *that's* why you're canning a high level 'nix guy, you do not leave him unattended after the termination hearing.
Jaysus, no wonder these people are tanking.
"even in the high-stress world of IT administration, where sabotage by disgruntled employees is common"
Funny - I've never seen it in 25-odd years of IT administration. Where is the evidence for this statement?
I don't see the relevance of mentioning he was a contract employee, other than to dis contractors. It's pretty irrelevant whether he was or not.
I wonder what......
else he had done while he was working. He obviously had it in him to do much more, and maybe he did.
I always wonder how many of the admins etc. go into the job so they will have access to sensitive data such as credit card details. I BET 100% this guy fed personal details and maybe even CC details to others to use. I hope the FBI do a thorough check to see if any links exist. Maybe his destruction was pre-planned to hide incriminating links to previous activity rather than pure revenge.
Even 20-odd years ago, when working for an American bank, we had separate live and development networks. There was no physical link between the two, and access to the live network was heavily restricted. It was not possible to copy files onto the live system other than via a removable disk controlled by operations.
To get any files onto the live system required change control, where all code was compared by another team. The last snapshot was compared to the requested release. All file differences had to be reviewed and signed off by someone other than the developer. The last snapshot was also compared to the live system, so any manual edits made on the live system would be found too.
It would therefore be likely that this would have been detected quite quickly.
The only way would have been to edit the live files (which would have triggered security alarms on the files, as would changing any permissions) and to change the snapshot and development systems in exactly the same manner. One person would have found this difficult, as they would not have access to all three parts; in fact, any one person only had access to one part.
It was a pain in the backside sometimes, but like all security systems it had a purpose, and it taught me a lot about change control, which I went on to use at many companies afterwards.
Sounds like poor control and security measures.
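The snapshot-versus-release comparison described above amounts to a recursive diff. A minimal sketch, with throwaway directories standing in for the real snapshot and release trees:

```shell
# Sketch of the snapshot-vs-release comparison described above, with
# throwaway directories standing in for the real snapshot and release
# trees. Every reported difference must be reviewed and signed off.
workdir=$(mktemp -d)
mkdir -p "$workdir/snapshot" "$workdir/release"

printf 'rate=0.05\n' > "$workdir/snapshot/config"
printf 'rate=0.05\n' > "$workdir/release/config"         # unchanged: no diff
printf 'echo hello\n' > "$workdir/release/new_script.sh" # an added file

diffs=$(diff -rq "$workdir/snapshot" "$workdir/release" || true)
printf '%s\n' "$diffs"

rm -rf "$workdir"
```

The added file is reported; the unchanged one is not, which is what lets a second team review only the deltas.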
the guy's a pussy
a real man would hack the compiler - http://cm.bell-labs.com/who/ken/trust.html
...how the hell was he able to have root access to ALL the machines in the organization? Who on earth allows any single employee that sort of power?
Of course, he's got to go down. But the senior security management people at Fannie Mae should also be sacked.
I'd have used a screen-scrape trigger!
His main failing was requiring internal access to fire this script. If he'd used a screen-scraper script, running from another admin account, to watch, say, Twitter for a message from a throwaway account before firing his payload, he could have triggered it from outside.
Bearing in mind the obvious genius of this chap, why did he make these easy mistakes? Perhaps he upset someone and this is a frame-up? There are always two sides to every story, and I'd love to hear his side.
Yes, where is all the sabotage?
I've never seen an example of sabotage either, except for the inadvertent kind, in the 11 years I've been in IT. Mind you, I've never worked in the US.
The guy's a pussy
Surely a fanny?!
Paris because I'm sure she has a Fannie procurement department.
"where sabotage by..."
"where sabotage by disgruntled employees is common"
BS - I have been in IT for 35 years and never seen it. I have never heard about a local case from anybody.
There is a difference between perception and reality, methinks. Actually competent BOFHs, DBAs and other folk with responsible jobs tend to be honest. In fact, I believe honesty is inextricably linked with competency. The corollary is also true.
Obvious flaw is obvious
So, you give a contractor the keys to the kingdom. Every fscking key to every fscking one of your 4000 servers. What could possibly go wrong?
Funny how organizations which give proper jobs with proper salaries to their admins don't usually have this kind of problem...
Anyway, as stated by other(s), the endangered data are probably worth less than their weight in bog paper (given the actual weight of data, that's not much).
Also, doing it from the company-issued laptop, from a company-issued IP, is stupid. And instead of doing the work in the script, he should probably have added a single line calling a Perl or C routine named "correction_for_annual_inflation" (or, why not, "temp_backup_in_case_something_goes_bad") or some such. Much easier to obfuscate than a UNIX script. Doing it on the company's laptop suggests he was in a hurry and probably didn't premeditate it.

I believe that any semi-skilled UNIX BOFH is able to come up with a script like that, able to propagate to all the servers under his jurisdiction (if not, fire them). Given his access level, doing that kind of stuff (with legitimate scripts) was probably part of his duties, so he probably had all the parts at hand; he just had to assemble them and add a "des-troy all hu-man" payload (not terribly hard to do!). You don't give a super-ultraviolet clearance like that to an admin if he isn't able/allowed to roll out company-wide emergency patches.

The thing is, when you do so, do it with regular employees, not contractors. Pay them decently, and if you wish to fire them, arrange the suppression of their privileges *before* telling them. Ideally, the admin should come to see management saying "We have a problem, my account has been compromised, I can't log in", to which the correct answer is: "Well, maybe it's 'cause you've just been fired, biatch!"
Of course, the typical BOFH will have arranged something, like planting the malicious script on his second day in and having it "validated" by management. As easy as saying: "Well, there is this script, you know, it was added in 1987* and it doesn't seem to do anything useful, should I remove it?" To which the typical boss will *always* answer "No way, it could be something important, don't touch it" (especially if it's your second day in). AT THIS POINT, KEEP A PROOF. But not too obvious: asking by e-mail from an independent account like Yahoo Mail is a good strategy. Then you have to fire a neutralizing prog (which needs a passphrase) each month to keep the malicious script dormant (bothersome, I know, but job security has a price), and when you're fired, well, no-one knows that the neutralizing script needs to be run (and no-one has the passphrase anyway), so hell's unleashed.
Draft malicious/neutralizing coupled scripts available. Send two stamps and don't tell my boss ;-)
* If you have complete access to everything, you probably know how to make it look like it was written in 1987 and has been untouched since. If not, your boss hired the wrong person.
The first rule of project mayhem
is that you don't ask questions.
@"where sabotage by disgruntled employees is common"
Brent Weaver and others,
No doubt, the overwhelming majority of IT admins are honest, hard-working and law-abiding. But the fact remains that The Reg reports these types of stories with a fair amount of regularity. A small smattering includes:
@rens groenewegen & AC (Thursday 29th January 2009 20:38)
"the stunning part here is not the ability of planting that script (which probably took several days to make)"
Given the fact that he had root access to all boxes, creating a script that does something like that takes only an hour maybe two to write if the sysadmin is worth his salt. Making sure you properly clean your tracks does take quite a while longer, but obviously he didn't do that.
"Nor is it amazing that he could access one system at the time with root access; that is fairly standard practice although it is bordering on criminality if that is a "always on" privilege without a security dept having to turn it on by request. (one by one, that is....)"
I've been working as a contractor for more than 10 years now and have yet to find a 'security' department where people actually *know* what they're doing (running nmap to scan for open ports from time to time, doesn't make you a security expert) and that's when they actually *have* a security department.
@Nate "How do they _know_ (read: prove) he did it? "
"Anyone with 'root' could have left 'evidence' it was this poor bastard. I'd like to see the 'proof', myself."
You're right, if there was no security dept with separate loghost servers and separate access to those loghost servers by a security officer only, that evidence could have been planted by anyone.
If access to loghost and 'regular' servers is shared (not having a separate security officer saves money), then it's still uncertain if the logs haven't been compromised as well.
Ultimately stupid to do such a thing though. As a contractor, your integrity is what keeps you working.
Cronjobs? Dead-mans switches? Tsk!
The proper way to do this is to plant the bomb in the backups, then at the appointed time trigger some failure that's indistinguishable from a hardware fault (e.g. accidental use of the wrong type of extinguisher in a minor server-room fire; read some BOFH). Cue recovery from backups, and the path to delivery of the payload.
Evil Bill icon, for obvious reasons.
I've been working with computers for most of 37 years. I've seen three instances of sabotage that I can recall offhand. The first, in the mid '80s, happened when a person was allowed unsupervised access to a PC "to delete some personal files" and also deleted a number of work-related files. The second, in the same place, occurred the same way: the person was again given unsupervised access, and used FDISK to remove the partition on the hard drive. The third was more recent, when someone gained access to a no-longer-used domain admin account and deleted the account of a senior member of management.
The first two were outright braindeadness by the department manager. He was an idiot who'd fall for any sob story, to the extent of covering up for a thieving drug addict despite repeated complaints.
The third was an intrusion, the perp was never identified (we had an IP address for a DSL line, but the local telco would not provide information without a subpoena, which the company did not want to get for fear of adverse publicity). Final resolution was to delete the account and tighten up procedures.
At the same company as the third incident, I was on the corporate IT staff. I had an extended struggle getting the HR department to give us timely notification of incoming or departing employees, much less when there was an involuntary termination. On two occasions, I found ex-employees still logging in remotely and exchanging email with customers--and the head of HR didn't see this as a problem! She was clueless about computers, and so territorial, she wouldn't allow us in the Admin area unescorted--even though we could remotely access every system they had.
So, it does happen, but not often. Incidents are frequently hushed up by companies for fear of adverse publicity, which is why many of you may not have heard of them.
Dumb & Dumber
First off - the chap is a lunatic: getting a job ever again will be well-nigh impossible unless he changes his name.
Second - I've seen plenty of root-to-root passwordless ssh login configurations. Even without it, "expect" can do much of your dirty work for you.
Third - Operations should not be doing security. Security should report to the board, and it should be auditing what is going on. Yes, that means lots of personnel who don't directly contribute to the bottom line. That includes looking at all root cronjobs and atd jobs, and anyone's cronjob/atd entry which references privileged accounts. Yes, they will have to look at all the scripts and binaries that run, too. Running a database? You'll have to audit all the triggers, in case a public-facing website is just waiting for someone to log in as "fred bloggs" before it starts randomly crediting accounts. It's expensive and tedious work.
If there are reasons to do root-to-root passwordless logins, restrict it to a designated admin host which uses two-factor authentication for all user accounts. That stops random workstations from piping nasty scripts to the admin-host's accounts in a non-interactive manner.
Did I mention that you'll need a decent security policy? You'll also have to stick to it. No caving to project managers who just want to open that firewall between the production and corporate LANs because it's a cheap way of doing things. Do it properly the first time.
Fourth - backups should be copied to off-line or read-only media.
Fifth - Succession planning - standard procedures which anticipate the loss/firing of everyone. Do not pass go, do not return to your workstation. An extended audit of all systems you had access to.
Of course, I'm rather suspicious of "oh, poor me - it's someone else's fault that I've been done in" stories from organisations who've been caught being very naughty.
Paris - she got caught and made a lot of money from it...
Time for a wake up call
I'm a professional sysadmin and I make a point of talking about the power inherent in the role with employers.
With very very few exceptions, a sysadmin can access any information on a system they have root access on (usually every system in the company) and can do so without detection. I strongly encourage professionalism in system administration and recommend membership in a relevant organisation (http://sage.org , http://sage-au.org.au , etc).
It's time more companies started looking for professional sysadmins who take their responsibilities and profession seriously. The more reliant our society becomes on computers the more important this becomes.
Would anybody have noticed?
I mean seriously, we are talking about _banks_, not hosting providers. If a bank's computer goes down, probably nobody will ever notice. If banks actually used computers, money transfers via banks would be as quick, cheap and reliable as sending cash via mail.
"So, you give a contractor the keys to the kingdom. Every fscking key to every fscking one of your 4000 servers. What could possibly go wrong?
Funny how organizations which give proper jobs with proper salaries to their admins don't usually have this kind of problems...?"
Pierre, I think you are jealous. Knock that chip off your shoulder, get a life or get a contract. Being a contractor has nothing to do with this; he'd been there for 3 years, for a start. He was dishonest, end of. And if you ever hire anyone on the assumption that just because you have given them a 'proper job' on a 'proper salary' they will be honest, good luck to you.
Why not? Slaves are Stupid!
So, you give a contractor the keys to the kingdom. Every fscking key to every fscking one of your 4000 servers. What could possibly go wrong?
Back when I was going through university I earned extra money by cleaning offices.
The thing about cleaning is that cleaners are considered stupid, unclean people who should not be seen when the important people are around. So they will give you the keys to the entire enterprise and all the codes to the alarms, and order you to work outside normal hours when no one is there... so they do not have to look at real work during work hours ;-)
So you have a combo of bright young people, lots of time, and no supervision, and the passwords can almost always be found within office-chair-rolling distance... The Boss often has a high level of access(!!).
In other words, the industrial spy will be the janitor the temp bureau just sent you (and nobody will ever know)!
Why did the perp try to erase data, when silently corrupting data in the account databases works so much better - especially when the Tax people notice the differences?!
Professional Incompetence IMO
First of all, with root access the competent BOFH owns the system; there shall be no evidence other than the evidence planted!!
Second, one does not simply erase data. It is too easy, and the collateral damage is limited!!
Instead, the evil BOFH *corrupts*: e.g. sticks a database driver in front of the financial database that gradually adjusts the account entries while caching the adjustment factors in volatile storage.
The cache ensures that everyone gets the correct values until well after the BOFH has left and the driver is updated by the good BOFH (after which the IRS and the SEC are anonymously tipped off about accounting irregularities).
'Do not pass go, do not return to your workstation'
Almost everyone here says that fired admins should be escorted out and not return to their workstations.
Here in Holland we have at least 2 weeks (but more often 2 months) notice.
It's impossible to terminate someone and kick them out the same day.
So this would give our BOFHs plenty of time to cook up a very nice script..
Future computer security
If you want to *really* be frightened by what problems will exist for computer security people in the future, read "A Deepness in the Sky" by Vernor Vinge. Imagine backdoors built into systems by their designers, that can be accessed by microscopic nodes in a pervasive wireless network that you can control from anywhere without any visible input...
>That includes looking at all root cronjobs, atd and anyone's cronjob/atd which references
>privileged accounts. Yes, they will have to look at all the scripts and binaries that run too
I think there's lots of good ideas here, reinforcing many lessons learned.
Day-to-day review of those files should be easy with a system that looks for executable files and known dependencies (such as files read by the scripts for variables), then generates a hash. If hashes change, trip an alert to Security to review the issue. I'm thinking a Unix system with root locked down by Security, plus a well-configured sudoers, would keep files from being hidden.
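The hash-baseline idea can be sketched in a few lines of shell (paths and file names invented for illustration):

```shell
# Minimal sketch of the hash-baseline idea: record checksums once,
# re-run later, and alert on anything that changed. Paths invented.
scripts=$(mktemp -d)
printf 'echo nightly job\n' > "$scripts/job.sh"

baseline=$(cd "$scripts" && sha256sum ./*.sh)

# ...time passes; someone quietly edits the script...
printf 'echo nightly job  # plus something unapproved\n' > "$scripts/job.sh"

current=$(cd "$scripts" && sha256sum ./*.sh)
if [ "$baseline" = "$current" ]; then
    alert=none
else
    alert="hash mismatch: review required"
fi
printf '%s\n' "$alert"
rm -rf "$scripts"
```

A production version would sign and store the baseline off-host, precisely so a root-level attacker can't regenerate it to match the tampered files.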
There are always vulnerabilities - Security could be the disgruntled employees (I say employees... I would imagine that if you're that worried about a sysadmin having root rights, it should also take TWO people in Security to grant authorization). I would think if I was a sysadmin trying to be as discreet as possible, I'd hide the logic bomb in a SQL database: have some fairly obscure script that just runs, looks up something in the DB, does its thing... until it sucks in some malicious code, similar to a SQL injection attack on web servers. With an ever-changing SQL database, you wouldn't have it flagged merely by a hash. Someone leaves, Security is likely to see no changes to cron, executables, or included files, and think all is well if they don't remember that nightly ETL script...
For remote access, this is a reminder of the importance of two-factor authentication. Have all the servers set up to only accept SSH from internal addresses, to force sysadmins to come in through the VPN. Since sysadmins often know users' passwords (and have access to password files to run rainbow attacks on them), simply disabling their personal access isn't very secure. But even if they know someone's password, without the token to get through the VPN authentication they're still off the network.
Yes, disable access right away. I have had times before, with a co-worker being suspended (which turned into a termination because they took it less than gracefully), where I was directed to secure their company laptop when he was called into the boss's office. Since the most likely place to store the code is on your office PC, to cut-n-paste when the time comes to plant it, at the very least secure the input devices, if you don't outright remove the computers from their cube while they're in being given the bad news.
Getting a job again, and 'sage'
@P. Lee: When I was a contractor, none of my employers (a couple of which were institutions comparable to the one in this story) ever had enough information about me to do a background check, and nor did they show any interest in doing so. I wouldn't be surprised to see this guy back doing sysadmin work somewhere after he's released. Unfortunately for him, he got caught in the US, so he's likely to be 896 years old when he gets out.
@Robert Brockway: The problem is that none of these professional organisations are widely recognised. I've been a sysadmin for 14 years and I've never come across anyone who's a member of 'sage'. It's only now I work in HE that I know a couple of people in the BCS -- and it really is only a couple.
Surely you could just have a scheduled task to, once a month, run a "stop the malicious script" job. Then when your account's terminated, there's no more "stop the malicious script" job.
Declare it as something to do with the backup procedures (helping prevent damage from transient magnetic displacement on the hard drive platters); the worst-case scenario is someone remembers it and copies it to their own profile so it doesn't get deleted, leading to a virtual "disabling" of the payload, but also shifting all blame away from you if they ever decide to delete it.
Alternatively, find a script that, say, defragments the hard disks. Something that will genuinely run and that no-one will notice. And modify that to have the malicious payload run a few months after your job is terminated.
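The dead-man's-switch logic described here can be sketched harmlessly: a keep-alive marker that the monthly "stop" task would refresh, and a check that fires only once the marker goes stale. The payload below is just a string, and all names are invented:

```shell
# Harmless sketch of the dead-man's-switch logic: the monthly "stop"
# task would refresh a keep-alive marker; the payload only fires once
# the marker goes stale. The payload here is just a string.
marker=$(mktemp)
touch -t 202001010000 "$marker"   # simulate a reset job that stopped running

ref=$(mktemp)                     # reference timestamp: "now"
if [ "$marker" -nt "$ref" ]; then
    result="marker fresh: staying dormant"
else
    result="marker stale: payload would fire here"
fi
printf '%s\n' "$result"
rm -f "$marker" "$ref"
```

Which is exactly why the auditors upthread want every scheduled job, however innocuous-looking, reviewed by someone other than its author.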
what a turd
What is it with people in the IT world who seem to think they should be so untouchable that they attempt things like this? If a sacked car mechanic went and sabotaged the brakes on his former company's fleet of vehicles as revenge for having to be answerable to his employer, we'd all think the guy was a twat. Why should this prick be any different - because he seemingly possesses power way beyond his physical being? All he is doing is reinforcing the stereotype of the geeky, no-mates sysadmin who has problems with society generally but has a tendency to take it out on employers and women (probably).
What's the fuss?
"to permanently destroy millions of dollars worth of data from all 4,000 servers operated by the mortgage giant."
Sounds like a normal MS Patch Tuesday to me......
Sufficient advanced incompetence is indistinguishable from malice
I am always asked about "Project Mayhem" scenarios by my friends, and I always reply that damaging something is easy; getting away with it is a lot harder. Also, total-loss situations are anticipated and can be retrieved by resets and restores, and possibly reconfigs.
I work for the finance division of a multinational (manufacturing) company, and a few years ago our French division came within 24 hours of being shut down from trading by the government because their books did not balance (by a very large margin). We were re-running periodic processing time after time to try to get the financial disparity down to an acceptable level. They did balance it in the end, by getting it mostly right and writing off the remainder as a loss. We thought at the time "stupid developers!", but now I am not so sure.
If we had lost all of the data, we could just have recovered it. But because some data could not be trusted, all data (in this case) was suspect. To verify each transaction against the hardcopy or audit trail would be time-prohibitive, and cause a significant loss of "confidence" in the integrity of the data. As long as the totals work out OK, you can take from one, give to another, and mess it up big time.
You can also easily make system-based calls to the application (despite what SOX dictates), or interrogate the application/database tables for userids and passwords, gaining credentials to spoof. (I have done this within SAP as an intellectual exercise several times: creating credentials, changing postings, modifying client config from the OS.)
It is easy to wallop the service as a SysAdmin, but to do some real damage, become an Application Developer!
*** DISCLAIMER – this is a hypothetical discussion and should in no way be seen as encouraging damaging data integrity ***
Removal of root access
Even if root access is removed immediately, there's still the risk that the admin has left a "dead man's handle" somewhere that requires him to tell it weekly that he hasn't been fired, or it'll go off... I guess the solution is to limit what admins have access to, and try not to piss them off when you fire them.
Am I the only one who is impressed with the speed with which he executed this revenge?
great lengths to hide his tracks?
I hope that was sarcastic! One page of whitespace and starting your script name with a period is barely an acceptable level of hubris; more like hiding in plain sight.
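Indeed, a dot-prefixed name hides only from a casual ls. A quick sketch (paths and names invented) of how trivially find exposes it:

```shell
# A dot-prefixed script hides only from a casual ls; find exposes it
# immediately. Paths and names invented for illustration.
dir=$(mktemp -d)
printf 'echo payload\n' > "$dir/.hidden_job.sh"
printf 'echo normal\n'  > "$dir/visible.sh"

hidden=$(find "$dir" -name '.*' -type f)
printf '%s\n' "$hidden"
rm -rf "$dir"
```

Any competent forensics pass sweeps for exactly this sort of thing, which is presumably how he got caught.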
@P.Lee, @Pierre, @AC (Bad Management)
P. Lee - Well, I disagree with your contention that this dude will never work again. Many organisations are very lax about contacting references, or even checking whether a person has worked where they say they have, doing what they say they have. Especially for contractors.
Add to this that today's news is tomorrow's fish-and-chip paper; when the guy gets out of the nick he may not have the problems you think he will.
Gutting but true.
@ Pierre - yes, I agree that a simple, single line calling a compiled programme (so not Perl) would be the better way to do it. It's the needle in the haystack (Only one line of code) and a seemingly discreet, small programme waiting to do its thing.
@AC - you might have the world's greatest change control process, but if you have sysadmins with root access to servers in production (which you have to; no way around it), there's your security hole right there. They don't need to copy files over for release; they can make 'em. No disk needed, just vi and a keyboard.
I know one big Government account.....
.....that used to (and probably still does) give call centre staff admin rights to almost every server in the department.
Now, think of your taxes, National insurance, benefits, claims.....scary!
Oh, and the admin rights also meant FULL access to the AD domain controller - LOL.
Read only media
@ P. Lee
"Fourth - backups should be copied to off-line or read-only media."
If it's read-only media, how do you write to it?
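The usual answer is that you write the backup once and then flip it read-only (or use genuinely write-once media). A sketch using file modes as a stand-in; note that root can override plain modes, which is exactly why truly offline copies matter:

```shell
# Write the backup once, then drop write permission. File modes stand
# in for genuinely read-only (WORM or offline) media; root can still
# override plain modes, which is exactly why offline copies matter.
backup=$(mktemp)
printf 'important records\n' > "$backup"
chmod a-w "$backup"

perms=$(ls -l "$backup" | awk '{print $1}')
printf '%s\n' "$perms"
rm -f "$backup"
```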
"substantial damage on an entire economy"? I think not
You quote the estimated value of the disruption as "millions of dollars of damage" and a potential week-long shutdown. I know the US economy isn't what it once was, but to describe a few million as "substantial damage [wreaked] on an entire economy" is hysterical. It's hardly going to make the sky fall.
Next week: Kevin Mitnick can whistle down a phone and set off nuclear bombs. EVERYBODY PANIC!!!1!
All your server are belong to us....
This just shows sloppy security and change control practices!