Should have used IIS
Then they'd have remained secure.
Hackers penetrated the heavily-fortified servers for Apache.org in a "direct, targeted attack" that captured the passwords of anyone who used the website's bug-tracking service over a three-day span last week. The breach, the second to hit Apache.org in eight months, also exposed a much larger list of passwords belonging to …
"Then they'd have remained secure."
And, just to be clear, use fail2ban. Always.
I wonder just how many third-party components in use across the internet have these same weak password habits?
Assume the worst and you won't be wrong. And if you are wrong that will be OK.
Only the paranoid survive (if the companies they use aren't stupid.)
It takes big cojones to come clean about your security blunders. The vast majority go unreported - how many of *your* passwords have leaked without your knowledge?
The whole notion of site-specific usernames and passwords is a horrible anachronism. I feel resentment every time yet another crappy site asks me to sign up for a useless account.
Apaches "press release" of the incident was nothing but finger pointing. Regardless of the software at fault it had apaches name all over it and if Apache felt the system wasn't secure enough they didn't have to use it.
This is 100% apaches fault and it's a perfect example of how large corporations and organizations can fail to secure data and carry on while smaller companies would probably be shutdown over it.
One can't even begin to put a number on what an attack like this is worth; it depends on what information was successfully stolen.
Apache has said email addresses and passwords may have been stolen. This system isn't only for bug tracking; it's also the communication hub for Apache's infrastructure, where the people who help maintain the Apache project handle issues. A lot of those people control a lot more than a web server, and access to their passwords and email addresses could be a real home run for hackers.
When are these big companies going to learn? And when will small businesses stop being hammered by the rules and regulations that come down after failures like this?
Oh, did I mention it was hosted on Linux? I dunno why Apache felt it necessary to lead with that; not sure what the point was.
"Oh, did I mention it was hosted on Linux? I dunno Apache felt it necessary to start with that, not sure what the point was."
To help lower the smug level round here?
At least if you are using Open Source software, then you *know* the ways it can go wrong, and so can take precautions to limit the damage.
When you put your trust into a proprietary and caged product, it can go wrong in ways you *don't* know.
Actually, I'd say the infra team did own up to their mistakes. Atlassian had a 0-day exploit (unlucky) but reacted fast once notified and were very helpful. The big mistakes on the ASF side were not using fail2ban and some shared passwords, plus SVN's habit of caching passwords. Another weakness was that not all the session cookies were HttpOnly (Tomcat 7 can do this, so it's unfortunate JIRA was still on 6); everyone has learned from that.
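As an aside, the HttpOnly mitigation mentioned above can be switched on at the container level rather than per-application; a minimal sketch of a Tomcat context configuration (the comment placement is illustrative, and whether you set this globally or per-webapp is a deployment choice):

```xml
<!-- conf/context.xml (global) or a webapp's META-INF/context.xml -->
<!-- useHttpOnly marks session cookies HttpOnly so page script can't read them; -->
<!-- the attribute exists from Tomcat 6.0.19 and is on by default in Tomcat 7 -->
<Context useHttpOnly="true">
</Context>
```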
The other issue is that with cloud computing, anyone with a stolen credit card can buy cluster time for brute-force attacks, with an outbound network load that a tier-1 datacentre wouldn't even notice. Hence fail2ban, though it needs log integration with whatever your web app is. Welcome to the future.
-Stevel, Apache member.
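For anyone wanting to follow the fail2ban advice above, a minimal sketch of a jail; the jail name, filter name, log path, and thresholds here are all assumptions, and you would need a matching failregex in filter.d that recognises your web app's failed-login log lines:

```ini
# /etc/fail2ban/jail.local -- hypothetical jail for a JIRA-style web app
[jira-login]
enabled  = true
port     = http,https
filter   = jira-login            ; assumes filter.d/jira-login.conf exists
logpath  = /var/log/jira/access.log
maxretry = 5                     ; ban after 5 failures...
findtime = 600                   ; ...within 10 minutes...
bantime  = 3600                  ; ...for an hour
```

Once the filter is in place, `fail2ban-client reload` picks up the new jail.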