Attempting to prevent malware from infecting computers is an important duty of a systems administrator. If you are trying to secure systems, then anti-malware applications and restrictions on the use of vulnerable third-party applications and browser extensions are all important. But attempting to prevent – or at least contain – …
Best practice?
I was reading that it is better to route to 0.0.0.0 than 127.0.0.1 – something to do with 0.0.0.0 failing immediately rather than waiting for a timeout. It probably makes little difference in practice, but I am curious to know the "right" way :]
Re: 0.0.0.0 versus 127.0.0.1
I get the impression that this may well make a significant difference if you're running your own locally-hosted server or, based on some anecdote I've seen on various forums, if you've got a weird OS. But at least according to MVPS, any difference is a myth:
...and despite some considerable Googling on my part, I haven't found anyone who has conducted any sort of proper test, only anecdotes. On the other hand, if you have an absolutely gigantic hosts file, using 0.0.0.0 does technically reduce its size a bit.
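For reference, the two styles being debated look like this in a hosts file (the domains below are placeholders):

```
# Both entries "blackhole" the name; the claimed advantage of 0.0.0.0 is
# that it is a non-routable address, so a connection attempt can fail at
# once instead of poking whatever happens to be listening on localhost.
127.0.0.1  ads.example.invalid
0.0.0.0    tracker.example.invalid
```

Whether that difference is measurable in practice is, as noted above, disputed.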
To those of us who do not run our own DNS servers
There are better options than hosts files for those who aren't running their own DNS servers. Next article...
AdblockPlus / Blocking malicious sites with Adblock Plus
AdBlock Plus addon for Firefox:
Blocking malicious sites with Adblock Plus
"... another layer of protection..."
It was brought up in a recent article of mine. You should read the past four, as they are all about the various layers of protection against web-based threats.
Already doing this...
I've been doing this for years with www.dnsredirector.com but today there are other solutions like Google's DNS or OpenDNS
There is no One True Solution. It is all of it merely another part of proper Defence In Depth.
Thinks 'OK, I'll have a look at their service, see if it's better than what I have'.
Goes to http://www.malwaredomains.com/
Gets a page that's blank apart from an 'unable to connect to database' error.
Well, that's a great first impression for an enterprise-level service provider....
No problem here....
Good article, thanks for that.
IPCop + URL Filter + Adv Proxy
IPCop + URL Filter + Adv Proxy
This setup handles it.
IPCop | Services | URL Filter | Custom blacklist | (Remove all the crap except first column) paste in list | Save and Restart
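The "remove all the crap except the first column" step can be scripted. A minimal sketch in Python, assuming the list arrives in hosts-file format ("127.0.0.1 domain") and that IPCop's custom blacklist box wants bare domains one per line; `extract_domains` is my own helper name:

```python
def extract_domains(lines):
    """Keep only the domain column; drop comments and blank lines."""
    domains = []
    for line in lines:
        line = line.split("#", 1)[0].strip()  # strip trailing comments
        fields = line.split()
        if len(fields) >= 2:                  # "IP domain" rows
            domains.append(fields[1])
    return domains
```

Feed it the downloaded list, paste the output into the custom blacklist box, save and restart as above.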
URL Filter other things...
The URL Filter also offers a long list of selectable blacklist categories: ads, adult, adv, aggressive, agressif, alcohol, audio-video, automobile/bikes, automobile/boats, automobile/cars, automobile/planes, chat, cleaning, costtraps, dangerous_material, dating, downloads, drogue, drugs, dynamic, education/schools, finance/banking, finance/insurance, finance/moneylending, finance/other, finance/realestate, fortunetelling, forum, forums, gamble, gambling, games, government, hacking, hobby/cooking, hobby/games, hobby/games-misc, hobby/games-online, hobby/gardening, hobby/pets, homestyle, hospitals, imagehosting, isp, jobsearch, leo, library, liste_bu, mail, military, mixed_adult, mobile-phone, models, movies, music, news, phishing, podcasts, politics, porn, proxy, publicite, radio, radiotv, reaffected, recreation/humor, recreation/martialarts, recreation/restaurants, recreation/sports, recreation/travel, recreation/wellness, redirector, religion, remotecontrol, ringtones, science/astronomy, science/chemistry, searchengines, sex/lingerie, sexual_education, shopping, socialnet, spyware, strict_redirector, strong_redirector, suspect, tracker, tricheur, updatesites, violence, warez, weapons, webmail, webphone, webradio, webtv.
Custom blacklist and whitelist
Blocked domains and blocked URLs (one per line), plus allowed domains and allowed URLs (one per line).
Custom expression list
Blocked expressions, written as regular expressions.
File extension blocking
Block executable files, audio/video files, and compressed archive files.
Local file redirection
Enable local file redirection.
Network based access control
Unfiltered IP addresses and banned IP addresses (one per line).
Time based access control
Block page settings
Show the category, URL and/or IP on the block page; redirect to a URL instead; up to three custom message lines; use a "DNS Error" to block URLs; or enable a background image by uploading a .jpg for the block page.
Other options
Enable expression lists, logging (with username, optionally split by category), SafeSearch, blocking "ads" with an empty window, blocking sites accessed by their IP address, blocking all URLs not explicitly allowed, the number of filter processes, and a custom whitelist for banned clients.
URL filter maintenance
To install an updated blacklist, upload the .tar.gz file. The new blacklist is automatically compiled to prebuilt databases; depending on the size of the blacklist, this may take several minutes, so wait for the task to finish before restarting the URL filter. You can also create and edit your own blacklist files.
Backup and restore
Back up the URL filter settings, optionally including the complete blacklist; to restore a previously saved configuration, upload the .tar.gz backup file.
Um... malware in the hosts file?
Mind if I ask why the hell any interactive application has administrator rights in the first place?
@The Original Steve
Stupid computers running Windows 2000 that can't be upgraded, on which nearly everything must run as Administrator. There was an article from me about it a ways back, as well as much discussion and debate in the comments. I've since taken further precautions, but let's be honest here: how many folks (especially at home) do you know who not only run as administrator, but click "yes" every time the "would you like to run this app" box comes up?
I agree that in an even halfway-well-run and up-to-date corporate network it’s not a practical threat…but not everyone gets to work in those environments. So many networks I know are band-aids on top of band-aids on top of other band-aids held together with tape.
Still, as people move away from the 2000/XP era into a world where running as a limited user becomes more common and practical, DNS blackholing becomes a more valid defence.
Pint because it's Friday.
Well I'd considered this before but
Baulked at the idea of getting a list of bad domains onto my ISA server with the budget available (i.e. none). However, I found a program named on the site that does it, and our network is now getting a bit more secure.
I like simple, clear advice like these articles – IT management is not my main job.
diddling hosts - I don't get it
I've little knowledge of Windows admin, but the hosts file should by default be read-only – absolutely non-writeable – for users, for exactly this reason. I just checked, and it is so on my Win2K8 (real machine), Win2K (a VM, not that that makes any difference) and Mint Linux (also a VM).
Unless you + users are running as admin/root, altering hosts shouldn't be possible. So what's happening??
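The same check can be done from a script. A minimal POSIX sketch in Python; `is_world_writable` is my own helper name:

```python
import os
import stat

def is_world_writable(path: str) -> bool:
    """True if 'others' have write permission on the file."""
    return bool(os.stat(path).st_mode & stat.S_IWOTH)

# On a sanely configured box, is_world_writable("/etc/hosts") should
# come back False: non-admin users can read the file but not alter it.
```

Of course, malware running *as* the admin user sails straight past this, which is the scenario being discussed above.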
This is news?
Wow, we've been doing this for years... Our firewall gets a package sent about once a week updating both known safe and known unsafe domains, and we outright block the unsafe and limit access to unknown (not safe). We also add to the white and black lists regularly, and choose filters based on OU.
Large hosts file = performance suck
Web browsing slows to a crawl, so I wouldn't recommend that particular cheap-and-cheerful technique.
I read about similar practices years ago and even tried it for a while. It was high maintenance, and unconvincing.
Personally I'm not fond of a large hosts file – I would prefer it to be empty – for performance, maintenance and security reasons. Sadly there are two mandatory applications in our org that require entries in hosts files on all clients. The programmers are assholes about it to boot, so no change is forthcoming yet.
I prefer to block before it enters the network with Untangle.com and OpenDNS.com combined.
For some systems it can be – e.g. Win2K – and I had this problem myself with a half-megabyte hosts file. It totally killed browsing. The solution is to turn off DNS caching – see <http://www.mvps.org/winhelp2002/hosts.htm> – and then it's fine.
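For anyone hitting the same wall: the service behind the Windows DNS cache is called Dnscache ("DNS Client"). On later Windows versions it can be stopped and disabled from an elevated prompt with something like the following (exact commands vary by version; on Win2K it's easiest via the Services control panel):

```
net stop Dnscache
sc config Dnscache start= disabled
```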
Better to block the IP address rather than the name
The problem with blackholing DNS is that many cyber-crooks know about it, and they therefore change the domain/subdomain they use frequently. Thus if you just block certain domains – even if you update the domains from malwaredomains.com frequently – you will fail to block the malware for long. A far better approach is to block the IP addresses of the malware-serving hosts, because the crooks typically keep using the same host with the same IP address; they just change or add DNS records pointing at it.
As we mentioned on our blog (er, yes, this is a commercial plug) a few months back – http://threatstop.wordpress.com/2010/05/10/iframe-droppers-and-other-drive-bys-how-threatstop-protects-you/ – we provide our subscribers with frequently updated lists of known bad IP addresses that can be quickly and automatically plugged into the firewall, and which block many malware sources. I'd love to say we block them all, but then you'd know I was a lying marketing droid. I believe we stop most of them, though since the crooks unaccountably refuse to give us a list of compromised hosts to check against, I can't prove it.
MichaelC above would certainly benefit from our system, since stats we have analyzed from DShield indicate that about a third of all threat sources change within a week (and about a quarter in less than 24 hours). Thus by uploading new data once a week he will be missing a significant portion of the threats he thinks he is protecting against.
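The IP-level approach is easy to sketch. The addresses below are from the reserved TEST-NET documentation ranges, not real threat data, and `is_blocked` is my own name for the check a firewall would do in the kernel:

```python
import ipaddress

# Hypothetical blocklist of networks/hosts rather than domain names, so
# a crook rotating DNS labels over the same hosting range is still caught.
BLOCKED_NETS = [ipaddress.ip_network(n) for n in (
    "203.0.113.0/24",    # TEST-NET-3: stand-in for a bad hosting range
    "198.51.100.45/32",  # TEST-NET-2: stand-in for a single bad host
)]

def is_blocked(dest_ip: str) -> bool:
    """True if the destination falls inside any blocked network."""
    addr = ipaddress.ip_address(dest_ip)
    return any(addr in net for net in BLOCKED_NETS)
```

The point of blocking by network is that one /24 entry covers every throwaway domain the crooks point at that range.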
First things first
which _sane_ sysadmin would run ISA as their network firewall?
Don't get me started. That was /not/ my idea, and it has taken me four solid years of fighting tooth and nail to be allowed the opportunity to replace it. There are things which make me rage. There are things which make me cry. Then there are things which make me experience desires to commit war crimes. Actually, only one thing has ever fallen into the latter category, and that is ISA.
1. Start | Right-click Notepad | Run as Administrator
2. Open and edit the hosts file (%SystemRoot%\System32\drivers\etc\hosts)
@Mike Bell: the word 'user' is the key here
users aren't supposed to have admin passwords lying around. That's what makes them users.
Being not-admins is where the security comes from.
Unless you're making another point?
What about fast-flux malware?
While I'm sure that malwaredomains do an admirable job, it's pretty certain that there's no way they can capture all of the fast-flux domains used by modern botnets. When you have 20 million domains like dlxfrglh.com and orutyerou.com and so on, the blacklist becomes huge, unwieldy and seriously impacts network performance.
I know, because I tried this a couple of years ago, and Internet access slowed to a crawl. In the end, I simply ended up bit-bucketing anything to do with China, Russia, and most of Eastern Europe - because on the odd occasion when we did get infected, it nearly always came from, and reported to, one of those places. While I acknowledge that this is not a workable solution for many enterprise-level networks, for SMEs whose business is largely local (and whose networks aren't exactly high-powered) it takes a huge amount off the blacklist, leaving only the US and Netherlands as the main offenders, and that is easily dealt with using a much smaller blacklist. It doesn't eliminate every possibility, but good security practices and proper system maintenance should cover the rest of it.
Oh, and @PC Tech: While I'm as big a fan of Firefox, AdBlock and NoScript as anyone, they are not really a good defence in a network context (no client-controllable solutions are), simply because users can disable AdBlock and NoScript, or in the case of NoScript, simply allow scripts from a suspect domain. I actually caught a few users in my workplace running with NoScript in "Allow Scripts Globally" mode, because they complained it was "too annoying" to have to keep clicking "Allow" in Noscript for each new site they visited! So while it's a reasonable supporting plan to have client-side defences in place, it's a very bad idea to rely on security in the hands of your users!
They've done a good job with the Zeus botnet, and there are commercial alternatives coming on-stream to handle it. Again, Malwaredomains.com isn't the One True Solution. It is part of what should be layered defence in depth.
As to no-script, the debate was had in the comments section of my previous article:
Lightweight always-on protection: hosts file, AdBlock, NoScript, manual weekly scans with MBAM and SuperAntiSpyware, and I still boot into Ubuntu to do anything secure. Brilliant article. Thanks.