Been using this plugin for well over a year now. It is fantastic. I am far happier on the web knowing to whom, and how, I allow my data to be accessed.
Most users think the web works like a television. They go to a website and are presented with images, text and multimedia. What they don’t know is that, unlike a television, some of what they see executes on the server, and some on our own computers, and any time an application runs on our computers there is the potential for a …
A good website should ideally work, at least with reduced functionality, without scripting enabled.
But alas, in the real world, this is often not the case.
It is one of those really awkward decisions that need to be taken during development, and sadly it is often cheaper to develop an application without a fallback for no scripting, so that decision wins.
Who are these sysadmins who find it too much trouble? What are random users doing visiting random web sites in a business environment? If a random user finds a random web site is broken by NoScript then at least they have been warned, and if they find it a bind to make a couple of clicks then perhaps they shouldn't be allowed to play with toys.
My mother refuses to have NoScript because "it's complicated". She is a smart person, but as soon as you mention computers, dummy mode engages. I wonder if it is some psychological head-in-the-sand mentality? Whatever... My response to her refusing NoScript is a limited user account with the least permissions I can grant.
There's just too much nasty crap out there to take any other approach.
If nothing else, NoScript is good at showing what a page would like to do. The one on which I'm typing now would like to run scripts from quantserv, google-analytics and googleadservices, for example.
It highlights some good security holes - if you run it with Verified by Visa, it jumps up and down and gets all excited, and I bumped into my first (fairly harmless but annoying) clickjacking attempt earlier this week. I tend to use Google to look up domain names, and most of the ones I don't know are related to tracking services so they hit the block list fairly quickly. It is a pain to set up, but as the list of permanent inclusions and exclusions builds up, it's less of an issue. However, Joe Public would most likely allow everything through because he doesn't know how to tell what's good or bad, which defeats the purpose.
I'm surprised it hasn't happened already, but why haven't browsers built in a facility like NoScript? It can't be far away. With the resources of the main players they should be able to create a whitelist database of safe sites for scripting (even if that database contains entries for ad sites, which would make sense for Google).
XSS is strictly a web server issue.
The problem is when website B then goes on to dump that input verbatim to one or more users without proper escaping.
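The server-side fix for that is routine output escaping before user input is echoed back into a page. A minimal sketch (the `escapeHtml` helper here is illustrative, not from any particular framework):

```javascript
// Escape the five HTML metacharacters so user input renders as inert text,
// never as markup. The ampersand must be replaced first, or the other
// substitutions would be double-escaped.
function escapeHtml(s) {
  return s.replace(/&/g, "&amp;")
          .replace(/</g, "&lt;")
          .replace(/>/g, "&gt;")
          .replace(/"/g, "&quot;")
          .replace(/'/g, "&#39;");
}

const userInput = '<script>alert("xss")</script>';
console.log(escapeHtml(userInput));
// → &lt;script&gt;alert(&quot;xss&quot;)&lt;/script&gt;
```

In practice a site would normally lean on its templating engine's auto-escaping rather than hand-rolling this, but the principle is the same: the browser receives text, not a runnable script.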
Now, about blocking third-party scripting: I think it's a good idea, but a lot of sites break (as you mentioned, Google affiliate sites, my bank, and many others). I disagree with built-in whitelisting though; many small developers won't be lucky enough to make the cut. Why should a given script work for Google & affiliates, but not for me?
Interestingly, I can't get "verified" by Visa or Mastercard's 3D "secure" to work when I have NoScript installed. I end up having to swap to a less secure setup...
Otherwise I love NoScript, and am constantly amazed at the amount of scripting crud that websites think they should execute on my machine...
That's actually an implementation fault - if the person setting up the website had read the documentation on "best practices", they would have a <noscript> fallback that allows "click here to continue" type functionality.
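For what it's worth, a fallback along those lines might look like this (the `/basic` URL is purely hypothetical; the server would need a script-free view behind it):

```html
<noscript>
  <!-- Rendered only when scripting is disabled or blocked. -->
  <p>This page normally uses JavaScript.</p>
  <p><a href="/basic">Click here to continue</a> with reduced functionality.</p>
</noscript>
```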
NoScript is one of a handful of browser extensions that I consider essential.
"Why haven't browsers built in a facility like noscript?"
Because, I believe, it would upset the balance too much. I suppose the users would like it, but imagine the outcry among webmasters. This kind of feature would effectively kill all 3rd party analytics tools and advertising networks which rely pretty heavily on the type of "XSS" we are talking about here. All ads would be hidden and nobody would be able to "study the demographics" via Google Analytics et al.
The recent case of Internet Explorer and private-mode-as-default-shot-down-by-advertisers (http://online.wsj.com/article/SB10001424052748703467304575383530439838568.html) is a perfect example of this.
Yes, I'm all for NoScript in all browsers by default too, but somehow I doubt it's gonna happen. Btw, if I block nir.theregister.co.uk on this site, the layout of some articles breaks a little (article background goes grey instead of the normal white.)
The best way I know to prevent Google from running scripts on your machine whenever you visit a site. If you really do need the scripts to run, do like I do: exit the browser, flushing cached information even in the hidden areas (if you use BetterPrivacy), then restart and temporarily allow the Google site(s) to work. Once done with Google, revoke the temporary permissions or exit and restart the program (best) and Google can't follow what you're doing. This is what I do when I have to use the mail system of the college I'm attending.
What's really needed is a new HTML tag - something which specifies which other scripts a site can access. These tags should be put within another block (HEAD?) and *only* within that block and only allow one instance of that block.
By restricting where the tags can appear it doesn't matter if someone manages to put them within a comment or something as the block within the HEAD tag will be first and will cause any more blocks to simply be ignored.
Block all scripts which are not specified as allowed by default and use a NoScript type interface to allow them if the website hasn't been updated to use these new tags - it's not much work on anyone's side to implement this.
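As it happens, browsers did later standardise a mechanism very much like this proposal: Content Security Policy, which whitelists script origins for the whole page. A sketch of the idea in CSP's terms (the allowed domain is hypothetical):

```html
<head>
  <!-- Only scripts from the page's own origin or the listed host may run;
       injected inline scripts and unlisted origins are refused. -->
  <meta http-equiv="Content-Security-Policy"
        content="script-src 'self' https://scripts.example.com">
</head>
```

Delivered as an HTTP response header rather than a tag, the policy also can't be beaten by malicious content injected ahead of it in the document.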
Shoddily programmed sites prolly won't have that tag. In other cases I'll just have to get my payload up and running before the site writes the tag to the head. This would make things harder for hackers, but it doesn't seem bulletproof.
In reality, most XSS injection is possible because websites want to take in heaps of user-generated content and invest a lot of time in building their communities, whilst security concerns only get addressed once they have been hacked.
We shouldn't try fixing this in the browser when most of it is actually a problem with lax security on the server side!
I think other readers have already commented on this...
The reason to run NoScript is that unless you, the user, purposely tell it to allow JavaScript from a site to run, it won't.
Good bye Google Analytics. Good bye annoying flash ads. You get the idea.
When you go to a site frequently like CNN/BBC/ElReg/etc ... NoScript can remember your preferences so you get to see your content.
Of course sites are now catching on and are modifying their pages so if you want to see the content, you have to run their scripts with some stuff inside of them.
The point is that NoScript gives the user control of what runs and doesn't run in his browser.
Trying to modify HTML isn't going to solve any of the problems concerning security. After all, a dodgy site is still going to be a dodgy site.
Sites are not just catching on for the purpose of their own scripts - many sites check for the presence of things like Google Analytics before showing you any content whatsoever. It should be possible to extend NoScript to effectively "mock" the presence of third-party scripts; if a site's script checks for the presence of objects or functions, this could work.
We've all got a store of trusted certificates ... so surely some of the most crucial xss (e.g. verified by visa) could be signed? Or maybe we could have a list of secure hashes of scripts that we think can be trusted? The latter could work with a collaborative approach - so NS could build up white, black and grey lists of script hashes.
Just thinking out loud - before downvoting please remember my wife's horse kicked me in the head :-)
I think this is an excellent idea, but one of the benefits of NoScript is that it blocks downloading of JS & SWF files; obviously, to hash a file it must be downloaded, and that allows a user to be tracked.
Also, scripts can change legitimately. I suppose in the case of extra-secure sites we would want legitimate changes to occur as seldom as possible.
That would only delay the inevitable. Malware writers would simply obtain their own signatures or, as in the recent SCADA trojan scandal, hijack some popular signature (you choose a popular one like Realtek's so that it can't be invalidated easily; that would break too many things).
Many websites foolishly (deliberately) include third-party scripts, such as Google Analytics, on their own webpages.
It may not occur to webmasters that, technically, they are handing Google the keys to all user credentials and content. Google can snoop all traffic, forge requests, and intercept all fields.
Say it ain't so!
I mean I love all of the free goo that the chocolate factory spews out in the name of "doing no evil".
How can you be critical of a company that showers all this wonderful stuff upon us mere mortals?
Eric Schmidt is brilliant. I have nothing to hide!
Ok, now I'm starting to taste some bile in my mouth. The above should have pegged everyone's sarcasm meter.
I totally agree with Lou's comments.
Of course Lou forgot that Google of course wants to provide 'free' DNS services. You know so that they can log every DNS request you make ... ;-)
Also huge praise to Lou for not fearing the wrath of exposing Google for what it is and not hiding behind an anonymous post.
I used NoScript for a while, and now rely more on Adblock to block content from pushy sites, because NoScript involves too much time making decisions with too little information to make them reliably. Adblock and NoScript do different jobs. I know that without NoScript I'm more vulnerable to XSS, but it still wouldn't protect me against a bad script I've wrongly chosen to trust.
As far as my online banking is concerned, I use a virtual machine kept on a USB stick, on which the web browser and accessible files have never been used and won't be used for any other purpose. This, as far as I can tell, protects me against XSS and many other threats far better than NoScript could.
Options > General > tick "Temporarily allow top-level sites by default (base 2nd level domains)"
This allows scripts to run on sites that you visit - but not those from other domains.
I'm assuming that if you have decided to visit, say, theregister.co.uk, and you want it to work properly, then you want to allow scripts from that site but not any third-party sites.
This will allow most sites to run without user interaction. I think it's a fair balance between security and convenience.
NoScript (and AdBlock) are brilliant add-ons. I use both at home very happily, but what this article fails to acknowledge is that the SysAdmin is a technical person who has been hired to make the difficult decisions for the non-technical user in a corporate environment.
Our objective as IT Managers is to create a safe environment where our users can 'do business' and generate revenue. In most organisations, IT is a cost-centre, not a profit-centre. We trade off security for convenience and usability based on our organisation's risk appetite and sensitivity to security, and we need to do this with the buy-in of the people who make the money to pay for us.
Relying on users to run NoScript will have the following results:
1. Users WILL get pissed and WILL make your life more difficult. Any subsequent changes you attempt to apply will get vetoed because IT is happily creating the perception that they are pains in the arse and overly-cautious.
2. The majority of users will quickly enable all scripts on the pages that they regularly visit, rendering your efforts worthless and worse creating a false sense of security.
A simple solution like a browser plug-in will only work in a limited number of environments where your users are sufficiently technical. In the real world, the only appropriate solution is a multi-tiered approach (web proxy at the WAN's edge, AV on the LAN servers and clients, regular patching, etc.)
Spot on: the job of IT is to help the company, not hinder it. Running NoScript in a standard corporate environment makes about as much sense as all the other "because I say so" policies that stop people being able to do their job:
* limits on email inbox sizes
* passwords to be changed every 30 days
* egress firewalls blocking more than just the obvious (faecebook etc).
* automatic blocking of email attachments.
All of these are likely to hinder the users and IMHO none of them is strictly necessary - they're the sysadmin flexing his muscles rather than trying to solve the problem properly, i.e. by proper virus scanning, IDS, and so on.
As I repeatedly say to our SysAdmins, our users are not paid to be computer experts. They are paid to be good at their jobs.
It is the IT department's job to ensure that they are given an environment to do their jobs safely and efficiently.
The Author's attitude is exactly the "IT Crowd" mentality that will ensure that IT staff will never be looked upon as professionals by their business colleagues.
"The Author's attitude is exactly the "IT Crowd" mentality that will ensure that IT staff will never be looked upon as professionals by their business colleagues."
I'd love you to qualify that statement. I don't remember once saying that I was calling for NoScript across the bulk of a business environment. The solution for corporates is DNS blacklisting (which is what my next few articles are about). I in no way expect the users to be computer experts; in fact, I have come to terms with the concept that the vast majority of them probably couldn't understand the difference between Firefox and IE if you gave them a ten-week seminar.
It's not their job to know that. It is their job to know some basics of how to use computers. For certain individuals in my company, I do encourage the use of NoScript on their corporate machines, but only because these individuals are "advanced users" who are both capable of understanding its use and far more likely to poke around where they shouldn't be online.
When a staff member brings me a personal computer and asks me to set it up for them, you are damned right I try to teach them NoScript. If I am taking my personal (not company paid for) time to help them set up their home computer, then I am going to at least /try/ to teach them good digital hygiene. The vast majority of them rather like the concept, once they understand it.
Seriously though, if you honestly believe I storm around work all day pretending that "root" means "god", then I wholeheartedly encourage you to come down and pay me a visit. You can watch a sysadmin in their native environment, and realise that there are quite a few of us who would actually enjoy helping our co-workers.
In fact, the biggest problem I encounter at work is people simply not telling me when something has gone pear-shaped. I can’t help you if I don’t know what the issue is. Sadly, there are always people who get hostile and unreasonable because I am not a god and thus don’t know everything there is to know.
If my “business colleagues” are incapable of looking upon my efforts to assist them in doing their jobs as professional, then I would have to say the fault lies with them. Their prejudice and need to fear and segregate what they don’t understand is the barrier, not a genuine desire to help and enable on my part. Sadly, there are people who cling to archaic stereotypes in any situation. In my opinion those people are a detriment to any business, as business must be capable of adapting to reality. In fact my experience has taught me that businesses that adapt the quickest are the only ones that survive.
I think we can all agree on the value of a script-blocker like NoScript in a home computing environment, but the context of your series of articles is a corporate or enterprise environment ("A SysAdmin Blog"), and as such it doesn't make a convincing argument. Particularly not for an entire article, anyway.
If, as you suggest, NoScript is useful for Power Users but not installed for regular Joes, then you only receive partial protection, based on a non-IT member of staff deciding whether to enable a script based on its self-description/domain.
Secondly, where is the value in selecting a group of users, determining that they are Power Users, and applying restrictions/hassles to their browsing experience when, as regular users, they wouldn't face this restriction? Are we punishing Power Users?
When the auditors come to visit, can you put your hand on your heart and swear that you have acted with all due diligence in protecting the organisation? Are you confident that the cost/benefit analysis justifies this? I struggle to believe that the effort/cost of installing the add-on, the effort of the Power Users to evaluate every script on every site they visit, and the sense of false security generated justify themselves. Remember, as your regular users do not have the add-on, you will have to have a full range of multi-tiered defences anyway.
The context of my Blog is aimed at junior sysadmins, mostly serving in SME roles. It's hoped that I can introduce concepts to these folks that they may not have encountered before, as well as workarounds to "get the job done" as best as is possible in the world of limited resources in which SME sysadmins must play.
It isn't really aimed at someone running a network with fifteen thousand users. Those folks already know everything I could possibly have to teach. They also have access to resources and funding I could only ever dream of.
Could I put my hand on my heart and swear before the world that I have done the best job possible to protect my network and the information it contains? Yes. I have done the best I believe possible with the resources provided for me to use. Not everyone gets to manage their network by whitepaper, and for every organisation that exists with the resources to do things absolutely by the book there are hundreds more that will never have that luxury.
Where’s the advantage to selecting one group of users and training them in the use of things like NoScript? Minimisation of risks where and when possible. I will never, ever be able to teach NoScript to some of the users on my networks, even if I had infinite time and resources. The individuals have no interest in learning it, and thus the capacity to retain what they are shown simply isn’t there.
Thus, as part of defence in depth, I minimise risks wherever I can, and work around situations where I can't. Users who can't or won't learn to use tools like NoScript have restrictions placed upon their access that others don't. In an SME IT shop, you don't get the opportunity to treat all your employees as faceless, interchangeable cogs. You must deal with them one on one, assessing the needs of the /HUMAN BEINGS/ that are using the systems you are responsible for providing.
You believe that the use of NoScript is a punishment, probably because you personally do not like the add-in. I don’t see it that way, and it’s certainly not presented to users in this fashion. Users who are willing to take the time to upskill themselves in the proper use of their computers and who are willing to operate in a work environment with at least some basic aspects of computer security in mind actually have far fewer restrictions than those who do not.
If users are willing to work with IT in this manner, I am more than willing to place my trust in them. They will be given local administrative access to their PCs and thus the ability to make systems changes or install applications. They have greater leeway in how the hardware of their systems can be configured, as I can trust in their ability to keep drivers up to date, handle odd hardware and suchlike. I don’t have to manage these folks by forcing completely identical hardware and pushing images down to them on a regular basis.
Additionally, because they are willing to play ball on computer security, they aren’t restricted in their internet access. They have access to websites like Facebook, IT time is put into helping them enable the ability for them to remote control their home computers from work and they are frequently the same people who make arrangements to remote into their work computers from home.
For some people, where they work is “just a job.” They couldn’t give a rat’s about security, corporate concerns, customer information privacy or any of that. They show up, punch in, use the tool placed in front of them and then leave. For other people, where they work is a career. They take pride in their work, have no intention of leaving, and do care about all the various concerns that affect the company. It is those people, the “lifers” if you will, that request more leniency in some areas of IT policy. They are, to an individual, willing to help IT out by in turn taking IT security seriously.
I am sorry if you don't agree with that approach, but in my experience "one size fits all" IT policies are a fatal mistake. People aren't the same. Their jobs and requirements aren't the same. What's more, companies aren't the same: how they implement IT in their environments will differ. In my organisation, NoScript has found a place. I am saddened that you are not only unwilling to consider how it might find a place in yours, but that you feel the need to be so negative towards me because of it.
With luck the information in this article proved of use to others who have different requirements, environments and viewpoints than your own.
Well, AC... that's a bit of an interesting comment. Let's examine it point-by-point, shall we?
1. You are exactly right on this particular statement. However, there is a caveat to my agreement: users need to know how to properly use a computer "to be good at their jobs" in most modern environments. The problem that I find is people just don't care about what happens to their system when it becomes infected/corrupted (or that it was negligence that caused it - example being people who open executables from within zipped attachments after you've repeatedly attempted to explain to them why it's not a good idea), and then you get the "this is priority one, get it fixed now" mentality from them when it does.
2. Well, this one is a bit trickier as that's a bit of an oxymoron given the current state of the web. We can provide the tools for them to do their jobs safely, or to do them efficiently, or attempt to find some middle ground. The problem is that you lose a fair amount of security in the middle ground unless you have a large IT staff with a budget to match. I work in a 4-site organization (across ~2,300 km) with a high dependence on IT infrastructure for the backend and frontend operations of the company, but a 3-man IT staff and a budget that's less than 2 of our salaries combined.
3. This I'm just not going to touch as I can't speak for the author, but perhaps you should re-read the entire article to ensure you've properly parsed it before making assumptions (particularly regarding things which weren't actually said).
[Troll]... well, I don't think I need to explain that. ;)
"In most organisations, IT is a cost-centre, not a profit-centre"
That's an interesting (and unfortunately quite common) view of things, however let me provide another perspective:
What would happen if the IT department started billing the various other departments within the organization for both help-desk and systems repairs (or for the systems themselves) - IT doesn't look like a giant cost-centre then, does it?
It's something I've done a fair bit of thinking on, as there is a group of companies in the city I'm from who've spun out their IT department as a completely independent company that does just that.
I'm glad this is something you've been doing a fair bit of thinking on - interdepartmental cross-charging is fairly common. IT departments charge the different business units or P&Ls for the services they consume (new hires, number of machines, project work, etc.). Under such a model, it's possible for IT to show a paper profit, but they haven't brought in any revenue to the company.
If IT runs a profit at the expense of the rest of the organisation, IT will quickly find itself outsourced or under serious price pressure.
Computing, the internet and all their related accompaniments bring with them a library of new terms and phrases that, to be frank, glaze the eyes of Joe Public.
People like talking simple. Simple they can understand.
We have industries that are full of local jargon which, if you are specialised, is second nature to you, but to the rest is complete poppycock.
What the computer industry has inadvertently done is drop an industry littered with jargon into the laps of Joe Public - but Joe Public still has to understand it.
Like Trevor says, Joe Public wants a computer to be like a TV. It has an on/off switch, a remote control and that's it.
They do not want to understand XSS, TCP/IP, HTML5, OSes, trojans, malware, etc.
The barriers are automatically put up because of implicit user ignorance.
I cannot recall any aspect of public consumption that requires so much understanding of so many different aspects. It's truly baffling when you look at it from the eyes of a user.
While there is a need for this required level of understanding, there will always be people who fall victim to the many traps in place.
Just a thought
It may not be a matter of public consumption, but to many people it is a vital and necessary skill for getting around in most of the civilized Western world: driving. It requires understanding not just the workings of a vehicle (particularly a manual-transmission car, where more limb coordination is required) but also the myriad notices, signs, regulations, and so on that are requisite to being qualified to drive (I understand that standards vary by country, but based on common USA conditions, and the fact it's known not to be the tightest, I assume there's a lot to learn). A few times I have heard people say that people should be licensed to use a computer; I don't entirely agree with this view, since cars, unlike computers, can easily pose a direct danger to life, limb, and property, but the need to be properly skilled is similar. Then again, computer skills trend towards the young while seniority, naturally, goes the other way - this presents a gulf between those who HAVE the skills and those who actually NEED the skills; and many times the latter is boss to the former.
The significant difference in the remote control analogy is that should I choose to use just the basic functions (0-9, power, volume, etc.), my experience is not hindered.
I accept that my experience may not be as vibrant or as immersive as the technology could allow, but there is absolutely no risk. If I mess around with the "advanced" section, the worst I could do is screw up my experience, which a quick turn off and on could sort. (I'm simplifying here, but you get the point.)
With computers, you are required to know most of the "advanced" stuff otherwise you put yourself at risk.
@Charles: Even cars only require limited knowledge to operate (I'm a typical fuel, water, air person); couple that with the prerequisite training (to comply with local laws) and you are set. The laws remain the same (typically) for many, many years, give or take an addition or two.
With computing there are no signs, there is so much to be ignored, but which bits are safe to ignore and which bits are not?
Even with 11 years in the IT industry I cannot truly suggest that I know, especially because things are so fast-paced and change frequently.
To summarise, tools like NoScript are excellent (and I use it myself), but as an industry it would be a good use of our efforts to dumb down the tech for the general masses.
I used to laugh at people with old AOL accounts, on account of the fact that they didn't get the real internet. Truth is, the general populace need that level of mollycoddling, because without it they are lambs to the slaughter.
(disclaimer - I'm not suggesting we put AOL back in business here!)
It is very difficult to get a happy medium on networks because users constantly want it quicker and innovations constantly introduce new vulnerabilities.
I'm not an IT pro but I do work in systems (of a different kind). I know this - when people tell me - "just give me the idiot's guide to this" - I tell them - "if you're an idiot you shouldn't be doing this stuff".
Calling for new HTML tags, browser redesigns and blaming the server side are going to get you precisely nowhere - the thing that is in your control is the client.
Here's the real deal - it's your data and your bank account that's going to get robbed. Now, where does the responsibility lie?
...with the bank? The technology exists to make online banking secure. It is used by some banks on this earth. Yet for some reason I can't walk down to my bank and get secure cards that generate a set of one-time keys for authentication, use a combination of biometrics and passwords, or any of a dozen more secure methods of accessing my accounts than the crap we have today.
In fact, the banks have introduced a “secure” new chip-and-pin system that they have managed to get legally enshrined as somehow “uncrackable.” It isn’t, and the only reason they’ve spent billions on it was to ensure that when people DO have their identities stolen/accounts jacked/etc. that they aren’t liable for it.
If I had the option, I wouldn’t be relying on a bank where I had to worry about things like NoScript. I am a user too, you know. I want my banking to be just like a TV. Turn it on, pick up the remote and it works. The part that frustrates me is that because I am a sysadmin I know the technology exists to do a better job, but all of the banks available in Canada absolutely and completely refuse to not suck.
I cannot agree with you more on that last comment on banks.
The truth is, banking and charge cards are decades behind the technology. The Visa/Mastercard duopoly has stifled innovation and held back consumer smart cards. All I keep hearing is that "real security is too complex for real people", but I'm of the opinion that it's not true. Secondly, even if it were true, why don't the banks offer more secure services to people who are security conscious? It all sounds like an excuse to remain complacent and stick to the status quo.
My bank in NZ offers an RSA token, for a nominal monthly rental fee, to help secure internet payments. Alternatively you can have it text a code to a phone number.
My bank in the UK *requires* a mobile phone number to be tied to the account so it can send a similar one off code to the phone to confirm addition of new payees.
Banks may be slow to change, but the sheer rate of fraud has at least forced some countries to adapt, and the mobile text solution does at least cater to the wider public.
On a side note, a colleague of mine has been contacted twice by his bank to say that his card was allegedly used in <foreign country> and to ask whether it was actually him making the payment, so they have certainly stepped up their internal security monitoring as well to reduce skimming losses.
My bank in Norway issues a credit card sized list of one time keys and asks for a random one (never the same one twice) of those every time I log in in addition to my own password. If I don't have it with me I can ask for a key to be sent to my registered mobile. This seems reasonably secure to me, why would I need the extra expense of an RSA key generator? What extra security does it really provide?
Note to El Reg: can we have a question-mark icon? Sometimes we really do want an answer and aren't being sarcastic.
"Each user of NoScript has the difficult task of learning and understanding how the web is constructed, so as to make informed judgments about what bits of it have access to run programs"
I've been using it for years and I actually found it really fun seeing all the different domains involved to run a particular site, third party ads, traffic monitoring, google analytics etc.
I have had friends use my machine and wince in pain when websites don't load all the content, after I explain about NoScript and show them what to allow / keep blocked they generally end up installing it themselves.
Although NoScript doesn't play nicely with Verified by Visa, I have got it to work - unfortunately I can't remember exactly what I did, but it's under the NoScript options, it might be enabling or disabling 'Turn cross-site POST requests into data-less GET requests' under the XSS tab... maybe someone else has got it working too and has a better memory?
"This kind of feature would effectively kill all 3rd party analytics tools"
If you want to collect statistics about your customers, do it on your own server. If data protection had any teeth, it wouldn't even *allow* you to use a third party to collect data - especially a third party which can, and *DOES*, share the data it collects on your behalf with your COMPETITORS, and usually does so completely without the knowledge and consent of the users it is spying on.
"and advertising networks"
Again, if you want adverts in your page content, serve them from your own server. But step back and consider why you're using adverts at all. In most cases, your users are doing YOU a favour by visiting your site - it would cost you a lot more to set up showrooms or provide the information in print. Treat your customers with a bit of respect and don't unnecessarily hog their bandwidth (which *they* pay for, not you) with matter that probably benefits your competitors more than you.
What I'd like is something like NoScript, but with a built-in (if you'll pardon the term) crowdsourced domain-review feature. So when I see that "suihhx.net" is trying to run a script, I can submit a request and see reviews for a clue about what the heck it is. Even if they were fairly terse, like "Apparently just serves ads.", "OMG it ate my computer ^_^;;" or "Necessary to read comments on Washington Post.", it would be better than just guessing.
Note that I specifically didn't suggest it should base the default policy on these reviews, or even automatically display them, which would require submitting a list of every domain I visit to their server.
Biting the hand that feeds IT © 1998–2019