Google has rolled out a feature that provides webmasters of compromised sites with samples of malicious code and other detailed information to help them clean up. The search giant has long scanned websites for malware while indexing the world wide web. When it detects outbreaks, it includes language in search results that warns …
It's about damn time. Perhaps they'd like to make it proactive: since they already scan nearly 100% of reputable sites on the internet, why not notify the responsible site when they detect problems?
New features in Webmaster Tools
There are some other new features in Webmaster Tools since I last visited: a site keyword frequency analysis, for example, and the ability to view a webpage as Googlebot sees it.
Oh gimme a break
Google sends tens of thousands of people to known infected links daily, but is now to be trusted as some self-appointed code officer? Even when complaints are made, they continue to index and serve up links to horribly infected sites. Just search for "shark swim race" and check the top 100 links, but only with virus and trojan controls on! Google is SO damn full of themselves they can't smell the stink from their own pants.
Google is a profit-making organisation that provides a search engine. That is the beginning and the end of their area of responsibility. They serve up links to any site; that's what a search engine does.
Any social responsibility is optional.
They are to be commended for this, not bitched at.
so... what you're saying is that they're doing what a search engine is supposed to do (i.e., index the web in a way that makes that possible, which means doing it automatically), and that they are somehow responsible for "sending people to websites"? Methinks thou hast the responsibilities reversed, sir.
A search engine shows you what it has indexed. If a site was hosting malware during indexing, then a good search indexing system can detect that. If a site doesn't serve malware when a client identifying itself as "Googlebot" accesses it, then no such detection occurs (yeah, no matter how evil they are, Google's bots do actually send a proper user-agent string, and do respect robots.txt indexing instructions).
If you're an even better search engine, you ensure that people can find out which links you've indexed contained malware at the time of indexing. Not "now", but "then". Getting around Google's malware detection, IF YOU INTENTIONALLY WANT TO, is really easy. That's why sites designed to lure you in so your machine can be cleaned out for all it's worth are still there. However, sites that are unaware they've been compromised are helped tremendously by an indexing system that detects malware and can inform the webmasters of such sites where the threats may be hiding.
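The cloaking trick described above (serving clean content to anything calling itself Googlebot, and malware to everyone else) can be illustrated with a minimal sketch. This is a hypothetical simulation, not Google's actual method: `serve_page` stands in for a compromised site, and `looks_cloaked` is an assumed detector name that simply compares the responses seen by a crawler user-agent and a browser user-agent.

```python
# Simulated user-agent cloaking and a naive detector (illustrative only).

# Stand-in for a compromised site: clean content for crawlers,
# a malicious payload for ordinary browsers.
def serve_page(user_agent: str) -> str:
    if "Googlebot" in user_agent:
        return "<html>innocent content</html>"
    return "<html>innocent content<script src='evil.js'></script></html>"

# Real Googlebot requests identify themselves with a string like this.
GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
BROWSER_UA = "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36"

def looks_cloaked(fetch) -> bool:
    """Fetch the same page as a crawler and as a browser;
    differing responses suggest the site is cloaking."""
    return fetch(GOOGLEBOT_UA) != fetch(BROWSER_UA)

print(looks_cloaked(serve_page))            # the cloaking site is flagged: True
print(looks_cloaked(lambda ua: "<html/>"))  # an honest site is not: False
```

The point of the sketch is the asymmetry the commenter describes: any detection keyed to the crawler's (honestly advertised) user-agent is trivial for an intentional malware site to evade, while a site that was compromised without its owner's knowledge serves the same payload to everyone and is easy to catch at index time.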
Maybe take the time to learn a little more about how search indexing works, and how you can manipulate it, before you start calling kettles black. This approach is guaranteed not to work for intentional malware sites. Luckily, that's also not the goal in the slightest.
[To Darcy] So would you be happier if Google *hadn't* offered this service?
I disagree. Any help Google offers in the remediation of infections in the websites it points the searcher at is to be applauded. It would be nice if they declared war on the malware-originating sites themselves, but that is an entirely unrealistic expectation given the nature of what Google is and what the malware-originating sites are.
I think you've failed to make your case, Darcy.