
Newfangled graphics engine for browsers fosters data theft

Software developers at Google, Apple, Adobe, and elsewhere are grappling with the security risks posed by an emerging graphics technology, which in its current form could expose millions of web users' sensitive data to attackers. The technology, known as CSS shaders, is designed to render a variety of distortion effects, such as …

COMMENTS

This topic is closed for new posts.
Anonymous Coward

RIP Flash

A very similar security problem was encountered and dealt with by Adobe in their Flash Player about 7 years ago. Welcome to the new era of bugs, security holes and crashes in "HTML5". And bring on the HTML5 "Skip Intro" pages.

14
2
Silver badge

Is this correct?

"It works by providing programming interfaces web developers can call to invoke powerful functions from an end user's graphics card."

Does this mean that websites will be able to avail themselves of my GPU cycles and maybe manage to get my graphics card to draw 200 W or more? Sounds great - just as great, I am sure, as the sound of a gfx card's fan trying to cool an overheating graphics card running at top speed because I visited a webpage injected with malicious code specifically designed to exploit this new "ability". For, you know, "lulz".

I would be willing to bet any sum of money that websites maliciously exploiting these CSS shaders will outnumber websites using them "benignly" by orders of magnitude. And that, ultimately, there will be no real benefit whatsoever for people browsing even the "benign" webpages.

And isn't this what the world really needs, millions and millions of computers now using even more electricity rendering unneeded graphics on webpages that are already too heavily encumbered with jpgs?

(I wonder how long it will take for criminals to develop "cuda zombie-nets" that will enable them to use infected machines to form a kind of "crime-cloud" for brute-forcing passwords etc. Well, even if that is not possible, there is one thing of which we can be sure: inventive minds are going to find a way to make the vast majority of web users sorry that this was ever created.)

3
1
Happy

simples

just make sure that the user has the option of disallowing the use of the gpu.

0
0
Anonymous Coward

of course

Said option will be off by default, named something opaque like 'disable GPU-GLSL', and hidden somewhere that only the most technical of users will ever find.

3
0
Bronze badge
Meh

plus

turn the option ON by default.

0
0
Silver badge
Devil

Hanlon's razor

Never attribute to malice that which is adequately explained by stupidity

Same as today - your fan spins into hovercraft mode on webpages where the cretin designing them has put a "DIY" ad in javascript without waiting between frames or has made a "DIY" adobe flash ad or has put flash videos for no reason whatsoever.

1
0
Anonymous Coward

WHY do web pages need to cause "distortion effects, such as wobbles, curling, and folding"? There is no need to bring back dancing baloney.

9
2
Gold badge

Re: Why

You need to distinguish between web pages where you are the customer and those where you are the product. The latter need the fancy effects. In practice, there will be switches in your browser and add-ins to block the worst offenders entirely, just as at present.

3
0

Not just fancy effects.

Or beyond that, 3D visualisation - for example in the medical products I'm working on. As with any technology it could be misused (intro hell), but that doesn't mean the technology should be dismissed because of it.

2
0

@Turtle

On the bright side, when your GPU is frazzled, the attack won't work anymore.

0
1
Silver badge
Meh

Not so likely

"Even if you tuned a CSS attack to a given browser whose rendering behavior you understand, it would take many frame times to determine the value of a single pixel and even then I think the accuracy and repeatability would be very low,"

Possibly even lower if the browser does use hardware acceleration, because then you need to also understand and account for differences between makes and models of GPUs and the various driver versions and settings.

Not to mention the make/model of the CPU. Plus the amount and speed of RAM. Oh, and what other tasks might be running.

No, the only way I see this as feasible is if you have a target pool of a million or more devices, all with the same hardware profile, running the same OS, with few to no options to tweak graphics settings, with a lack of or severe constraints on multitasking, and which are only allowed to run one specific browser. But who would be stupid enough to buy something so obviously crippled as that?
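The side channel being debated above can be sketched in a few lines. This is a toy model, not a real browser exploit: the "shader", the iteration counts, and the threshold are all invented for illustration. The only point is the mechanism the quoted researcher describes - if a shader's runtime depends on a secret pixel value, an attacker who can time the render can infer that value.

```python
import time

# Toy model of the timing side channel: a hypothetical data-dependent
# "shader" whose runtime depends on a secret pixel value. Nothing here
# touches a real browser or GPU; it only illustrates the attack idea.

def leaky_shader(pixel):
    """Simulated shader: does far more work when the pixel is dark (0)."""
    work = 0
    for _ in range(200_000 if pixel == 0 else 200):
        work += 1
    return work

def time_render(pixel, runs=3):
    """Attacker-side timing: take the best of a few runs to reduce noise."""
    best = float("inf")
    for _ in range(runs):
        start = time.perf_counter()
        leaky_shader(pixel)
        best = min(best, time.perf_counter() - start)
    return best

def guess_pixel(pixel):
    """Infer the pixel by comparing against a known-light reference timing."""
    reference = time_render(1)          # calibrate on a known value
    return 0 if time_render(pixel) > 10 * reference else 1
```

Note how fragile this is in practice: the threshold would have to be recalibrated per GPU, driver, and system load, which is exactly the repeatability objection raised in the comment above.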

8
2
Bronze badge
Go

@Steve Knox

What company sells only a small variety of machines with very limited hardware and software choices?

A certain fruity company comes to mind...

4
0
Silver badge

"No, the only way I see this as feasible is if you have a target pool of a million or more devices, all with the same hardware profile, running the same OS, with few to no options to tweak graphics settings, with a lack of or severe constraints on multitasking, and which are only allowed to run one specific browser. But who would be stupid enough to buy something so obviously crippled as that?"

Sounds a bit like the iPad. I'm using one now to reply to your message.

0
0
Trollface

hehe

Instead of "this website best viewed at 1024x768" due to non-proportional fonts, non-dynamic frames, buttons and whatnot, now we get:

best viewed on DX11 hardware...

What about netbooks? Oh wait - an actual use for OnLive-style GPU streaming? (Pay a subscription, the datacentre renders the page and feeds it down to the netbook as streaming video.)

This'll work well when advertising creative types show off their wares to clients.

Client: "Wow, awesome! Hey wait, this only works on 5% of our already-reduced demographic? You're fired!"

0
0
Unhappy

"best viewed on DX11 hardware"...

Kill me now.

1
0
WTF?

@"In my view, the most promising approach is to find a subset of the GLSL shader language in which a shader always takes the same amount of time to run, regardless of the input."

Seriously, WTF?! This "same amount of time to run" approach is doomed to fail in so many ways it could never be reliable.

For example, what happens on a newer, more optimised GPU that hasn't even been designed yet? It could easily take a different amount of time. What happens with new GPU instructions added years from now, which could easily change the time taken? And what about architectural changes that affect execution time and create so much complexity that we can no longer even count cycles, as caches and other pipelining considerations change the execution speed (as they do even now)?

And that is before you add in the effects of more cores on some cards competing for the same memory bandwidth, which could stall some cards more than others. GPU board manufacturers use different data widths for memory buses all the time, so the code thinks it's the same GPU yet the memory runs at a different speed on some cards. Then what about overclocked cards with different-speed memory, and newer very low-powered GPUs like those in phones, which may need time to wake from low-power modes before they can run code? Add in different cache sizes on different cards, plus GPU driver changes that affect speed and possibly even which instructions some GPUs use, since drivers translate shaders into their own card-specific internal languages that may differ between generations of graphics cards and even between driver updates.

There are so many potential ways this "same amount of time to run" idea could fail that I am utterly amazed he would even suggest it.

So there is no way at all that the "same amount of time to run" approach would ever work reliably.

0
0
Gold badge

You're confusing "time to run" with "time between client issuing the command and being informed that it has been completed". You're also confusing "same on this hardware in this particular session" with "same on all hardware anywhere ever". Lastly, you're confusing "eliminate this covert channel entirely" with "reduce its bandwidth to a useless level".

The proposed solution strikes me as both affordable and reliable, in theory.
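For readers wondering what a "constant-time subset" of a shader language even looks like, the core trick can be sketched outside GLSL. The function names below are invented for illustration; this is the general constant-time-selection technique, not the actual subset the article's researcher proposed. Instead of branching on secret data (whose cost can vary and leak through timing), you select between values arithmetically so the same operations execute either way:

```python
# Illustration of constant-time selection, the technique behind a
# "constant-time subset" of a shader language. Hypothetical helper names;
# not the actual proposed GLSL subset.

def branchy_select(secret_bit, a, b):
    """Data-DEPENDENT control flow: which path runs depends on the secret,
    so execution time can differ between the two cases."""
    if secret_bit:
        return a
    return b

def constant_time_select(secret_bit, a, b):
    """Data-INDEPENDENT: the same arithmetic executes for either secret
    value, much like GLSL's mix(b, a, secret_bit). Runtime no longer
    depends on the secret, so timing reveals nothing."""
    return a * secret_bit + b * (1 - secret_bit)
```

Both functions compute the same result; only the second does so with a fixed sequence of operations, which is why restricting shaders to constructs like this reduces the covert channel's bandwidth rather than trying to hide the clock.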

0
1
Facepalm

If CSS shaders already have security concerns before they even reach public consumption, bin them and find something more secure.

If you go down the 'fix it' road you're gonna end up with Flash v11.03.10.04 soon to be updated to .05 because of yet another security concern.

It's not frigging rocket science.

2
0

"It's not frigging rocket science."

Getting a graphics system that works across multiple versions of multiple browsers running on multiple versions of multiple operating systems on countless device types? I'd suggest it's not far off rocket science.

That the Flash player only had a few incremental updates is astonishing given how widespread its user base is.

0
0
Thumb Up

A. Barth?

Great name!

1
0
Trollface

I don't think this will ever be a security risk: mickeysoft will implement its own interpretation of CSS shaders, which will be different from Firefox/Chrome/Opera etc. etc., and that will break the CSS shaders standard, thereby rendering the CSS shaders attack vector useless.

0
3
Anonymous Coward

I must be old fashioned...

I thought the internet was for text, porn, and funny cat videos. I don't need any "distortion effects, such as wobbles, curling, and folding."

0
0

Old Fashioned?

Text: those "letter viewed through a bottle" and "incriminating note curling as it burns" effects from 1930s movies.

Cat Videos: Who wouldn't want to tweak them with custom CSS to turn the original cat into one's own?

Porn: Likewise, a webpage with a single, not-particularly-interesting "actress" could, via CSS, look like the celebrity of your choice. A natural Freemium business model: give away the porn, sell the CSS.

0
0