Google open sources JPEG assassin

Google has open sourced a new "lossy" image format known as WebP — pronounced "weppy" — claiming it can cut the size of current web images by almost 40 per cent. CNet revealed the format with a story late this morning, and Google soon followed with a blog post describing the technology, which has been released as a developer …

COMMENTS

This topic is closed for new posts.


Think of the HDD makers!

-40% size for similar image quality!?

1
1

Internet Explorer support

Probably by about 2020?

11
1
Gates Horns

Optimistic much?

*Transitional* support by 2020, full support projected for 2035. And then dropped in 2021.

10
1
Silver badge

Not forgetting

It'll be Microsoft WebP (for Windows) format only.

5
0
WTF?

Jpeg2000

What about Jpeg2000 - much smaller sizes, available years ago. Used by nobody.

2
3
Silver badge
Linux

[Paste title here]

Jpeg2000 is encumbered by lots of patents. If this format ever gets widely used, patent trolls will have a field day.

4
0
Bronze badge

Used in Second Life

Jpeg2000 is the image format used by Second Life--it has a few advantages--but it suffers from some poor-quality code in open-source graphics codecs. One advantage is that you don't have to download the whole file to get a lower-resolution version of the image. But the image can end up as just the low-res version in one corner of the full-size bitmap.

Get smaller files with the same image quality, and the advantages for this sort of graphics-intensive net gaming are obvious. But there would be a huge amount of data to translate, in any existing game, and there's always some quality loss. There are alternatives to Second Life, currently using compatible software. But if this lives up to the hype, is the compatibility worth it?

0
0
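
The resolution scalability mentioned above falls out of JPEG 2000's wavelet transform: each decomposition level stores a half-resolution approximation plus detail bands, so a decoder can stop after the early bytes and still hold a complete low-res picture. A minimal sketch of the idea using a one-level 2-D Haar transform (illustrative only; the real codec uses the CDF 5/3 or 9/7 wavelets plus EBCOT entropy coding):

```python
import numpy as np

def haar2d_level(img):
    """One level of a 2-D Haar transform: a half-resolution
    approximation (LL) plus three detail bands (LH, HL, HH)."""
    a = img[0::2, 0::2].astype(float)
    b = img[0::2, 1::2].astype(float)
    c = img[1::2, 0::2].astype(float)
    d = img[1::2, 1::2].astype(float)
    ll = (a + b + c + d) / 4.0  # the embedded low-res image
    return ll, ((a + b - c - d) / 4, (a - b + c - d) / 4, (a - b - c + d) / 4)

img = np.random.randint(0, 256, (8, 8))  # stand-in for a grayscale image
ll, (lh, hl, hh) = haar2d_level(img)

# A decoder holding only LL can already display a half-size image;
# the detail bands refine it losslessly back to full resolution:
full = np.empty((8, 8))
full[0::2, 0::2] = ll + lh + hl + hh
full[0::2, 1::2] = ll + lh - hl - hh
full[1::2, 0::2] = ll - lh + hl - hh
full[1::2, 1::2] = ll - lh - hl + hh
assert np.allclose(full, img)  # perfect reconstruction
```
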
Silver badge
Welcome

JPEG 2000 is very nice* but

The patent owners have thus far refused to put it in the public domain. My guess is that JPEG 2000 is the benchmark for Google's new format, which they might actually be using to encourage the opening of the JPEG 2000 format.

* JPEG 2000 is particularly nice if you have text in your images as text suffers so heavily from artefacts in JPEG.

3
0
Silver badge

Smaller sizes?

I Wiki'd JPEG2000 to see a comparison. I noticed there are potential lurking patent issues (doesn't everything these days? <sigh>) which could be an impediment. However, the thing that amused me was the example comparison image... which was exactly the same size as the standard JPEG.

The software I use for doing my JPEGs (PhotoImpact5 - old but reliable) allows you to play with the JPEG type (progressive/standard), the image colour coding (4:2:2 etc) and the quality. The size of my JPEGs depends upon what I'm prepared to sacrifice, and for some things (monochrome scans of letters) you can get away with quite a bit.

Perhaps this is why JPEG2000 didn't take off? Maybe it was additional complication without enough returns to make it worthwhile?

0
0
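
For what it's worth, the same knobs the commenter plays with in PhotoImpact exist in most JPEG libraries. A sketch with Pillow (the file names are hypothetical; actual savings depend entirely on the image):

```python
import os
from PIL import Image

img = Image.open("letter_scan.png").convert("RGB")  # hypothetical source

# The same trade-offs as in PhotoImpact, expressed as Pillow save options:
img.save("std.jpg", quality=85, subsampling="4:2:2")                     # standard
img.save("prog.jpg", quality=85, subsampling="4:2:2", progressive=True)  # progressive
img.save("cheap.jpg", quality=40, subsampling="4:2:0")                   # sacrifice more

for f in ("std.jpg", "prog.jpg", "cheap.jpg"):
    print(f, os.path.getsize(f), "bytes")
```
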
Silver badge

JPEG 2000 compression ratio

I get stuff about 50% smaller again than JPEG with J2000. But as JPEG is already pretty good, I guess it was diminishing returns coupled with the patents that prevented take-up. Pity, because it really does produce better quality at lower sizes.

0
0
Gold badge

Jpeg2000 legal situation

Baseline Jpeg2000 is explicitly royalty and licence free, but the patent holders haven't put their IP into the public domain.

This is not significantly different from GPL-ed software, which is explicitly available for use and modification, but the copyright remains with the authors precisely so that the agreement that permits free use can be backed up with legal force if necessary.

Talk of submarine patents is just FUD.

0
0
Thumb Down

Ahh, how we missed you, Cade!

"It's no secret that Google is on a mission to make the web faster — in any way it can."

What a bunch of bastards!

"The faster the web, the more cash Google rakes in."

First, that's not strictly true. There's a point where faster loading hardly matters, because you can only read and see so much so quickly. At that point, increased compression or throughput mainly helps increase options for the web developers, not increase the number of served ads.

At any rate, so what? Google is 'raking in cash' - they're a business. In case you haven't brushed up on your economics lately, that's what businesses -do-. By gratuitously ripping on a company for making money while improving the net for everyone, users/customers or not, all you do is make yourself look like a petulant kid shoving over the chessboard. And you also reduce the credibility of any legitimate criticism of Google that shows up on El Reg, because the very fact that your articles are published in the form they are suggests an inherent bias.

14
14
Grenade

Wait a second

El Reg? Biased? Who would have thought... Anyway, regarding your first point: you're forgetting that Google is forking out tons of cash to cover its bandwidth usage. If it can reduce the amount of bandwidth services such as Google Maps and Google Images use, then it will be saving a pretty penny or two.

1
0
Pint

i thought

Google had bought some massive backbones itself to avoid bandwidth costs? Or did I dream that?

0
0
FAIL

RE: i thought

And are those backbones connected to Average Joe's home? No.

0
1
Silver badge

I thought once you get to Google's size...

...you don't so much buy bandwidth, as negotiate a peering agreement.

0
0
Silver badge
Coat

you still buy bandwidth, but...

No, you still buy bandwidth, you just start to call it "dark fiber". You do stop worrying about the "Gb" and start thinking in terms of "strands" and sometimes "(coarse/dense) wavelength division multiplexing". You also start to spend lots of time worrying about "idiots with backhoes."

Mine's the one with the OTDR in the pocket.

1
0
Thumb Up

Wow!

14 up and 13 down... I'm not sure I've seen such a widely reviled *and* agreed-with post on here. And *I* got to write it!

*clasped hands / anime girl happyface*

3
0
Paris Hilton

Ahh, yes

If the quality is as good but 40% smaller, I'm all for it. It will cut the size of my pr0n partition. I don't care that Schmidt, Page, and Brin want to take over the world and probably have tracking codes inserted in the image format. Paris cuz, well, you figure it out!

2
0
Silver badge

It's not really 40% though, is it?

A bunch of JPEGs, PNGs and GIFs were converted to the new format and, across the lot of them, they saw an average 39% saving. A meaningful test would have been to compare it to JPEG only, since otherwise an unknown proportion of the argument is the senseless "we switched from lossless to lossy and saved a lot of space, hence our lossy format is the best format".

10
0
Pint

Lena/Lenna required

It is pointless to discuss a new image compression algorithm without side by side (uncompressed / compressed) pictures of Lena for reference.

9
0
Pirate

I dunno

Most JPGs already suck, especially at smaller sizes. Compressing them with ANOTHER layer of lossy compression doesn't seem like a good idea. At least not to an old fogey like me. But then, I think most of the video quality on youtube is unbearably bad compared to standard-def TV, so what do I know? Smeary, indistinct, grainy pictures - that's why we got broadband 10 years ago, innit...

7
6
Boffin

Not quite

"Most JPGs already suck, especially at smaller sizes. Compressing them with ANOTHER layer of lossy compression doesn't seem like a good idea"

Unless you're directly transcoding a JPEG then that's not how it happens.

Most of the video on YouTube is cack because it's shot through the tiny plastic lens of a £100 cameraphone/digicam at a lower resolution and framerate than standard def TV and with on-the-fly video/audio compression performed by a tiny processor. This probably goes a long way to explaining why it's not quite as good as standard def TV. The compression algorithms themselves are not necessarily at fault, as in this case they are very much limited by the amount of processing power available.

1
1
Silver badge

Did you read the article?

It's not another layer of compression; it's a different sort of compression. Compression techniques have improved greatly since JPEG was developed - it's not surprising that there are now better ways of doing it to the same quality level.

0
0
Boffin

Video quality on youtube

Video quality on YouTube is entirely due to the authoring and mastering processes used prior to upload.

I recently uploaded some footage taken with broadcast HD cameras, down converted to DVD size, encoded using WebM and uploaded to YouTube. The quality is actually really impressive.

Smeary, indistinct, grainy pictures - that's camera phones and webcams, that is.

Plus, I think you miss the point. It isn't a case of compressing already-compressed JPEG images. It's about compressing newly authored images - that's the way you keep the quality.

To gain popularity, it will have to get into the camera market, and that will depend on how efficient the codec is. JPEG is trivial to produce a low-power hardware codec for, which is essential for small cameras, phones etc.

1
1
FAIL

Fail

Reading comprehension Fail. You're not compressing JPEGs - JPEG has nothing to do with this.

0
0
Silver badge
Thumb Up

Can you please ...

... post the URL?

0
0
Pirate

@bilgepipe

my failure of comprehension is generally well understood, but perhaps you should learn the basics of reading before spewing your bilge.

From the article:

"Some engineers at Google decided to figure out if there was [sic] a way to further compress lossy images like JPEG to make them load faster,"

"Google has tested the format by re-encoding 1,000,000 existing web images, mostly JPEGs, GIFs, and PNGs, and it saw a 39 per cent reduction in average file size."

2
0
Pirate

@James Hughes

yes, I read it. See my response to bilgepipe. Now perhaps there is more to Google's idea than what was distilled into the article, but judging by the article, it looks like Google is trying to further compress JPGs.

0
0
Stop

Hasty commentarding?

@Bilgepipe & James Hughes 1

I suspect he was referring to people using the command-line app mentioned in the article to convert existing JPGs to this new format; in which case the already-lossy pictures will lose even more detail.

i.e. think twice before converting your JPG pr0n collection to this new format, just to save space...

0
0
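
The generation-loss point is easy to verify. A rough sketch with Pillow and numpy, using JPEG-to-JPEG recompression as a stand-in for JPEG-to-WebP conversion (file names and quality settings are made up; the principle, not the numbers, is the point):

```python
import numpy as np
from PIL import Image

def psnr(a, b):
    """Peak signal-to-noise ratio between two 8-bit images, in dB."""
    mse = np.mean((a.astype(float) - b.astype(float)) ** 2)
    return 10 * np.log10(255.0 ** 2 / mse)

ref_img = Image.open("photo.png").convert("RGB")  # hypothetical lossless source
ref = np.asarray(ref_img)

# First lossy generation (what the camera or website already did):
ref_img.save("gen1.jpg", quality=75)
gen1 = np.asarray(Image.open("gen1.jpg").convert("RGB"))
print("gen1 PSNR: %.2f dB" % psnr(ref, gen1))

# Second lossy generation (re-encoding the already-lossy file):
Image.open("gen1.jpg").save("gen2.jpg", quality=75)
gen2 = np.asarray(Image.open("gen2.jpg").convert("RGB"))
print("gen2 PSNR: %.2f dB" % psnr(ref, gen2))  # typically lower than gen1
```
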
Silver badge

@ Pirate Dave

There are two things with YouTube. Firstly, YouTube transcodes your input into its own format (another level of quality loss), and it seems to transcode to a lowish (800kbit?) bitrate, so you can see on-screen that the colours are blotchy and flattened. I know from having uploaded a 1800kbit XviD from a 2500kbit H.263 source recorded from HQ video using a Neuros OSD.

However, the truly terrible videos are from cheap cameras and mobile phones. I'm not sure there's an excuse as my small Agfa digital can do decent looking MJPEG video at something like 820x560 (shame the sound recording is awful). My Nokia 6230i, on the other hand, is just awful. I can barely tell the difference between its 3GP high quality and the low quality, and given the blocky mess that is the result, I'm not sure the word quality even factors into it.

In short, while YouTube introduces its own problems, most of the cack on YouTube looked like that before YouTube got its hands on the video!

FWIW, I think us old-timers (who remember what a clear analogue picture looked like) will always be slightly disappointed. Yeah, it's cool that I can watch the video of my choice in realtime from places like YT and Vimeo. It's cool that I can store nearly 200 hours of video on DVD-Rs standing in a pile as tall as a single L750 tape (3h15m). It's cool that we can now have a billion channels with nothing worth watching on any of them. And it's cool that we can fill a ridiculously large screen with a picture with sufficient resolution that you don't see the individual pixels. The flip side? If you know what an MPEG macroblock artefact looks like, you'll see them EVERYWHERE. Blu-ray/HD suffers horribly from this, judging from the demos in supermarkets.

3
0
Silver badge

Easy mistake

One quote in the article does make it sound like this would be an additional layer of compression for JPEG.

"Google decided to figure out if there was [sic] a way to further compress lossy images like JPEG to make them load faster"

I nearly came to the same conclusion when I first read it.

0
0
Pirate

@heyrick

well, my point about YouTube wasn't so much about the actual quality of the clips as it was that such a level of quality is apparently considered "acceptable" to a large part of the Internet. My kids watch videos on YouTube that are barely discernible. It's like 20 years ago when we had those crappy Autodesk Animator flicks. Then we got Quicktime and MPEG videos, and things were much better. Now YouTube and Flash are lowering the standards again.

0
0
Silver badge

@ Pirate Dave

I think the level of "acceptable" is a trade-off between available bandwidth and how much you want to see the content. Don't forget that services such as YouTube are designed for in-situ viewing (which is why add-on software is required for downloading from such services). Because of this, YouTube needs to choose something of acceptable quality which isn't going to saturate your connection. I'm on a 2 mbit link and most things non-HD come through in real-time (my little netbook can do HD video, just not *H.264* in HD, too intensive).

But is this new? Remember in the good old days it was "acceptable" to use a VHS-C video camera to record a ciné screen (and many of them were fixed at a sync rate different to that of cinema projection, leading to flickering) and, no, you didn't watch that tape. If you were lucky you saw a copy of the copy of that tape, which was so degraded it was barely discernible. What's new is that back then you needed to be friends with the video shop guy. Now it's open to anybody who is able to use a web browser and type the immortal words "cute kittens". :-)

Perhaps in the future, when we *all* have 100 mbit connections (I won't hold my breath!), we'll see a return to video of a better level of quality; but given that minority satellite channels are choosing lower bandwidth per channel in order to squeeze in more channels, I won't hold my breath for that either...

0
0
Gold badge

Did you read the linked blog?

Actually, it *is* another layer of compression. From the blog...

"We expect that developers will achieve in practice even better file size reduction with WebP when starting from an uncompressed image."

I read your comment and agreed with it before I read the blog, so I'm as surprised as you are, but I suppose this makes sense in context. After all, in the majority of cases, web sites no longer have the uncompressed image, so "Can it squeeze my existing JPEGs?" is a fair question.

The blog also links to a gallery of comparison images: http://code.google.com/speed/webp/gallery.html. (No Lena, for copyright reasons apparently.)

0
0

Compressing JPEGs

"Reading comprehension Fail. You're not compressing JPEGS - JPEG has nothing to do with this."

Oh really? Are you saying that all your current photos are in WebP and your camera(s)/mobile phone(s) produce photos in the WebP format? If not, then it may not be Pirate Dave who has been afflicted with a lack of comprehension.

0
0
Coat

MPEG macroblock artefacts

@heyrick: "If you know what an MPEG macroblock artefact looks like, you'll see them EVERYWHERE. Blu-ray/HD suffers horribly from this, judging from the demos in supermarkets."

Err... HD DVD almost exclusively used Microsoft's VC-1 codec, and occasionally H.264; not a disc in MPEG-2.

Blu-ray is almost exclusively H.264 for movie encoding; it was only the really early discs that used MPEG-2 extensively.

So I'm having a bit of trouble seeing why you'd get MPEG-2 macroblock artefacts, when these formats use variable block sizes... maybe you're seeing them EVERYWHERE when they don't really exist. ;)

0
0
Silver badge

He said MPEG, not MPEG-2

And H.264 is also known as MPEG-4 AVC, which means H.264 is still MPEG (MPEG-4 Part 10).

0
0

title

Looking at the sample images, the ones where the new format sees the biggest improvements are those with large areas of solid colour or simple gradients (or close enough).

I guess that isn't too surprising since JPEG essentially treats each 16x16 pixel block independently, so there are easy wins for any format that takes a more high level view.

0
0
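
A nitpick on the above: baseline JPEG's transform actually runs on 8x8 blocks (16x16 is the size of a full MCU under 4:2:0 chroma subsampling), but the independence point stands, and it is why flat areas are cheap wins for newer formats: VP8, which WebP is derived from, predicts each block from its already-decoded neighbours before transforming the residual. A minimal numpy sketch of JPEG-style independent block transforms:

```python
import numpy as np

def dct_matrix(n=8):
    """Orthonormal DCT-II basis matrix, as applied to JPEG's 8x8 blocks."""
    k, j = np.arange(n)[:, None], np.arange(n)[None, :]
    m = np.cos(np.pi * (2 * j + 1) * k / (2 * n)) * np.sqrt(2.0 / n)
    m[0, :] /= np.sqrt(2)
    return m

D = dct_matrix()
img = np.full((16, 16), 128.0)  # a flat grey region spanning four 8x8 blocks

# JPEG transforms each 8x8 block on its own; neighbouring blocks cannot
# share information, even when their content is identical:
for by in range(0, 16, 8):
    for bx in range(0, 16, 8):
        block = img[by:by + 8, bx:bx + 8] - 128  # level shift, as JPEG does
        coeffs = D @ block @ D.T                 # 2-D DCT-II of this block
        # Every block is coded separately, even though all four are identical.
        print((by, bx), "nonzero coefficients:", np.count_nonzero(np.round(coeffs)))
```
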

Great, except

40% smaller? Fantastic. Except that if we used 'Weppy' images on my site we'd need MORE space, because we'd be storing both JPEG and 'Weppy'. It's going to be a long, long time before anyone can retire JPEG.

5
0
Stop

Nah, wait till Opera/Firefox/Safari/Chrome

support it in 6 months or so, then leave IE users with a broken image.

That's the reverse of what bad websites have been doing for years, getting it working in IE and calling it a day.

4
0
Stop

Space, yes...

...bandwidth, no.

0
0
Silver badge
Linux

Yeah But ...

Try explaining this to ordinary website users who cannot distinguish "The Internet" from that "Big Blue E", and see how long you last delivering IE users a broken site.

0
0
Happy

@Miek

No, just do a browser sniff and if it's IE then display an upgrade link to Firefox or Chrome instead of the image.

0
0

Uh

"No, just do a browser sniff and if it's IE then display an upgrade link to Firefox or Chrome instead of the image."

Unfortunately not an option, because we want our web site to actually make money. Shame.

2
0
Stop

Isn't there something about supported formats in the HTTP header?

And couldn't you just transcode on the fly and send whatever format the client says they can handle?

0
0
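
There is: the HTTP Accept header, which lists the MIME types a client can decode, exactly as the commenter suggests. A sketch with Python's standard library; it serves pre-converted files rather than transcoding on the fly, and assumes a client that advertises image/webp (file names hypothetical):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class ImageHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Content negotiation: send WebP only to clients that say they
        # can decode it; everyone else gets the JPEG fallback.
        accepts_webp = "image/webp" in self.headers.get("Accept", "")
        path, ctype = (("cat.webp", "image/webp") if accepts_webp
                       else ("cat.jpg", "image/jpeg"))
        try:
            with open(path, "rb") as f:
                body = f.read()
        except FileNotFoundError:
            self.send_error(404)
            return
        self.send_response(200)
        self.send_header("Content-Type", ctype)
        self.send_header("Content-Length", str(len(body)))
        self.send_header("Vary", "Accept")  # keep shared caches format-aware
        self.end_headers()
        self.wfile.write(body)

HTTPServer(("", 8000), ImageHandler).serve_forever()
```
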
Silver badge

Re-encoding PNGs?

"Google has tested the format by re-encoding 1,000,000 existing web images, mostly JPEGs, GIFs, and PNGs, and it saw a 39 per cent reduction in average file size."

Well it wouldn't surprise me if applying lossy compression to 24bit PNGs resulted in a reduction in file size. It would surprise me if it didn't.

How about you give us something unambiguous Google?

6
1
Gold badge

Re: Re-encoding PNGs?

Maybe even worse than that. The weasel word in that quote is "mostly". You don't need more than a small percentage of those 1,000,000 images to have started life as BMPs to make a 39% *average* reduction look distinctly unimpressive.

3
0
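
Made-up numbers showing how easily that happens: suppose only a fifth of the corpus started life lossless (PNG/GIF/BMP), and lossy re-encoding "saves" 90 per cent on those. The headline average lands near Google's 39 per cent even if real JPEGs only shrink by a quarter:

```python
# Hypothetical corpus mix; all numbers invented for illustration.
jpeg_share, jpeg_saving = 0.80, 0.25          # modest win on actual JPEGs
lossless_share, lossless_saving = 0.20, 0.90  # lossless -> lossy: huge "win"

average = jpeg_share * jpeg_saving + lossless_share * lossless_saving
print(f"headline average saving: {average:.0%}")  # 38%, from a 25% real gain
```
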
WTF?

See message body

Agreed. It'll be interesting to see this in independent hands.

The "mostly JPEGs, GIFs, and PNGs" is meaningless: re-encoding a JPEG is daft, GIFs and PNGs are solving a different problem (lossless), and "mostly" just undermines the whole thing.

Show us a JPEG and WebP of an actual RAW photo with filesize comparisons.

I'm not a naysayer, but their use of stats is awful.

Anyone else feel like we're back in the 90s, with the image format wars... fun times.

1
0
