Great, now every JPEG device is defunct.
Time to spend money on new stuff.
Anyone want to buy a Zune? How about a Betamax recorder? Laser Disc anyone?
I fall for it every time.
Boffins have devised a new way of squishing data which could herald the end of the trusty JPEG picture format. A team at the University of California, Los Angeles, claim their new compression technique is much more efficient than the olden-days ways. As well as being useful for crushing images or streaming video, the new …
Uhh, at what point was PNG, a lossless format, supposed to kill JPEG, a lossy format? They serve totally different purposes. And PNG has pretty much killed GIF, the intended target, thankfully; no more 256 colours and binary transparency. Apart from the annoying animations, that is.
Actually, before PNG there was JPEG2000 (patented, which is why it never took off) and various "fractal compression" schemes.
The point is, JPEG is good enough and everybody has it implemented. The Web isn't slow because JPEG isn't highly efficient. The Web is slow because of idiotic web developers writing hugely complicated pages and spending more bandwidth on tracking cookies... while putting images onto a new, shorter domain to save bandwidth, ignoring the time the DNS query takes to complete.
Do you think JPEG wasn't a mess of dubious (and less dubious) patent claims? No doubt, with the way the USPTO will allow patents for 'a number on a device', this will attract every turd patent-trolling company in the USA with spurious 'an operator that changes a number' patent claims.
In many ways the best future for this might be to knock out a few hundred patent claims and let institutional buyers do the funding and fighting whilst granting free licence to noncommercial use, or similar.
> Do you think JPEG wasn't a mess of dubious (and less dubious) patent claims?
JPEG was supposed to be open, but patent trolls still managed to buy patents and use them to extort hundreds of millions of dollars from dozens of hardware companies shipping JPEG-capable products. I'm not aware of any patent trolling using JPEG patents since 2008, though.
OTOH Google could acquire it and adapt it into the WebP standard, make it Android-standard, and hold the patents under generous terms (basically saying they will only pursue if a rival firm like Apple tries to squelch them, like what happened with the WebM stalemate). Google, after all, doesn't have to be a troll to make money; the "side business" works well for them.
PNG had a slow start but is now more common because browsers can render it. JPEG 2000 has also struggled because of the lack of tooling: despite its massive advantages, browsers don't have a clue what it is, and even plugins don't take advantage of its "only get what you need to show" nature. JP2 will keep struggling until server technology and browser capabilities align. There is an IIIF initiative ongoing, so maybe in a few years we'll see a lot more JP2s; they are being used in many big digitisation projects and in medical imaging. If you want to get a new format established these days you need to start with the browser: get it there and you have removed all the obstacles, as tooling and adoption will follow.
Which was why I mentioned Google. They're in a unique position to be able to (a) simultaneously saturate both the browser and mobile markets with the tech thanks to Chrome and Android, respectively, and (b) not really the type to patent troll: only using patents defensively and getting their revenues in other ways.
JPEG2000 is now largely irrelevant due to the advances in image compression made by the video industry in h264, webp, and even more with the new UHD stuff*. But JPEG2000 didn't take off because it was encumbered in patents from the start.
*Mozilla recently ran a comparison of alternatives to JPEG: http://people.mozilla.org/~josh/lossy_compressed_image_study_october_2013/
WebP is my favourite at the moment because it also allows for lossless compression where required, making it suitable for both photos and text. It's just missing a "file-in-file" approach to handling responsive images.
Yes, but JPEG2000 is required for PDF 1.5, which was published back in 2003, so anyone with a modern PDF reader has a JPEG2000 decoder. JPEG2000 is often found in PDFs generated by commercial software. However, I'm not sure if there's any widely-used free software that generates JPEG2000 files.
> Yes, but JPEG2000 is required for PDF 1.5…
The patent trolls like free-to-read, pay-to-write specs: GIF, JPEG2000, MPEG-2, h.264, etc. The free-to-read model encourages adoption by consumers but actually restricts the market by using licence fees to restrict new entrants to the market. However, the WWW is one of the best examples of allowing a market to thrive by keeping specifications open and free. Yes, it's not been without its problems, with the industry packing committees either to push their interests or prevent innovation from others.
Even if it could store megapixel images in a single byte it would never displace jpeg, because saving space or bandwidth for images is a problem that no longer exists in today's world. The inevitable patents, and even if made freely available, inevitable patent trolls who will claim patents on various things it does, make switching from jpeg to something new not worth whatever storage/bandwidth could be saved.
Now if it has significant benefits over HEVC for video (i.e. at least a 2x improvement at the same quality) then I could see it having a chance of gaining traction there. The key would be to figure out how to get it built into hardware, because without hardware support a video codec has no chance.
"Even if it could store megapixel images in a single byte it would never displace jpeg, because saving space or bandwidth for images is a problem that no longer exists in today's world. The inevitable patents, and even if made freely available, inevitable patent trolls who will claim patents on various things it does, make switching from jpeg to something new not worth whatever storage/bandwidth could be saved."
Thing is, if you ALREADY have patents for the tech when the trolls come knocking, you can use them as a defense and threaten a patent war. That's what Google did against MPEG-LA concerning VP8 tech. With defensive patents, you can threaten to invalidate the troll patents, and if your primary mode of business isn't patent-related, you have more lenient winning conditions than the trolls: all you have to do is not lose, giving you an advantage if the fight goes to court as a mutual nullification doesn't hurt you.
I think that's one reason PNG was accepted over GIF: when the LZW patent used by GIF was enforced, sentiment swung towards PNG, which used the unencumbered DEFLATE algorithm (trading off animation for full RGB colour support).
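Side note for anyone curious what "Deflation" refers to: the algorithm PNG standardised on is DEFLATE (RFC 1951), which Python exposes through the standard zlib module. A minimal sketch of the lossless round trip, with made-up sample data purely for illustration:

```python
import zlib

# DEFLATE (via zlib) is the royalty-free algorithm PNG adopted instead of
# GIF's patent-encumbered LZW. A quick round trip shows why "lossless"
# matters: decompression recovers the input byte-for-byte.
data = b"banding banding banding " * 100  # repetitive data compresses well
compressed = zlib.compress(data, level=9)
restored = zlib.decompress(compressed)

assert restored == data           # lossless: exactly what we put in comes back
print(len(data), len(compressed)) # the repetitive input shrinks dramatically
```

Because DEFLATE was published royalty-free, anyone could ship a PNG encoder without paying a licence fee, unlike GIF's LZW at the time.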
Oh rly? The typical webpage loads at less than half the bandwidth my ISP allows. Why? Server-side bottlenecks.
The typical camera has an inherent maximum FPS capability. Why? Memory writing bottleneck.
The typical flash chip can't double in density many more times. Why? Process shrink limitations on write cycles.
Space and bandwidth are very much today's and tomorrow's largest concerns, with the amount of data humans produce growing faster than our storage densities increase. Cameras are popping up everywhere; even kids riding their bicycles and regular folks driving to work are recording video. It used to be SD, then HD, and soon enough 4K. It all adds up. Those are video rather than static images, but the same type of compression tech applies.
Then there's wifi. More and more things connecting together means reduced data rates are quite desirable. Certainly the lowly JPEG pics are only one small factor, but a popular enough one that it should not be ignored.