WTF is... H.265 aka HEVC?

When Samsung unveiled its next-generation smartphone, the Galaxy S4, in March this year, most of the Korean giant’s fans focused their attention on the device’s big 5-inch, 1920 x 1080 screen, its quad-core processor and its 13Mp camera. All impressive of course, but incremental steps in the ongoing evolution of the smartphone. …

COMMENTS

This topic is closed for new posts.
  1. Tom 38
    Devil

    GPU encoding

    Current GPU encoding of H264 is of shocking quality. It might go fast (then again, it might not), but it regularly produces daft encoding results:

    http://www.extremetech.com/computing/128681-the-wretched-state-of-gpu-transcoding

    http://www.behardware.com/articles/828-1/h-264-encoding-cpu-vs-gpu-nvidia-cuda-amd-stream-intel-mediasdk-and-x264.html

    Quote from one of ffmpeg's developers:

    In general, developers believe that you generally get slower encoding with worse quality if you are not using the CPU. … The typical case is a very fast CPU with a GPU that encodes slower at a significantly worse quality.

    1. Dave 126 Silver badge

      Re: GPU encoding

      Hiya Tom, I had a look through those articles, and if I'm reading them right, their issues seem to be with the current (Windows) software tools that aim to harness discrete GPUs (and newer Intel solutions) for encoding. A common issue seems to be silly quirks, such as defaulting to near 30fps when the input is 24fps, inconsistent hardware support, and the need to delve into complicated options dialogues (which rather defeats the object of buying a solution sold on its ease of use).

      I guess a problem is that, unlike other forms of GPU-assisted computing such as mechanical simulation, the result of video encoding is subjective and human-specific; the end viewer is far more likely to notice aberrations on an actor's face than they are on a terracotta vase prop.

      1. James Hughes 1

        Re: GPU encoding @Tom38

        Not all GPUs are bad at this - I work with a mobile one every day and I have never seen anything nasty from it. Many of the problems described, though, are nothing to do with the HW, rather the software used to drive the HW, so not actually a GPU problem.

        1. Tom 38

          Re: GPU encoding @James Hughes

          Yep, this is true. The problem is that GPUs will be sold with this feature to consumers, and they will act like the video encoders in today's set of hideously expensive graphics cards - poor quality and poor speed. With GPUs, this is without doubt due to "good enough" implementation of the software.

        2. Anonymous Coward
          Anonymous Coward

          Re: GPU encoding @Tom38

          Not all GPUs are bad at this

          Indeed not---it's definitely a problem with the Windows software (probably exacerbated by the lack of standard interfaces across GPUs). The Raspberry Pi's GPU and OpenMax IL implementation produces excellent quality output with no artefacts. I'd read the "sorry state" article and as a result I wasn't expecting GPU transcoding on the Pi to be as good as software-based transcoding, but I was pretty amazed by the results. Fast (100fps for SD MPEG-2 to H.264), and with no visible artefacting.

          You do need to buy an MPEG-2 codec for the Pi, but it's dead cheap and worth it, considering how easy it becomes to transcode DVDs and recordings off PVRs that can only save as MPEG-2. The transcoding tool itself is at https://github.com/dickontoo/omxtx.

          1. paulll

            Re: GPU encoding @Tom38

            Trying to work out whether you're trolling or just don't know what you're talking about ...

            "Fast (100fps for SD MPEG-2 to H.264), and with no visible artefacting."

            Output res/frame rate? Output bitrate? No artifacting when viewed on what? What size was your sample audience?

  2. Peter Gathercole Silver badge

    Ah, another patent encumbered format

    Let's hope that MPEG-LA are more generous about the licenses, although I would be surprised if they were.

    And to pre-empt people who say that H.264 was freely available, I suggest that you look at the commercial encoder and decoder volume-distribution clauses in the license agreement.

    1. JDX Gold badge

      Re: Ah, another patent encumbered format

      I've looked at the H.264 license when we developed software using it and it hardly seemed crippling. I also see no particular reason why something that represents man-decades of work shouldn't require payment as long as that payment is reasonable.

      The roots of the OSS movement, Stallman at least, were NOT about software being free but about it being open source so you could buy/license software and get the source-code in case you needed to alter it. That model of OSS is better in my view than the "software shouldn't cost anything" view most take now... I have no problem paying for software.

      1. Peter Gathercole Silver badge
        Unhappy

        Re: Ah, another patent encumbered format

        I'm perfectly happy with software being paid for running on Open Source platforms, but patenting the codecs such that you can't legally provide them as part of a free (as in free beer) OS puts huge amounts of leverage against projects who want to provide a free OS.

        The problem is that if you stay within the law, and don't ship what may become the de-facto standard for video, then something like Linux will always be seen as not for general consumption.

        Alternatively, if you ship the codecs as part of a distribution regardless, so that the experience to the end user is good, then MPEG-LA can then demand a payment from you. You have no revenue stream because you are providing the software for free, and cannot pay unless you are Mark Shuttleworth (who paid for an H.264 distribution license for Ubuntu, and got heavily criticised for it).

        The problem clauses in the H.264 licence are the volume clause, which says something like the distributor has to pay a licence fee per copy deployed if they ship more than 10,000 copies, and the one that says you have to pay a fee if you use the encoder to produce commercial videos.

        Bearing in mind how viral Linux distributions can be, how do you measure how many times it has been deployed? I download one install image, use it to install thousands of systems, and offer re-distribution from my web site. How is that measured? And who should pay?

        And what qualifies as commercial? If one of my kids records the neighbour's cat doing something comical and uploads it to YouTube, and Google attaches adverts, is the video for commercial purposes? Should I pay for the encode? Should Google, even though they may not have encoded it?

        Licensing like this has been a legal minefield for Open Software since the days of MPEG Audio Layer 3 (aka MP3) and GIF. My point is that it would be so much better if the codecs (or even just the algorithms) were available under a permissive license.

        1. Rampant Spaniel

          Re: Ah, another patent encumbered format

          Honest question, can't you just reverse engineer your own version of the codec? Isn't that how it's currently done anyway?

          1. James Hughes 1

            Re: Ah, another patent encumbered format @ Spaniel

            There are loads of reverse-engineered implementations - but they all, by definition, infringe on the MPEG-LA patent pool somewhere.

            That said, the H264 licence is pretty good value (you get a paid-for one on the Raspberry Pi, and that's pretty cheap). What's weird is that a lot of much simpler audio codecs are a lot more expensive.

            1. Peter Gathercole Silver badge

              Re: Ah, another patent encumbered format @ Spaniel

              I got it wrong. It's 100,000 units, not 10,000.

              This is a quote from the MPEG-LA H.264 License terms summary.

              For (a) (2) branded encoder and decoder products sold on an OEM basis for incorporation into personal computers as part of a personal computer operating system, a Legal Entity may pay for its customers as follows (beginning January 1, 2005): 0 - 100,000 units/year = no royalty (available to one Legal Entity in an affiliated group); US $0.20 per unit after first 100,000 units/year; above 5 million units/year, royalty = US $0.10 per unit. The maximum annual royalty ("cap") for an Enterprise (commonly controlled Legal Entities) is $3.5 million per year in 2005-2006, $4.25 million per year in 2007-08, $5 million per year in 2009-10, and $6.5 million per year in 2011-15.

              All rights to this text belong to MPEG-LA (just a disclaimer to avoid any copyright issues)

              So, 20 cents for every shipped copy between 100,000 and 5,000,000, and 10 cents after that, up to a maximum annual cap of $6.5 million at the current (2011-15) rate. That's quite acceptable if you are incorporating it into a product costing $20, but not so good if you want to include it in a popular free Linux distribution. I wonder whether the fact that you are not 'selling' Linux is enough to get out of the "sold on an OEM basis" part of the clause?
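              For anyone who wants to sanity-check those numbers, here is a quick back-of-envelope calculator based on the tiered reading of the terms quoted above. The function name is illustrative and the cap uses the 2011-15 figure; check the actual licence text before relying on any of it.

```python
def h264_oem_royalty(units_per_year):
    """Rough annual royalty under the quoted MPEG-LA OEM tier.
    Illustrative only - based on the licence summary quoted above."""
    free_tier = 100_000      # no royalty on the first 100,000 units/year
    mid_tier = 5_000_000     # $0.20 per unit from 100,000 up to here...
    cap = 6_500_000.0        # ...then $0.10, capped (2011-15 figure)
    if units_per_year <= free_tier:
        return 0.0
    royalty = (min(units_per_year, mid_tier) - free_tier) * 0.20
    if units_per_year > mid_tier:
        royalty += (units_per_year - mid_tier) * 0.10
    return min(royalty, cap)

print(h264_oem_royalty(100_000))            # 0.0 - a small distro pays nothing
print(round(h264_oem_royalty(1_000_000)))   # 180000
print(round(h264_oem_royalty(10_000_000)))  # 1480000
```

              So a distribution shipping a million copies a year would be looking at roughly $180,000 - trivial for an OEM charging per unit, but real money for a project with no per-copy revenue.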

              1. JDX Gold badge

                Re: Ah, another patent encumbered format @ Spaniel

                I take your point, Peter, but if I make something I want to sell, the fact that you use it in software you choose to give away isn't really my problem - give your software away for free, but I don't want you giving away mine.

                1. Terje
                  Black Helicopters

                  Re: Ah, another patent encumbered format @ Spaniel

                  The problem here, in my opinion, is not that you make something you want to be paid for. The problem is that what you want to get paid for is a standard, i.e. something that others will be more or less forced to use, with no realistic option to use anything else. TV is not broadcast in multiple formats so you can choose the one you want; it is broadcast in one format, take it or leave it.

                  1. Anonymous Coward
                    Anonymous Coward

                    Re: Ah, another patent encumbered format @ Spaniel

                    "The problem is that what you want to get paid for is a standard,"

                    Such as the Compact Disc, DVD, Blu-ray, CPUs, GPUs, USB, FireWire and on and on and on....

                    "TV is not broadcast in multiple formats so you can chose the one you want, it is broadcast in one format take it or leave it."

                    Yup, and the software and hardware to use it is splattered with patents.

                2. Peter Gathercole Silver badge

                  Re: Ah, another patent encumbered format @JDX

                  But it is not just the software. In your comments, you're assuming that the people who write an alternative implementation have ripped off your code.

                  This is always about software patents, not the code itself. OK, you write a nice implementation, your code should be protected, I totally agree. But the algorithm used should be open, so that someone can provide an alternative implementation. If you can prove that they copied your code, then I would support you suing them through every court in the land. But if they wrote their own, through their own efforts.....

                  It's a serious dilemma, I admit, but as long as we have ambitions to produce a truly free operating system suitable for everybody, we will have these problems.

                  I could (although with some reluctance) support going to a model where the OS is free, but the licensed codecs you need have a reasonable cost associated with them (in line with the H.264 charges, which seem entirely reasonable). That is how it was in the early Windows days (remember when Windows could not play media out-of-the-box and you had to buy software to play music and video), although too many people just copied MP3 and DVD packages on Windows.

                  If we go to this model, it should be clear that this is the case to users of all operating systems, and maybe other OS vendors should be prevented from providing the software as part of their OS offerings. But this has not exactly been a totally successful strategy in the Browser rulings.

                  And as long as the alternative implementations abide by the rules on the use of LGPL toolchains, this should not fall foul of any open-systems licensing, either.

                  1. h4rm0ny

                    Re: Ah, another patent encumbered format @JDX

                    "This is always about software patents, not the code itself. OK, you write a nice implementation, your code should be protected, I totally agree. But the algorithm used should be open, so that someone can provide an alternative implementation."

                    This is worth examining. The above would be correct if the effort and work producing this were on the coding side, but it is actually largely on the algorithm side. I haven't looked at the algorithm, but I am a C++ programmer (or I was for some years) and I have some background in mathematics. I'm not at all trivialising the work that goes into implementing this, but if I look at the algorithm, my educated guess is that it wouldn't be that hard for me to turn it into code - it would just take a little time because I'm rusty. But could I come up with the algorithm? I doubt it. I understand the principles detailed in this article and I dare say I could follow a more detailed version too, but my maths simply is not good enough to have done what these people did, nor do I have the large amount of time and effort these people were paid to put in.

                    What I'm saying is that your suggestion that the code is what needs protecting - that "ripping off" involves copying the code - mistakes where the effort takes place, and thus where the protection should occur.

                    If someone creates a computer game where I am a guy running round shooting aliens in first-person view, well, that's a simple idea, but the code will be huge and complex. Thus copyright law prevents me just copying it and calling it mine, but I can freely make my own version. If someone creates a complex series of sophisticated algorithms for video encoding/decoding, then the idea is the complicated part, but the implementation will be (relatively) simple, in that I'm just taking the maths and turning it into code (with some parallelisation if I want it to be a *good* implementation). Thus the latter case isn't looking to copyright law to ensure the creators are fairly recompensed, but to patent law.

                    As you said at one point, the problem is that it becomes a standard. There are only three ways out of that. Either

                    * An Open Source alternative is created that is as good as the proprietary one.

                    * Users pay a very small sum to licence it directly.

                    * Someone pays it on behalf of the users.

                    The first has not happened, unfortunately. That would be the ideal.

                    The second would probably be the fairest option, but it requires more prevalent and easy micro-payments amongst users, so it's really a solution for five years down the road. Though you can do it with some success today, so I would advocate this one.

                    The third is all nice and lovely, isn't it? In practice, it probably means Google showering you with ads and mining your data, as free services usually do. Though Ubuntu maybe has enough revenue that they will do this in their case, it doesn't help the rest of the distros.

                    What isn't an option, imo, is simply throwing out the patent protection and saying you can just give other people's efforts away for free. The licensing terms are actually already quite generous, in that you can give it away 100,000 times before it is an issue. But if you are making money from other people's work (and Ubuntu *is* a business, as are others), then surely those others should have a right to recompense. I mean, I actually could legally roll my own h4rm0nix distribution (you heard the name first here) and legally distribute the codec with it up to 100,000 times. That's pretty cool. But move to a large business like Ubuntu and it's a different story, imo.

              2. Anonymous Coward
                Anonymous Coward

                Re: Ah, another patent encumbered format @ Spaniel

                Isn't there (supposed to be) an Ubuntu store? Why not just sell patented codecs in there? The guys at Ubuntu would probably be willing to write up the code and toss it on the store for 25¢-$1 to cover licensing and other costs. I wouldn't worry about people torrenting the app, since with the current situation, anyone that wants to use h.264 on BSD/Linux already uses an unlicensed codec.

                Perhaps Valve/Steam would be willing to set that up for a more general Linux audience.

                Feel free to pass this idea on to anyone actually interested in implementing it.

        2. Frank Bough
          Meh

          Re: Ah, another patent encumbered format

          Surely your graphics accelerator already includes a license to transcode from MPEG-LA? Simply pass your AV data to your legitimate hardware CODECs.

          1. Peter Gathercole Silver badge

            Re: Ah, another patent encumbered format @Frank Bough

            That's probably true now, although as standards progress, it means that you keep having to update your adapter (or phone or tablet) every time a new codec becomes 'standard'.

        3. alain williams Silver badge

          Re: Ah, another patent encumbered format

          But I thought that software patents were not (yet) enforceable in Europe... so we could write and use codecs for free; it would just be those in the USA (& a few places) who could not.

      2. Yet Another Anonymous coward Silver badge

        Re: Ah, another patent encumbered format

        "I also see no particular reason why something that represents man-decades of work shouldn't require payment"

        But it's an international standard and requires payments even if you do all the work yourself.

        The meter required decades of work by French astronomers - but you don't have to pay the French government if you use the international standard today

  3. Buzzword

    Freeview

    For the UK consumer, the timeline looks like this:

    1996: H.262 (MPEG2)

    1998: ONdigital, which became ITV Digital

    2002: Freeview, from the ashes of ITV Digital

    2003: H.264 (MPEG4)

    2009: Freeview HD

    2013: H.265

    Given the current track record of taking six years from tech spec to product launch, can we pencil in the launch of Freeview 4K or Freeview HD+ for 2019?

    1. mccp

      Re: Freeview

      As the article points out, you're more likely to see H.265 used to broadcast Freeview HD than Freeview 4K as you would be able to fit twice as many channels into a Freeview multiplex. In fact, the terrestrial TV broadcast frequencies are so valuable for mobile applications, it wouldn't surprise me in the slightest if the 2019 row in your table was:

      2019: Freeview switch off, viewers migrate to FreeSat.

  4. Anonymous Coward
    Anonymous Coward

    I would like to register the following names, they might be up for sale if the price is right:

    HDave

    H Dave

    HiDave

    HD ave

    HDDave

    and all of the above "+1" and "+2"

    1. Dave 126 Silver badge
      Happy

      Sod off! : D

  5. Gordon Pryra

    Patients?

    H264 was ruined by the patients held against it. (mainly by Intel? though I could be wrong on this)

    I assume that H265 will be the same; it may as well be proprietary for all the good it will do the community as a whole.

    Not such a problem as Samsung are the company I would be buying from anyway, but for stuff like applications that use it, or could make use of it, it may carry the same issues as the old format. Just too expensive for a lot of people to bother developing on.

    1. Tom 38
      WTF?

      Re: Patients?

      If H264 was so ruined by patents, how come it's the dominant video codec used in broadcast video, blurays, internet video, webcams, digital camcorders, "the scene".....

      PS - 'Patents' not 'Patients'.

      1. Rampant Spaniel

        Re: Patients?

        All those patents aren't worth jack if you price it at a level where it's cheaper to use an alternative. Given this is a standard, patents should be available under FRAND terms, and there will probably also be a patent pool. Patents didn't kill MPEG-2 or h.264, nor will they kill h.265. Now, what they did to Memory Sticks, that's another story, but I think that was semi-intentional.

        On another note, UHD is not 4k x 2k; it is UHD, and 4k is 4k. Sorry :-) They are actually at least two distinct resolutions (4k actually does have a couple of flavours, neither of them tastes of UHD), but it's a minor point. I shall climb back under a rock now lol

        This is good. I can see it being adopted quickly by the likes of Netflix, Amazon (Prime), Hulu et al, as it should save them money, and it's a selling point for phones. The $1-$2 (probably lower in bulk) in royalties it will add to a handset you should recoup in reduced data costs as a user. I remember reading some earlier documents on h.265 which said they aimed for it to be less processor-intensive than h.264. Quite whether that took into account the rate at which processors advance I'm not sure, but given that relatively low-end ARM chips should be able to chew this stuff once it's hard-coded in, I can't see it being the monster that Blu-rays were initially, where the players needed a relatively high amount of grunt for the time. This should also go a long way to smoothing the transition to UHD discs; coupled with all the newer, higher-capacity Blu-ray prototypes out there, UHD should be relatively easy to do. Sets are now on the market and getting cheaper, HDMI caught up (no more 4-cable bodges), and in 3-5 years UHD should be about mainstream (maybe $1200 sets, $250 players, $30 Bluray 2.0 discs, streamable on 10mbps - that kind of level). Let's see if it sticks better than 3D :-)

        1. Jan Hargreaves
          WTF?

          Re: Patients?

          It's free for up to 100,000 units, 20 cents per unit up to 5 million units and 10 cents after.

          You are saying this is too expensive? What do you think they should be charging?

          1. Rampant Spaniel

            Re: Patients?

            If you mean me, no I was saying the opposite, that the pricing isn't an issue unless you want everything free. The pricing seems to be set at very sane levels.

            1. Jan Hargreaves

              Re: Patients?

              Ahh, fair enough then. The pricing seems extremely reasonable to me, and I was somewhat surprised, considering the level of abuse MPEG-LA receives from freetards / the VP8-brainwashed.

              The previous suggestion of a 20-cent (ish) add-on in the Ubuntu store seems very sensible.

              1. Rampant Spaniel

                Re: Patients?

                Sorry, my fault for not being clear :-) I was attempting to say that by adding your patents to a standard, you are losing, to some degree, control of the sale of the patent. In doing so you are saying: I am choosing a lot of units sold at a low price over fewer, higher-priced sales and the ability to control its sale, a la Apple vs Samsung. You cannot withhold sales to stomp on a competitor, as its adoption as a standard forces people to need it.

                Not only do they have to sell their patents, they pretty much have to do it at a sane price to everyone (even companies they don't like), otherwise you get no sales. If they scuppered h265 with a silly price then WebM's VP9 or similar would become the dominant codec and h265 would fade into insignificance (like Memory Sticks, but again I think that was semi-intentional to keep them limited).

                Now, it doesn't mean it's free, and it still causes issues for free OSes, but a patent pack for a few bucks covering all your reasonable media needs is a reasonable proposition (at least to me - probably even a few more bucks if it was cross-platform and multi-device). There is also the potential for an ad-supported media player with codecs by someone like Google. Free is nice, but not everything can be completely free in all ways all the time.

  6. Anonymous Coward
    Anonymous Coward

    All those pretty words and not a single mention of WebM

    I'm as excited as anybody about halving the bitrate required for any given quality of video, but there is a powerful, well-positioned, and utterly free competitor also waiting in the wings: WebM's VP9 codec.

    The content industry doesn't really need your help promoting their version, so try to give us a more complete perspective next time!

    1. HMB

      Re: All those pretty words and not a single mention of WebM

      VP8 never stood up against H.264 in quality tests at the same bitrate. We'll just have to wait and see how VP9 does.

      Patent free video codecs are a false economy if you have to pay more in bandwidth and storage than the amount you saved on the codecs.
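      That trade-off can be sketched as a simple break-even sum. Every number below is made up purely for illustration (the royalty, the bitrate penalty, the stream size and the delivery cost); the point is just the shape of the comparison:

```python
# Hypothetical break-even: is a royalty-free codec that needs more
# bitrate actually cheaper than a licensed, more efficient one?
ROYALTY_PER_DEVICE = 0.20   # assumed per-device codec licence fee, USD
BITRATE_PENALTY = 1.2       # assume the free codec needs 20% more bits
GB_PER_HOUR = 1.5           # assumed stream size with the licensed codec
COST_PER_GB = 0.01          # assumed delivery cost, USD per GB

def extra_delivery_cost(hours_watched):
    """Extra bandwidth cost incurred by the less efficient codec."""
    extra_gb = GB_PER_HOUR * (BITRATE_PENALTY - 1.0) * hours_watched
    return extra_gb * COST_PER_GB

# Viewing hours at which the bandwidth penalty outgrows the royalty saving:
breakeven_hours = ROYALTY_PER_DEVICE / (
    GB_PER_HOUR * (BITRATE_PENALTY - 1.0) * COST_PER_GB)
print(round(breakeven_hours, 1))  # roughly 66.7 hours per device
```

      With these made-up figures, a device streaming more than a few days' worth of video has already cost its operator more in bandwidth than the royalty would have - which is presumably why the big streamers care so much about codec efficiency.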

      1. petur
        Meh

        Re: All those pretty words and not a single mention of WebM

          VP8 may be a bit inferior quality-wise, but it is still considered good enough... What kills it is the lack of hardware encoding/decoding support, patent-litigation threats, and H.264 being cheap enough that people don't risk going to the alternative...

        1. Charles 9

          Re: All those pretty words and not a single mention of WebM

          VP8 was just too late into the game. By the time it came out, everyone had already settled on AVC. For Google to get VP9 into this generation's game, it needs to beat H.265 to the punch, probably by adding support for it into Android, and sooner rather than later (meaning rather than wait for Key Lime Pie, incorporate it as far back as possible, into Jelly Bean and maybe even ICS). Doing that will provide a mass of support that will get the content creators to support it more and get the embedded hardware makers on board.

  7. ewan 3
    Thumb Up

    Really good article

    Really enjoyed that - very informative. Nice one.

  8. Pet Peeve
    Boffin

    Math problems

    So, half the bandwidth to support 4 or 16 times the raw data input? There's nothing wrong with the article (on the contrary, it's great content), but this doesn't seem like much of a solution to me. I don't see the average home bandwidth doubling, let alone quadrupling, any time soon.

    Who outside of a movie theater even WANTS 2k or 4k video? Seriously, I'm asking. How much smaller do the dots need to get for home video?
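    To put rough numbers on that maths: all figures here are hypothetical (the 8 Mbps 1080p baseline is an assumption, and real encoders scale better than linearly with pixel count), but the naive scaling illustrates the point that halving the bitrate doesn't cancel out quadrupling the pixels:

```python
# Back-of-envelope: HEVC is claimed to need ~half the bitrate of AVC
# for the same perceived quality. Scale an assumed 8 Mbps 1080p AVC
# stream to higher resolutions, pretending bitrate scales linearly
# with pixel count (real encoders do somewhat better than this).
BASE_1080P_AVC_MBPS = 8.0  # illustrative figure, not a measurement
HEVC_SAVING = 0.5          # the ~50% bitrate saving claimed for H.265

def hevc_estimate_mbps(pixel_multiple):
    """Naive HEVC bitrate estimate for a resolution with
    pixel_multiple times the pixels of 1080p."""
    return BASE_1080P_AVC_MBPS * pixel_multiple * HEVC_SAVING

print(hevc_estimate_mbps(1))   # 4.0  - 1080p in HEVC, half today's rate
print(hevc_estimate_mbps(4))   # 16.0 - 4K in HEVC, still 2x today's 1080p rate
print(hevc_estimate_mbps(16))  # 64.0 - 8K in HEVC, 8x today's rate
```

    So under these assumptions, HEVC makes 1080p cheaper to ship, but 4K still needs roughly double today's 1080p bandwidth, and 8K far more.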

    1. JEDIDIAH
      Linux

      Re: Math problems

      > Who outside of a movie theater even WANTS 2k or 4k video

      I have my own.

      The requirements aren't really even that cumbersome. Get yourself a decent sized room with squared off walls and you're half way there. It doesn't even need to be a dedicated space.

      The necessary gear is cheap and getting cheaper.

    2. James Hughes 1

      Re: Math problems @Pet peeve

      The problem is that this is the State of the Art - there isn't really much else in the pipeline to reduce bandwidth. The compression already available is pretty stunning, and there are limits to how low you can go.

    3. Rampant Spaniel

      Re: Math problems

      2K is very close to HD (within a few percent) and HD seems to be selling well. UHD is just as close to 4k. I would suggest you wait until you can see UHD footage on a UHD screen before making concrete judgements about it. Once the cost falls, there is more content and we can see it in action, there may be more of an argument for it. UHD should be streamable at 8-12mbps for average quality, closer to 20mbps for crisper pictures. Not exactly impossible or out of reach today. My home connection is nothing special but could probably manage a single UHD stream. It currently handles a couple of Netflix HD streams.

      You may be right about people not wanting it, I said the same about 3d, but only after I saw it in action. I've seen true 4k footage and UHD and it's pretty darn awesome but I haven't seen it side by side with HD on the same size screen yet so I'm waiting.

    4. JDX Gold badge

      Who outside of a movie theater even WANTS 2k or 4k video?

      If people think they need 1080p on a 4" smartphone, probably quite a lot of people. The newer iMacs are already at 2K or above and iPad Retina, Chrome Pixel and so on are just about there.

    5. cbf123

      want it for computer monitor

      I would LOVE a decent-sized 4K TV for use as a computer monitor.

      1. Rampant Spaniel

        Re: want it for computer monitor

        They have been around for a while, Barco? and Eizo, they just cost a fair bit :-) Even in the past year it has been 30-40K USD territory.

    6. gujiguju

      Re: Math problems

      @Pet:

      Who wants 2K or 4K video? No one...until they see it, then it's amazing. My guess is we'll keep pushing display technologies until we get close to 3D resolutions of the human eye...

      (Strange that companies waited for Apple to go Retina before others followed with hi-res displays. I guess we can assume Apple will be first with a desktop Retina display, as well.)

      1. Rampant Spaniel

        Re: Math problems

        My understanding (and I could be wrong here) is that detail is still useful beyond the resolution of the human eye. There's a lot of talk amongst photogs who produce fine art and similar of things like micro-contrast: detail that you can only see with a loupe, but that you can tell is there with a normal eye even if you can't actually directly 'see' it. It affects transitions and tonality. Just because you cannot see the dots doesn't mean that they don't play a part in the image you see. How useful is it to you at home? Hell, a 50-inch HDTV isn't something you need anyway, it's something you want; a 50-inch UHD TV is probably just something you need a little less and maybe want a little more.

        We used to muck about trying to fool each other, guessing what camera / lens / film shots were taken on; you can normally easily tell apart digital, film, 35mm and 120 film even on 8x10 prints, which in theory you shouldn't be able to do between 35 and 120.

        As I said, companies have had 4k displays out for a while; you just needed very deep pockets. I'm not sure if Apple will actually be first with a UHD desktop, certainly not the most affordable. I would guess someone like Alienware might, especially if Dell opt to chuck out a 30-40 inch UHD monitor. I wouldn't be shocked at Apple doing it, I'm just not sure how it would be priced given Apple's margins.

        The proof is entirely in the pudding. There are fanbois out there who will buy it on principle. I would strongly suggest people simply do a side-by-side test and see if it's worth your money based on your experience. I don't think there's a wrong answer, as it will be individual to you.

  9. Pet Peeve
    Facepalm

    Yeah, I totally missed that the compression would benefit HD content too. I guess that's a good thing - movie rentals take too dang long to download from itunes, and netflix streaming would certainly benefit from this.

    Carry on then!

    1. JEDIDIAH
      Linux

      The benefits to streaming

      Considering how much of a gap there is between streaming and BluRay, a mere 50% improvement in bandwidth may not be good enough. If you are the sort to opt for the spinny disk instead, it's probably not going to be a worthwhile improvement.

      (too dang long)/2 is probably still (too dang long).

      1. Rampant Spaniel

        Re: The benefits to streaming

        @JEDIDIAH

        True, but it is 50% via compression plus X% via improvements in broadband speed. Your cableco updates to a newer DOCSIS or just upgrades capacity, and all of a sudden you have the extra capacity you need. Our cableco doubled our connection speeds for no extra cost last year. This is one step in the right direction; others will also help :-)

        1. JDX Gold badge

          Re: The benefits to streaming

          Netflix HD (or superHD as they seem to call it now) is really quite good. I'm sure it's not up to Blu-ray quality, but on my 42" TV I never notice any artifacts or anything. So being able to get 2X as much data seems likely to make a big difference for streaming, at least.

          1. Anonymous Coward
            Thumb Up

            Re: The benefits to streaming

            > Netflix HD (or superHD as they seem to call it now) is really quite good.

            Here in Canada, Netflix was pretty good on release, then they downgraded the feeds to an appalling rate because not everyone can get the speeds that I can and they had complaints.

            Just recently, I noticed that I can get proper HD on my Smart telly via a NetFlix app. Initial res is lower when you start to watch a program, but I guess due to bandwidth sampling this improves to full HD after a few seconds or so.

  10. westlake
    Pint

    H.264 and WebM

    H.264 royalties are straightforward.

    If you are not into content production or codec distribution on a commercial scale MPEG LA has no interest in you whatsoever.

    http://www.mpegla.com/main/programs/avc/Documents/AVC_TermsSummary.pdf

    The problem with WebM is that it is --- for all practical purposes --- nothing more than a distribution codec for YouTube.

    H.264 has a much broader reach.

    Broadcast, cable and satellite distribution. Industrial and military applications. Automotive. Security. Home video and so on. Google is big but not that big.

    http://www.mpegla.com/main/programs/AVC/Pages/Licensees.aspx

    1. Charles 9

      Re: H.264 and WebM

      Maybe, but perhaps that was more because WebM was late to the game. This time around, Google has a chance to steal a march on H.265 by being first. Suppose, before H.265 is released, Google releases Android 5 KLP with built-in VP9 support, maybe even backports the support to JB and/or ICS. Suddenly, you've got a mass of devices with support ready to go. That would also make all those VP8-encoded YouTube videos readily available to them. Now you've got some pressure.

  11. JeffyPooh
    Pint

    Implementation suggestion for smooth as silk frame-by-frame advance etc.

    When the meat machine hammers the Pause button, the decoder should be madly decoding and buffering additional frames going forward, and the recently decoded frames in the negative time direction should already have been buffered. Depending on which way the meat machine wants to go, the decoder can turn its attention to buffering more frames in that direction. That way frame-by-frame advance and back-up should be perfectly smooth, with I-frame boundaries transparent to the meat machine. It should effectively be a series of stored frames, with just enough trivial AI to stay ahead of the meat machine. Being slow-mo, it shouldn't be difficult.

    You'll need to include enough storage for perhaps 30 seconds to a minute of decoded video frames.

    I consider this to be obvious.

    It could have been done already on H.264 satellite/cable PVRs. But they didn't. It jumps around like an idiot when you try to maneuver frame-by-frame. There's no excuse.
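
    The bookkeeping the post describes is modest - here it is as a toy Python sketch, where PauseBuffer and the decode(index) callback are made-up names standing in for a real decoder, not any actual player's API:

```python
class PauseBuffer:
    """Toy sketch of the pause-time buffering scheme described above:
    on pause, keep a window of decoded frames around the current
    position so single-frame steps in either direction never wait
    on the decoder."""

    def __init__(self, decode, capacity=60):
        self.decode = decode          # frame_index -> decoded frame
        self.capacity = capacity      # e.g. a second or two of frames
        self.frames = {}              # frame_index -> decoded frame
        self.pos = 0

    def pause_at(self, index):
        """Called when the viewer hits Pause."""
        self.pos = index
        self._fill()

    def step(self, direction):
        """direction is +1 (frame forward) or -1 (frame back)."""
        self.pos += direction
        self._fill()                  # keep the window ahead of the viewer
        return self.frames[self.pos]

    def _fill(self):
        half = self.capacity // 2
        lo, hi = max(0, self.pos - half), self.pos + half
        for i in range(lo, hi + 1):
            if i not in self.frames:
                self.frames[i] = self.decode(i)
        # evict frames that have drifted outside the window
        for i in list(self.frames):
            if not lo <= i <= hi:
                del self.frames[i]
```

    A real implementation would run the decode calls on a background thread and handle I-frame boundaries, but the logic is no more than this.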

    1. Anonymous Coward
      Anonymous Coward

      Re: Implementation suggestion for smooth as silk frame-by-frame advance etc.

      What you want is completely in the hands of the client; it has nothing to do with any of this. There are already a few players for PC that do this, but most avoid it due to the serious overhead, not to mention the risk of a memory leak during such operations. It's a client problem.

      1. h4rm0ny

        Re: Implementation suggestion for smooth as silk frame-by-frame advance etc.

        What you want is completely in the hands of the client

        If you look at the title of the post above you, you'll see the term "Implementation suggestion". The poster knows this is something for the client to implement. They're just expanding on the original topic.

  12. Anonymous Coward
    Anonymous Coward

    Great, I'll be able to buy all my old media collection again.

    1. gujiguju

      Buy it all again

      Was just thinking that, as well. hahaha.

      (I wonder if Apple or other movie services will be able to negotiate upgrade fees with movie studios, as was done with transition in iTunes music from MP3 -> AAC...?)

  13. quartzie
    Boffin

    hevc facts

    There is a broad alliance of patent holders behind h.264 and an even broader one behind HEVC. The difference with HEVC was including HW and SW partners, so that it could be designed for easier processing using combinations of CPU/GPU. Also, the design goal was to produce a codec at half of h.264's bitrate - this was close, though not quite there last time I checked a couple of months ago (while the reference codec was at Working Draft 6).

    As for UHD content, there are suddenly large swaths of similarly textured areas in each frame (imagine a DSLR shot of someone's peachy complexion) - all the more suitable for larger quadtree-style macroblocks. The large resolution also calls for finer motion estimation, which is responsible for the lion's share of the encoding complexity increase - the actual "compression" in HEVC has been pretty much ported from h.264/avc, which also means that developing solutions for h.265 is going to be that little bit easier.
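
    Why flat areas favour big quadtree blocks is easy to picture in code. This is an illustrative sketch only - it splits on sample variance, whereas a real HEVC encoder rate-distortion-optimizes each split decision:

```python
import statistics

def quadtree_split(block, x, y, size, min_size=8, threshold=100.0):
    """Illustrative sketch (not the real HEVC algorithm): recursively
    split a square block into four quadrants until it is 'flat' enough,
    mimicking how coding-tree units let smooth areas (sky, skin) be
    coded as one large block while detailed areas split down small.
    `block` is a 2-D list of luma samples; returns (x, y, size) leaves."""
    samples = [block[y + j][x + i] for j in range(size) for i in range(size)]
    if size <= min_size or statistics.pvariance(samples) <= threshold:
        return [(x, y, size)]
    half = size // 2
    leaves = []
    for dy in (0, half):
        for dx in (0, half):
            leaves += quadtree_split(block, x + dx, y + dy, half,
                                     min_size, threshold)
    return leaves
```

    A 64x64 patch of smooth skin comes back as one leaf; a patch straddling a hard edge splits into many small ones.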

    For people unfamiliar with the process, a compression standard pretty much describes the format of a data stream - and leaves the actual implementation to the market. The HEVC working group was actually kind enough to also provide reference encoder/decoder software, modifications of which have made their way into many scholarly papers and dozens of computer science students' graduation theses.

    The sad truth about free codecs is that the basic technology for video compression hasn't changed significantly since mpeg2 - and the elements in that are very much patented. VP8/9/10.... won't be any different.

    1. Anonymous Coward
      Anonymous Coward

      Re: hevc facts

      You seem to know a lot about this. Can you tell me how well the decoding computations are supposed to fit into a mobile unit that runs on 5V @ 3000mAh? If what I read about a year ago still holds, the performance hit is and always will be around 40% even at optimal performance. Would it even be possible to smooth out the pipeline of an integrated hardware decoder to decode a 15-minute clip without killing off 100% of the battery life?

      I don't care about H.265 and the mobility world. However, I do wonder why it has often been mentioned in the mobile world, especially when H.264 has been scarfing battery life since day 1 and will continue to do so. There might be more umph' in the math, but every little extra bit of umph' is not helping in a world where batteries just don't seem to be advancing.

      Anyways, I'm really interested in the decoding performance. I can live with a horrifically slow .05fps during crunch, but if I can't playback in realtime without water cooling...well.

      1. quartzie

        Re: hevc facts

        The key is in multicore & multithreaded hardware based decoding. If you ran a purely CPU based, software single-threaded (i.e. reference) decoder, you'd get nowhere near realtime decoding for full HD streams even using h.264/avc, not to mention HEVC.

        Nobody sane does that, though - and today's CPUs and GPUs are proof of the fact, many supporting full decoding of h.264 using specialized instructions or dedicated HW. The real issue is going to be power-efficient encoding of h.265 streams in hardware, because there's no cheating the complexity even with specialized ASICs.
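
        The multicore angle is simple to picture - a sketch, with decode_tile standing in for the real per-tile entropy decode and reconstruction (illustrative names, not any codec's API):

```python
from concurrent.futures import ThreadPoolExecutor

def decode_frame_parallel(tiles, decode_tile, workers=4):
    """Sketch of tile-level parallelism: HEVC can partition a frame
    into independently decodable tiles, so a multicore decoder hands
    one tile to each worker instead of walking the frame serially.
    `decode_tile` stands in for the real per-tile decoding work."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # map() preserves tile order, so the frame reassembles correctly
        return list(pool.map(decode_tile, tiles))
```

        In real silicon this is dedicated hardware rather than threads, but the independence of the tiles is what makes either approach possible.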

        1. Charles 9

          Re: hevc facts

          That's probably why the hardware makers were part of the conversation. They know the portable market will be key to mass adoption (as it was for H.264). The emphasis on multicore processing shows they know where computing is going (more CPU cores and the mass parallelism of GPUs) and are taking that into consideration. I'll see where this goes as well as VP9 to see if it provides honest competition.

  14. Barry Rueger

    Moot Point

    It's a moot point to those of us in Canada, where data charges are so insanely high that even garden-variety HD video would bankrupt you. Hell, I think twice before streaming audio!

  15. hamsterjam
    Happy

    I've seen it working, and I liked it

    Last September at IBC I saw a HEVC demo of a 4k x 2k video loop. The picture was flawless, and the bitrate was ~12Mb/s, which is about what a standard 1080i HD channel via satellite takes up. I had a nice chat with a man from NTT DoCoMo about it.

    Also at IBC Sony had a 4k x 2k stream coming from Astra. The picture was not perceptibly better (subjective of course, and the source material was different) but the bandwidth was ~50Mb/s, which I will never see off a satellite unless I install a dish bigger than my back garden.

    This doesn't address the current cost of 4k screens of course, but that's only a matter of time. The first TV I ever watched was 405-line monochrome, now it seems that in my lifetime (my word in God's ear) discernible pixels will cease to exist.

    All the concern about patents is understandable, but the net result is a much better way of imaging the world. Now if we could only enhance the standard with blockers for advertising, party political broadcasts, soap operas and reality TV then we'll be getting somewhere.

    1. Charles 9

      Re: I've seen it working, and I liked it

      Just out of curiosity, what was the content of that 4k x 2k video loop? Content has a significant effect on encoding and eventual playback quality, so knowing what was playing will give us a better idea of just how well it's coming along.

  16. IGnatius T Foobar

    Patent nonsense

    All this discussion about whether it's appropriate to use or not use a patented codec is irrelevant. The bottom line is that software patents are a form of extortion and should be illegal around the globe. End of discussion.

    1. Anonymous Coward
      Anonymous Coward

      Re: Patent nonsense

      Yes Citizen Smith!

  17. Herby

    The ear is more fickle than the eye

    You can get away with LOTS of stuff when encoding video. The eye's bandwidth isn't that great compared to what can be put in a picture. The ear, on the other hand, needs 16 bits at 44.1kHz ("CD quality") to get things right. The only thing you might be able to get away with in the audio realm is cutting off the high end as one gets older.

    The mind is very good at interpolating video, and those applications they use on the CSI shows that zoom in so far are even better. Too bad the eyes can't take it. Fleeting moving pictures are perceived by the mind in truly amazing ways. In the days of analog TV, an uncompressed picture could be rendered with 80Mbits/sec (or so). Compressing an HD picture to less than half of that shows what you can get away with when you "process" it and throw away whatever the process considers redundant or not needed. When looking at fast-moving pictures (football) at high speed, some of the codecs do fall apart. Oh, well.
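
    The uncompressed figures are easy to sanity-check. A back-of-envelope calculation only - it assumes 8-bit 4:2:0 sampling (which averages 12 bits per pixel), and the exact answer depends on what you count as active picture, so it lands somewhat above the 80Mbit ballpark:

```python
def raw_bitrate_mbps(width, height, fps, bits_per_pixel=12):
    """Back-of-envelope uncompressed video bitrate. 4:2:0 sampling
    averages 12 bits per pixel: 8 for luma plus two chroma planes
    at quarter resolution (8/4 + 8/4)."""
    return width * height * bits_per_pixel * fps / 1e6

sd = raw_bitrate_mbps(720, 576, 25)     # 576i SD:  ~124 Mbit/s raw
hd = raw_bitrate_mbps(1920, 1080, 25)   # 1080p HD: ~622 Mbit/s raw
```

    Against ~622 Mbit/s raw, a broadcast HD channel at 10-15 Mbit/s is a 40-60x reduction - which is exactly the "throwing away what the process considers redundant" the post describes.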

    1. Charles 9

      Re: The ear is more fickle than the eye

      But one needs to be careful because our eyes are more perceptive regarding matters of contrast. That's why even very brief flickers of images (Wasn't that military test in the neighborhood of 1/200th of a second?) register in our eyes if they contrast enough. Though IIRC the fundamental codecs and algorithms work off contrasts, so this may not be a tremendous issue.

  18. Phillip.

    If tv manufacturers want it fine...

    WebM is superior to H265 in that it's good enough - not that much difference in performance - but has a totally open license. westlake, every single person who wants to upload video to the web is into content production. Imagine if HTML had carried a license fee for anybody wanting to upload a web page - would the WWW be the success it is today?

    The patent-encumbered H265 certainly has its place. Television broadcasters and set makers are a nice little cartel of manufacturers, and a patent pool prevents unnecessary lawsuits - much like the mobile manufacturers had until Apple came along and went nuts. The Internet is a fertile environment for disruptive innovation that the large, established and dominant companies would like to get a stranglehold on. There are far too many exciting things going on to want to cripple development with patents. Keep H265 away from the web and mobile.

    Phillip.

  19. Joerg
    Stop

    H.264 supports 4096x2048 resolution

    H.264 Level 5.1 supports 4096x2048 at 30fps resolution.

  20. Henry Wertz 1 Gold badge

    Aww man....

    Aww man, I just got most of my videos shrunk with H.264 AVC. Oh well. (Actually I don't feel that bad about it; most shrunk by 75% or more, and H.265 support in mencoder is, well, non-existent as far as I know.)

    "All those pretty words and not a single mention of WebM "

    That's because it's an article about H.265

    "WebM is superior to H265 in that it's good enough, not that much difference in performance, but has a totally open license. "

    Actually, there's a pretty big difference in performance. If you look at comparisons, VP8 is pretty comparable to H.264 (one review says VP8 is better, the next H.264 - suggesting to me they are pretty close). H.265 will achieve a much lower average bitrate for the same quality compared to H.264. WebM still wins on the licensing terms though 8-)

  21. Magnus_Pym

    Beware of optimized content

    Demo loops to show the 'power' of video codecs are usually optimized for what the codec is best at. As far as I am aware they usually work by not resending static content from the previous frame. The more 'intelligent' the codec, the better it is at deciding what not to send. A scene of a bee visiting a flower is a good example: the background is stationary and the moving part of the image is very small, so it takes hardly any bandwidth. The next level is a predictably moving background - the background doesn't need to be resent, only a message to move all the blocks by the same amount. A man walking past a fairly plain wall, for instance. The problem is that film directors like jump cuts and car chases, and these don't encode very well at all.
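
    The "don't resend what didn't change" idea in miniature - a toy sketch, nothing like a real encoder (which sends motion vectors and residuals rather than whole blocks), just to show why the bee costs almost nothing and a jump cut costs everything:

```python
def changed_blocks(prev, curr, block=8):
    """Toy inter-frame comparison: check the new frame against the
    previous one block by block and report only the blocks that
    changed. A static background contributes nothing; a small moving
    subject contributes a few blocks; a jump cut changes every block.
    Frames are equal-sized 2-D lists of pixel values."""
    sent = []
    for y in range(0, len(curr), block):
        for x in range(0, len(curr[0]), block):
            changed = any(prev[y + j][x + i] != curr[y + j][x + i]
                          for j in range(block) for i in range(block))
            if changed:
                sent.append((x, y))
    return sent   # block coordinates that must be retransmitted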

This topic is closed for new posts.

Other stories you might like