Reply to post:

Core-blimey! Intel's Core i9 18-core monster – the numbers

Tom 38 Silver badge

Uhm, no. Hardware encoding is usually better quality as it does the exact same thing but is much faster and therefore can use more iterations...

Uhm, double no. Video encoding is almost always a three-way trade-off between encoding speed, the visual quality of the output, and the bitrate of the output.

Hardware encoding is more limited in codec features and options, because baking the algorithm into silicon sacrifices the flexibility of software. That is especially true of consumer hardware encoders, which are small, independent, dedicated blocks of silicon on the CPU/GPU.

Now, this is dead easy to see with CRF (Constant Rate Factor) mode in x264: you tell the encoder what visual quality level you want the output to hit, and it spends whatever bitrate that takes. It is trivial to produce one encoding using x264 and one using a hardware encoder, both targeting the same constant-quality level. The outputs will be visually comparable in quality terms, but the hardware-encoded video will be larger.
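If you want to run the experiment yourself, here is a minimal sketch (assuming an ffmpeg build with both libx264 and NVIDIA NVENC support, and a hypothetical source clip input.mp4; NVENC's -cq setting is only a rough analogue of x264's CRF scale, so treat the quality match as approximate):

    import os
    import subprocess

    SRC = "input.mp4"  # hypothetical source clip

    # Software encode: x264 in constant-quality (CRF) mode.
    subprocess.run(["ffmpeg", "-y", "-i", SRC,
                    "-c:v", "libx264", "-crf", "23",
                    "-an", "sw.mp4"], check=True)

    # Hardware encode: NVENC's constant-quality mode (-cq is its
    # rough analogue of CRF; the two scales are not identical).
    subprocess.run(["ffmpeg", "-y", "-i", SRC,
                    "-c:v", "h264_nvenc", "-cq", "23",
                    "-an", "hw.mp4"], check=True)

    # Compare file sizes: at visually similar quality, the
    # hardware encode typically comes out larger.
    for f in ("sw.mp4", "hw.mp4"):
        print(f, os.path.getsize(f), "bytes")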

So hardware encoders: faster output, same visual quality, higher bitrate. That makes their videos lower "quality" than a software encoder would produce, for a given meaning of "quality". For "scene" releases, no one is using hardware encoders, because they produce lower quality videos.
