With the launch today of the Freeview HD service, the UK can claim to be at the forefront of digital television - though most punters won't be able to receive the service until spring 2010. Today’s launch saw services switched on at the Crystal Palace and Winter Hill transmitters, between them providing coverage for around 22-23 …
"As we all know, Freeview uses a lossless compression technique."
No it doesn't. It's lossy, as you describe in your next paragraph. Just like JPEG and the MP3s you were slagging off earlier (along with half the UK population). And why would we "all know" that anyway? The Reg readership is pretty diverse.
I don't doubt that you are a clever bloke, but you've taken "patronising" to a whole new level, even for a Reg comment - and that's saying something.
I didn't slag off JPEG, but I did slag off MP3s.
Have you actually done a comparison of MP3s and CDs? I have. CDs are superior.
I worked in professional audio with high-end studio equipment. Am I critical of MP3s? Absolutely. Even Apple are aware of the flaws in MP3s; that's why they adopted the AAC coding standard.
Your description of the failure modes of analogue versus digital signals is partially correct but not complete.
A digital signal can degrade partially, or fully.
You are right in the context of a full loss of signal: there is no data stream, so what is the receiver to do? If the loss of signal lasts long enough, there can be only one result: loss of picture.
A partial degradation scenario could be caused by a number of factors:
- a weak signal, resulting in a low signal-to-noise ratio
- an interference signal superimposed on the information signal
The datastream contains redundant data bits, which correct the errors that occur in these scenarios. If the errors cannot be corrected, then the receiver has to decide what to do with the errant data. It knows the data is errant, but what should it do with it?
This could be just a few pixels in an image frame that are errant.
The receiver could try to interpolate (and CD players often use this technique), or it could just display a blank pixel.
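The interpolation idea above can be sketched in a few lines. This is a minimal illustration, not code from any real decoder: the frame, the bad-pixel mask and the function name are all hypothetical, and real concealment works on decoded macroblocks, not individual pixels.

```python
# Illustrative sketch of error concealment by interpolation: replace a
# sample flagged as errant with the average of its valid horizontal
# neighbours, falling back to a blank (zero) pixel if none are usable.
def conceal_bad_pixels(frame, bad):
    h, w = len(frame), len(frame[0])
    out = [row[:] for row in frame]
    for y in range(h):
        for x in range(w):
            if bad[y][x]:
                neighbours = [frame[y][nx]
                              for nx in (x - 1, x + 1)
                              if 0 <= nx < w and not bad[y][nx]]
                # interpolate if possible, else display a blank pixel
                out[y][x] = (sum(neighbours) // len(neighbours)
                             if neighbours else 0)
    return out

frame = [[10, 20, 30],
         [40, 99, 60]]          # 99 is a corrupted sample
bad   = [[False, False, False],
         [False, True,  False]]
print(conceal_bad_pixels(frame, bad)[1][1])  # → 50 (average of 40 and 60)
```

The same trade-off the comment describes is visible here: interpolation hides a small error plausibly, while the blank-pixel fallback is honest but visible.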
Your assertion that the pixelation on the image MUST be caused by compression artefacts is wrong.
Let's get some sense and technical truth on the go here.
Freeview is far from lossless. It uses MPEG2, same as on DVDs, but typically at a lower, constant bit rate (vs DVD's high, typically variable rates) and sometimes at lower pixel resolution too.
When it's fed a decent number of bits, has a good encoder at the broadcaster end, and the material shown is sympathetic to MPG, it looks fantastic, slightly sharper than broadcast and without the dot crawl and slight fuzz that my local analogue transmitter produces. When it has fewer bits available (channel overcrowding and bandwidth/resolution choices made by the same clueless cretins that think 96kbit stereo is just fine for MP2-based DAB), a poor encoder or more demanding material, it becomes a horrific blockfest that will have you offering your writing arm as sacrifice to get good old PAL-I UHF back. This year's Eurovision repeats and Glasto coverage were a stark case in point - some of the shows were nearly unwatchable thanks to blocking, as it appeared to be attempting full DVD resolution with VCD bandwidth, in order to transmit some very high motion performances with intricate and very colourful detail. A resolution drop might have helped, and a quiet fallback to 25fps progressive (slightly jerky and aliased 352x576 with moderate distortion beats a screenful of 8x8 blocks moving at 50fps interlace in a 720x576 frame), but you can't count on the guys at mission control to think of such stuff.
Your results WILL depend on where you are, what channel you're watching, and when, as the different multiplexes in various parts of the country will have their own bandwidth limits and number of channels fighting for slots within that. As well as different content being shown, and actual number of airing channels differing throughout the day. You may never have suffered this disease, or it may be the bane of your life. I count myself fairly lucky that it only rears its head once in a blue moon, but then I do tend to stick to the main BBC stuff (...and Dave/Virgin1) which gets a bigger slice of the pie. Weirdly, the shopping channels also seem to be smooth as silk.
It's definitely nothing to do with signal strength though. MPG failure results in a whole barrel of other image freakouts, instead of simple blocking. That's one of the weaknesses in the system, at least until the promised power boost that comes with analogue switch-off. An oldskool signal will remain sort-of watchable with crap reception, until the audio becomes drowned out by noise and the image circuits are no longer able to derive accurate sync pulses (by which point, colour has gone, and you're left picking out silhouettes from the snow anyway). It's very graceful and gradual in its failure mode, and I was very glad of it when living in north Wales, where our student dinner time Buffy fix gave the impression that the hellmouth was, unlikely as it seems, wrapped in a perpetual blizzard. With freeview, we'd have got nothing. Depending on your receiver, the picture will either continually lock up, or it'll go to a blank "no signal" screen, or - like my nan's cheap telly - display the signal in the raw regardless, with all the nightmare fuel that is the unmoderated output of a f***ed up mpg stream. Download some mpg files onto your PC, use whatever method you feel like to throw a very small amount of random noise into the data stream (to emulate the sort of noise you'd hardly notice on analogue being _mostly_ but not entirely filtered out by the transport stream's ECC), then drop it in media player and watch the hideousness unfold. "Paint walls", psychedelic colours and distorted faces abound. The effects are often block-based, true, but rarely give a "proper" blocking effect unless it's your decoder being VERY clever in its error correction.
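The experiment suggested above is easy to script. A minimal sketch, assuming you run it on a throwaway copy of a real .mpg: the file paths, function name and bit-error rate are all illustrative, and "whatever method you feel like" here is simply flipping a tiny random fraction of bits.

```python
# Sketch: flip a small random fraction of bits in a file to emulate the
# residual noise that slips past the transport stream's error correction.
# Run only on a copy of a file you don't mind ruining.
import random

def corrupt(in_path, out_path, bit_error_rate=1e-6, seed=42):
    """Copy in_path to out_path, flipping each bit independently with
    probability bit_error_rate. Returns the number of bits flipped."""
    rng = random.Random(seed)      # seeded, so a run is repeatable
    data = bytearray(open(in_path, "rb").read())
    flipped = 0
    for i in range(len(data)):
        for bit in range(8):
            if rng.random() < bit_error_rate:
                data[i] ^= 1 << bit
                flipped += 1
    open(out_path, "wb").write(data)
    return flipped
```

Even a rate of one bit in a million is enough to produce the psychedelic smearing described above, because each flipped bit in a delta-coded stream corrupts everything predicted from it until the next keyframe.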
So let's hope they can implement better signal strength AND higher bitrates somehow.
On which subject, I'm all for the variable bitrate idea, it's excellent for squeezing good quality into smaller filesizes in digital media... but I dunno if it's going to work in the broadcast arena. Or if it's even necessary. What happens if all, or even just a majority of the stations on a multiplex want to go to high-bitrate mode all at once? There's not going to be the bandwidth to allow it. There'll be massive compromise (and how in heaven's name do you balance it all anyhow, unless the encoding is only done all-at-once at the actual transmitter and everything to that point is either uncompressed or full-bore?) - and once you hit the point of that happening every once in a while, kiboshing the attempt to give a consistent apparent quality, you may as well stay with CBR instead. Oddly, this is even what the iPlayer does, despite being online and serving up largely non-live material, where you could make good use of VBR efficiency. All the iP streams are fixed-rate. But, in doing so, they've also shown that there may not be need to vary the rate so much - the highest bandwidth offering at the moment is 3Mbit/s, for 720p (25fps, mind... unless it's my media player being weird, cuz I could SWEAR it said 50fps and gone smoother occasionally), and under almost all circumstances it looks just fine thanks to the MP4 (H264) compression that DVB-T2 also uses. Treble that to reach 1080p at 50fps (economies of scale occurring in both size and framerate) and you're still under the magic 10Mbit/s "average", which gives you a bit of overhead for those insanely OTT subtitles (MUST have a signing option, surely... I'm sure ceefax 888 got away with about 200 bits/sec for text), audio description, error correction, a slight quality boost, etc.
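The "treble it" estimate above can be sanity-checked with some back-of-envelope arithmetic. The figures are the ones in the comment (3Mbit/s for iPlayer's 720p25 stream); the 4.5x raw-throughput factor versus the 3x bitrate estimate just reflects the assumption that H.264 gets more efficient per pixel as resolution rises.

```python
# Back-of-envelope check: scaling a 720p25 stream to 1080p50.
base_rate = 3.0                    # Mbit/s, the iPlayer 720p25 figure
pixels_720p = 1280 * 720           # 921,600 pixels per frame
pixels_1080p = 1920 * 1080         # 2,073,600 pixels per frame

# Raw pixel throughput grows by resolution ratio times framerate ratio.
raw_factor = (pixels_1080p / pixels_720p) * (50 / 25)
tripled = base_rate * 3            # the comment's "treble that" estimate

print(raw_factor)                  # → 4.5 (times the raw data)
print(tripled)                     # → 9.0 Mbit/s, under the 10Mbit/s "average"
```

So even though the raw data grows 4.5x, a tripled bitrate still leaves roughly 1Mbit/s of headroom under the 10Mbit/s figure for subtitles, audio description and error correction.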
(It also has "ED" rez stuff (832x468, 25p framerate) at 1.5mbit and SD-ish (640x360) at 0.8...... and it still looks alright, meaning with such encoding we could probably cram in 50% more channels on each existing multiplex and it'd STILL look better and have less blocking. However the effects of interference would propagate for longer, given the bigger intervals between keyframes - a lot more of MP4 is made of "delta" frames that merely encode changes from the previous one, so if your TV thinks one frame is different from how it should be, that mistake's going to stick around for 3 or 4 seconds rather than the half-second or so maximum for MPG2)
Sooooooooo this'll be interesting to watch. It could be made of win.... or divebomb spectacularly. Let's hope it's the former, because like DAB it's unlikely we'll see a U-turn regardless of how horrid it gets.
(Now, can we maybe ditch DAB before it gets too much more market penetration and replace it with something using a more widespread and modern standard, maybe like what they have in Europe or the States? Perhaps something like the AAC used in MP4 files. That sounds perfectly spiffy at 128k... Or at least can we use better MP2 encoders? I've used good quality ones that actually turn out listenable files at 160-192k, 128k when downsampled to typical broadcast quality (15.5khz) and 112k if you further compromise the headroom, though 96-stereo is still unsalvageable. Problem is, they're slowwwwwwww (maybe 2x realtime on my modern laptop), but it should still be OK for piping thru a server and out to air. Then we can work on beefing up the signal strength)
and another thing
BTW do they not even compress the subtitle bitmaps or use a "no change" flag? For plaintext with minor antialiasing, PNG crunching and an arbitrary 1/2-sec update rate I'd expect to see the entire display covered with text at 200kbit/s. It might be more efficient with that kind of bandwidth to send an actual font file to the receiver every 10 seconds (well, on continuous repeat and spread across the whole 10s... 2000kbits=250kB=one _enormous_ font, whether bitmap or vector) with plaintext for the actual content. Even DVD bitmapped subs don't need anything like that amount of overhead.
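The font-file sums above work out as follows. Pure arithmetic on the figures in the comment; the 200kbit/s subtitle bandwidth is the comment's own assumption, not a number from the DVB subtitling spec.

```python
# How big a font could you push to the receiver by repeating it over
# a 10-second window of an assumed 200kbit/s subtitle stream?
rate_kbit = 200                 # assumed subtitle bandwidth, kbit/s
repeat_s = 10                   # font repeated across this window

total_kbit = rate_kbit * repeat_s   # bits available across the window
total_kB = total_kbit / 8           # converted to kilobytes

print(total_kbit)               # → 2000 kbit
print(total_kB)                 # → 250.0 kB - one enormous font
```

250kB is indeed enormous by subtitle standards, which is the comment's point: even uncompressed bitmapped subs shouldn't need that kind of sustained bandwidth.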
Can I also second the plastic cutlery, too much choice argument. There may be room in e.g. the USA for 30+ channels of actual content (the other 50% of Freeview at the moment is QVC-a-likes, dodgy quality music channels, poor-man's-sky-sports and encrypted grot that almost certainly isn't worth paying for), but here that represents 2 million viewers IF each channel was evenly split AND everyone in the whole country was watching (which they're not, and they don't). Those kind of viewing figures, with promise of even more meagre slices to come, mean serious bad news for the revenues needed to make new programmes - little wonder that a lot of the rest of the material on the "worthwhile" channels is itself endless repeats of older classic stuff. There's probably 15-20 channels actually worth anyone's time on freeview (and the same was true when my area was getting analogue cable in the mid 90s... including the bizarro foreign channels like Sat Eins and its 6pm porno), if we can somehow democratically cut it down to that level we'd get a much better viewing experience (clearer picture, more money for making good programmes) and yet nothing of value would be lost. If there were genuinely interesting, niche channels that this would mean the axe for, then I wouldn't propose it... but there aren't. None that you don't have to pay for, anyway, with the exception of Film4 and Teachers TV which could be preserved as "public service" stations.
...not till 2011? Oh well, I guess I'm not missing much, what with the dodgy codecs and bitrates and all.