How big is big enough? It’s a question many of us have asked, as we cruise the aisles of Currys or John Lewis, looking for a new TV. It’s all too easy to be seduced by a special offer, or by extra features like net connectivity, and end up with a TV that’s larger than you anticipated. And while you might make space for a jumbo …
"Hands up who’s rearranged their living room to move all the chairs closer to the screen after buying an HD set? Anyone?"
Strangely, with the screen thinner, we actually moved everything further away to make the living room yield more floor space.
On a point of resolution, I have a pretty bog-standard 42" Philips LCD and I have found that a UK comedy prog played from an AppleTV2 still looks pretty good even at a very low res of only 448x262. BBC documentaries taken from DVDs are encoded at only 624x352; this appears to be right on the limit of almost "blocking" in very dark areas but is perfectly acceptable to watch. After a while you forget all about any minor niggles and simply watch the content rather than the medium. The upshot is that you can store absolute bucketloads of movie files from DVDs on only a small 4TB NAS unit. No doubt in 4-5 years' time that res will be useless, but the TV is only 18 months old and has a replacement warranty for the next 4 years, so it's not going anywhere for a while yet.
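As a back-of-the-envelope check on that claim, here's a minimal sketch; the ~1 GB-per-film figure is my assumption for a DVD re-encode at around that resolution, not a number from the post:

```python
# Rough capacity estimate for a 4 TB NAS full of re-encoded DVDs.
# ~1 GB per film is an assumed typical size for a 624x352 encode.
nas_gb = 4 * 1000          # 4 TB in (decimal) GB
gb_per_film = 1.0          # assumption, not from the post
films = nas_gb / gb_per_film
# films == 4000.0 -- bucketloads indeed
```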
"BBC documentaries taken from DVDs are encoded at only 624x352, this appears to be right on the limit of almost "blocking" in very dark areas but is perfectly acceptable to watch."
That's less to do with resolution, and more to do with compression artefacts. Have a look at your quantiser settings...
There are flat panel TVs out there with non-square pixels?
Either way it would be very difficult to tell the difference between that and 1920x1080 under most circumstances.
I think the BBC is talking about broadcast resolution. Every digital TV has a scaler that can scale from a large number of resolutions up to the 1920x1080 panel resolution. (Or to whatever your panel resolution is, if you've got an older HD telly then the panel is likely lower res).
Even in the SD case, it's common to broadcast TV channels at two-thirds horizontal resolution and get the Freeview STB to upscale them to normal SD. This is typically used on news channels and the like, to save bandwidth. Premium channels like BBC1 and BBC2 get broadcast at full resolution.
As with all things in life, there's a tradeoff. This one is a trade between picture quality, number of channels, and amount of radio spectrum used.
Or get a tape measure
For a 16:9 screen with a diagonal picture of size D, you can calculate H using Pythagoras.
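A minimal sketch of that Pythagoras step (the function name and the 42" example are my own illustration, not from the comment): with W:H = 16:9 and D² = W² + H², the height is H = D·9/√(16² + 9²).

```python
import math

def screen_dimensions(diagonal, ratio_w=16, ratio_h=9):
    """Width and height of a screen from its diagonal, via Pythagoras:
    D^2 = W^2 + H^2 with W/H = ratio_w/ratio_h."""
    h = diagonal * ratio_h / math.hypot(ratio_w, ratio_h)
    w = h * ratio_w / ratio_h
    return w, h

w, h = screen_dimensions(42)  # a 42" 16:9 panel: w ~ 36.6", h ~ 20.6"
```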
Or use a tape measure
I could have told the researchers that HD on a 32" set was pointless, since to my (reasonably good) eyes this looks no better than normal PAL on a 28" set. Of course, Americans have been putting up with lower-resolution NTSC on mahoosive TVs for years, so there is more of an upgrade imperative there.
HD is only necessary due to the relentless increase in screen size. When you get to 40" and above PAL really doesn't cut it, and 40" is a smallish TV by new standards.
Anyway, we still have a 28" CRT in our living room. The quality of programming is sufficiently bad that we don't feel the need to upgrade. You can't polish a dog's egg.
It's not just about the resolution
>I could have told the researchers that HD on a 32" set was pointless,
Not necessarily. It's also about the colour range. HD colours are more vibrant and have better gradation. Shiny surfaces actually look shiny rather than just 'a bit more bright'.
What makes HD colours better than SD colours?
Combination of canny marketing and the oxygen-free-copper interconnect effect.
A good question
Good question with two answers:
First is colour - the more advanced video encoding (AVC) used for HD supports a wider range of colours than the older system (MPEG-2).
Second - scene changes. With MPEG-2 SD transmissions there are a limited number of blocks per frame, which show up on fast scene changes and are more obvious on HD displays because there is less blurring. AVC supports more blocks per frame, and smaller ones, with finer motion.
I bought a decent plasma and had to switch from Virgin to Sky because Virgin's HD was in MPEG-2. It didn't help that I work with this stuff, so I noticed every block in the end.
Perhaps AndrueC is posting from Canada (correct spelling of colour rules out a septic). North America paid the price for being the first to have colour TV by being forced to use the dreadful NTSC (Never Twice the Same Color) standard. US viewers go wild for HDTV as they discover their favourite newscaster doesn't actually have green skin.
Technically you can
Adam and Jamie proved that you could polish that particular item on an episode of Mythbusters.
709 vs 601
The RGB primaries used for HD material are more intensely saturated than those used for SD material, hence HD can encode more intensely saturated colours. A lot of screens also support xvYCC, which allows encoding more intense colours still by allowing negative RGB coefficients. So far as I'm aware, no broadcast standards nor mass distribution standards support xvYCC, but if you've a PC displaying photos that use scRGB or similar, that might legitimately be able to exploit it.
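One easily checkable difference between the two standards is their luma matrices; a small sketch follows (it only illustrates the matrices: the gamut difference described above comes from the primaries themselves, which this code doesn't model):

```python
# BT.601 (SD) and BT.709 (HD) weight R, G and B differently when
# forming luma, so decoding HD material with an SD matrix (or vice
# versa) shifts the colours. Coefficients are from the two standards.
def luma_601(r, g, b):
    return 0.299 * r + 0.587 * g + 0.114 * b

def luma_709(r, g, b):
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

y_sd = luma_601(1.0, 0.0, 0.0)  # pure red under 601: 0.299
y_hd = luma_709(1.0, 0.0, 0.0)  # pure red under 709: 0.2126
```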
Re: Colour range
The newscaster never had green skin, silly. It was deep, bright pink. His hair was green.
Hmm - seems to be quite contentious this one. Looking around on the web it's also contentious there. I'm no techie so I don't know how/why but the colour in HD pictures just looks so much better to my eyes.
I used to watch American Chopper before it became really crap, and the first time I saw it in HD was astonishing. The completed bikes suddenly became polished and buffed. I've since seen the effect on many things. Grass is another example. On Time Team crop marks are more obvious. Come to that, shades in soil are more obvious. I've no idea whether it's the cameras, post-processing or the transmission, but to my eyes colour is better when watching an HD programme.
Still - if it's not part of the standard then it's possible that my TV (Samsung) and my mate's TV (Panasonic) are just doing something clever for HD. I don't know about my mate's but I turned all that crap off on my Samsung. I'm a strict member of the 'give it to me straight' brigade when it comes to TVs and I've never bought one yet that didn't need a good hour spending turning off features and toning down the colours.
Re: HD Colours
Thanks for the explanations. I hadn't figured on the fact more advanced codecs were being used for HD broadcasts. Here in Australia I believe we're still on MPEG-2.
re:A good question
I always thought Virgin HD looked a bit rubbish. Now I have a reason why. Thanks.
One thing that would make it easier
would be to force manufacturers to quote the screen measurement in proper measuring units, i.e. millimetres or centimetres. You know, just like they do in pretty much every other country in the world.
Well, you know what you can do
If you love silly foreign measurements ("we have 10 fingers so we should base everything on 10 and then have lots of fractions, because people find counting harder than dealing with fractions, duh") so much, you could just go and live elsewhere.
Back in the real world...
Every TV I have seen has had both sizes advertised for decades. But there is a real difference in what they measure.
The imperial scale typically measures the tube/screen/panel size of the unit, whereas the metric size must measure the picture size.
This has been true since the 1980s at least, as it was an EU directive back then, but it only applied to metric measurements.
You want people who support decimalisation to pack up & leave? TBH I couldn't give a fig what standard of measurement we use here. I just wish we could make our damn minds up and make a bloody decision.
32", so I'm one of those underendowed watchers. But if I'm playing COD, my nose will be right up to it. If, on the other hand, the wife is watching Desperate Housewives my viewing distance becomes exponentially and progressively larger as the tedium unfolds.
Metric screen sizes
Unfortunately even on the continent, where we measure everything else in metres, screen sizes are still measured in inches diagonal.
This has contributed to the idea that while the exact size of a metre might vary according to the temperature, the imperial system varies with politics (and most would rather trust the weather).
When 14" screens for office PCs were outlawed, the same screens reappeared on the market as 15" screens without increasing in size. That's apparently because someone decided that a screen diagonal in inches didn't need to specify the size of the picture area of the screen.
So now we describe screen sizes in inches and we accept that an inch is generally somewhere between 2 and 2.5 cm.
To Robert and other Metric inch-lovers
I think we should ban people who hate decimals from using them to express their ancient measurements. 7.625 inches? Nope, banned. You must stand by your principles and use 7⅝ inches. I hope for your sake you're not a software developer. Or maybe you could just bring yourself forward to the 19th century and go with 19.3675cm.
The point about foreign measurements is also irrational. Unless you're Roman or French then you have no rational choice but to use a foreign system.
And strictly speaking
Given the amount of help the French needed to get it to work, SI (don't call it the metric system, that's a pejorative term) is technically a British invention. We did a sort of cross-licencing deal with France: they let us think we invented front-wheel-drive cars, in return for us letting them think they invented metres, litres and kilos.
I think you are actually refering to the SI unit of a metre
So it's actually 0.193675 m
(reposted after making a complete arse of the arithmetic)
Making it easier
For small angles, measured in radians, the tangent of the angle is equal to the angle.
One minute of arc is 0.00029 radians
For those into target shooting, this is one inch at 100 yards.
It simplifies things a lot.
Radians are defined by a measurement along the arc of a circle. For these small angles the difference between a straight line and this arc is minute.
(I learned this at school, but I keep being told exam standards have slipped.)
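The arithmetic above checks out; a quick sketch verifying the comment's figures:

```python
import math

arcmin = math.pi / (180 * 60)   # one minute of arc in radians, ~0.00029

# Small-angle approximation: size subtended ~ angle (radians) x distance
distance = 100 * 36             # 100 yards in inches
size = arcmin * distance        # ~1.05 inches: "one inch at 100 yards"

# Exact chord for comparison; the difference really is minute
chord = 2 * distance * math.tan(arcmin / 2)
```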
@Robert Long 1
Actually, we have two sets of 8 fingers with an extra pair to serve as "carry" and "sign" flags. We should be measuring everything in Hexadecimal--the only number system designed for humans.
Very good, except toes aren't fingers.
Because of the Metric System
"A Royale with cheese".
And in Paris you can buy a beer at McDonalds.
Oh I see, I meant to type two sets of 4 fingers, not 8.
I didn't count the toes each hand has 5 fingers.
5 foot for 46" here
And still seems a bit far at times.
You can see the difference between SDTV, BBC HD & HDV, Games and normal Bluray.
Use direct pixel mapping with 1920x1080p
BBC HD looks a little less detailed - same as HDV
Games are mainly I think 1280 x 720 - again looks good but slightly less detail.
SDTV looks pants in comparison
1920x1080p Blu Rays are best.
I sit closer for games as well!
I thought it looked less detailed, but then they broadcast an HD scan of Indiana Jones and it looked wonderful. I guess the HD cameras they shoot stuff on aren't any match for a decent film scan.
ITV (STV in my case) looks far less detailed than ch4 / bbc hd.
I sit about 2m from my 40" telly and can definitely tell the difference between 720 & 1080 content (blu-ray is far and away the best).
That said, I've noticed a distinct improvement upgrading from VirginMedia V+ (one of the old Scientific Atlanta boxes) to their new TiVo box thereby demonstrating the importance of the source. Also I can now get HD versions of many BBC3/4 programs via the TiVo iPlayer app.
BBC HD is HDV resolution
1440x1080i, rather than the 1920x1080i of normal HD broadcasts and the 1920x1080p of Blu-ray.
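That 1440x1080 raster is stretched to fill a 16:9 frame, which means the pixels aren't square; a quick check of the pixel aspect ratio (the variable names are mine):

```python
from fractions import Fraction

display_ar = Fraction(16, 9)       # the frame shown on screen
storage_ar = Fraction(1440, 1080)  # the raster actually encoded
par = display_ar / storage_ar      # pixel aspect ratio: Fraction(4, 3)
```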
If you look closely
BBC wildlife progs often have shots of the cameramen at work. You can clearly see 720p on the side of the kit.
@ Chris Miller
"You can clearly see 720p on the side of the kit."
As long as you're watching it in HD...:-)
One is reminded of Amy Wong's naughty tattoo from Futurama.
The natural history film maker these days often favours a Panasonic Varicam, owing to its excellent CineGamma features and 720/60p mode. Don't worry about it and just enjoy the pictures. 1080p cams are often used for establishing/aerial shots, DSLRs for timelapses etc etc etc
Seeing the extra detail
I can usually see the extra detail when I switch between the SD and HD versions of a channel, but really, aside from on-screen text or headlines I tend to find I stop noticing the extra detail after a while. I think we're about 2.7-ish metres away from a 42" screen (why do we measure screen distance in metric and screen size in Imperial anyway??)
At the end of the day, it's program quality not picture quality that counts. You could broadcast Eastenders in 4k and Doctor Who in 480i and I'd still only watch Doctor Who...
So what about 21:9 screens with proper 2.35:1 aspect input akin to all the photos in the article?
Do your own maths!
I'll post the workings for all of this on my blog later, so you can see how we calculated the figures. It's fairly easy from that starting point to work out the values for a different screen ratio.
You've got it all wrong....
There are two criteria for choosing a TV size....depending on where you live.
1) It must be bigger than the people who live in the council house next door.
2) It must cost slightly more than what you can afford.
Now, being serious (did you realise I wasn't being serious above?), were the tests done with very high bitrate images (I may have missed that in the article)? The main problem I have is the crappy blockiness caused by low bitrate FV channels, although it does seem to have improved since the switchover, but that might be wishful thinking. Unless all the videos were pushed past the point of blockiness these tests would be, like a blunt pencil, pointless.
It must be bigger than the people who live in the council house next door.
All of them? What if they're basketballers?
All in the white paper
The BBC white paper we linked to goes into considerable detail about the methodology.
The main purpose was to determine the level at which the eye can perceive detail, so static images were used, with an un-manipulated one, and one to which various filters had been applied, noting the point at which the test subjects were able to discern the effect of the filters.
The result was, as we say, broadly in line with the accepted acuity, 1 minute of arc, which is the figure I used in working out the rest of the maths.
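A sketch of how that 1-minute-of-arc figure turns into a viewing distance (the function and the 42" example are my illustration, not the article's published workings): find the screen height, divide by the number of pixel rows, and ask how far away one pixel subtends one arcminute.

```python
import math

def max_useful_distance_m(diagonal_in, rows=1080, ratio_w=16, ratio_h=9):
    """Farthest distance (metres) at which one pixel row still subtends
    one minute of arc; beyond this, extra resolution is wasted."""
    arcmin = math.pi / (180 * 60)
    height_in = diagonal_in * ratio_h / math.hypot(ratio_w, ratio_h)
    pixel_in = height_in / rows
    return (pixel_in / arcmin) * 0.0254  # inches to metres

d = max_useful_distance_m(42)  # ~1.7 m for a 42" 1080p panel
```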
On Freesat there's this "landscape channel", which shows (er) landscapes etc for use as a sort of relaxing background or something. You wouldn't want to actually watch it, though, as even on my tiny 15in CRT telly I can see blocks and other artefacts.
And when watching some footy the other week, the HD channel version looked detectably crisper than the ordinary one (my Freesat box is over-specced for my TV).
You made that TV yourself, I assume. Because if you have a CRT TV of any size it will not have the required inputs for HD, just the basic aerial or SCART connections (and maybe composite).
But none of those connectors will provide HD content to a TV, just standard PAL resolution.
To get HD you have to have a TV/monitor that can handle the digital connectors.
(Note: the above is for TVs, not PC monitors, as they can go higher)
letters and/or digits
Not *strictly* true, there were a few CRTs made which did have HDMI inputs. What happened to the signal once inside the box was anyone's guess however.
HD channel -> crt
I didn't think it was a pixel issue, just a bitrate one. After all, footy on tv does involve quite a lot of fast moving tiny figures on the screen.
What I am saying is that the pic spat out onto the scart lead to my tv looked slightly better if the box was using the HD channel than the standard one.
Much as it is often slated, there is actually a very useful article on Wikipedia about Visual Acuity - it might have been worth a read before publishing this drivel - especially the section on 'Normal vision', which is NOT the same as 20/20 vision.