When you’re watching TV you want to be directly in front of it, right? So why should a manufacturer make a fuss about a TV's sides, specifically how thin they are? Even more puzzling is the fact that greater thinness actually does make a difference, as this sleek TV from Hitachi proves. Hitachi UT32MH70 32in LCD TV
Handy, an RS232 connection on your TV...?
"The addition of the hard-drive tuner box, available free by redeeming a voucher, goes some way to redressing this."
What a pain in the arse way of doing things... You'll have to buy the screen, have it sitting on the floor, send off the voucher, wait two weeks for the tuner box to arrive, then finally be able to mount the thing on the wall and run all the cables.
1080 is greater than 720??
Ummm, this won't be downscaling your 1080i video, as 720p resolution is actually greater than your 1080i. As you're using an interlaced video mode, 1080i is really only about 540 lines, whereas your 720p is using every one of those 720 lines. So while your 1080p may get cut down to size, your 1080i will not.
Also, with a board as thin as this, why no comments or questions about the main boards on these having chips cooking at 400+ degrees and your board layers separating? Your heat dissipation is bound to be horrible unless they've built in some serious chocolate coverage. Anyone pulled one of these apart yet to check the situation with the video processing chip on the main board?
I may be reading this wrong but isn't it just a monitor with an add on tuner? As it comes without a tuner.
"I may be reading this wrong but isn't it just a monitor with an add on tuner?"
Things are changing so quickly now that an add-on tuner strikes me as a good idea. I don't think that any Freeview equipment on sale so far will support the forthcoming HD Freeview transmissions, so the built-in tuner is redundant. Ditto if you want Freesat.
I'm a bit surprised that Hitachi didn't go for a separate PSU as well, if they want it that thin.
"So while your 1080P may get cut down to size, you're 1080i will not."
So 1080i isn't 1920 pixels across?
sod these expensive lcds
get a projector.. i got an optoma hd-65 for just over £400 and use it to project a 120" widescreen image. still using SD video, really don't see any reason to upgrade (although the harry potter order of the phoenix rental disk was terrible for pixelation).
yes, if somebody released "the tribe" with the delectable anna friel in the buff on a decent blu-ray transfer i'd buy a player. for everything else, really, it isn't worth it. if you are involved in a show/film you aren't looking at every leaf, it's the face. and generally if a person is talking the background is out of focus anyway (to draw the viewer into the action and not be distracted by the background)
HD is a solution in search of a problem.
Oh wow... this isn't new
Buy a 16:9 monitor with a Play TV and an upscaling AV Receiver... and there you go
Thinness means jack in the AV world; quality is paramount, and Sammy already holds a pretty respectable position in that field as far as SD processing goes.
For anyone who seems not to know: 1080i needs to be deinterlaced, so it is technically 1920x1080, but it is still a lower-quality signal than 720p because of the pulldown techniques required to convert it to work on a progressive display (which all fixed-pixel displays are).
@ @ AC
No, it is; however, it is not 1080 lines, it's really only 540, as it's interlaced. 720p is "more" HD than 1080i.
I'm sorry, I don't think I explained myself correctly. The article states that 1080i/p will be downscaled to fit the 720p settings. However, 720p is a higher resolution than 1080i when you concern yourself with line count and true HD capacity. 1080i is not being downscaled in terms of HD image capacity, as it is already at a lower overall quality; the resolution may be decreased, but only the 1080p is really being downscaled. And I'm not sure it will really downscale much in practice, as most sources let you select the output of your HD in terms of 1080i/p, 720p, 480i/p, etc., so there's no real need to downscale if your system is already set to output the proper scale.
I would call that a monitor sir... And this is what it is.
Why would a TV need an RS232 port? Shouldn't this be an RGB monitor port? RS232 is a COM port used before the days of USB.
People commenting here don't understand interlacing...
Nor do they understand signal theory and what appears on their displays after all the calculations.
Claiming that 1080i is equal to 540p is a wrong assumption.
Interlacing was created to halve the needed bandwidth, yes, but the real resolution is not half the source; the full source still gets reconstructed. Interlacing was an analogue compression of the signal, a compression in the analogue domain, from the '40s when the SDTV standard was invented, when digital devices still didn't exist and transmitting the whole signal would have been too expensive.
What interlacing causes is a serious distortion along the optical flow axis: the image looks fuzzy because information gets discarded more in the time domain than in the spatial domain.
"No, it is, however it is not 1080 lines, it's really only 540 as it's interlaced. 720p is "more" HD than 1080i."
That is only true for crap sets that don't deinterlace 1080i and just discard one field. A properly deinterlaced 1080i is the same as 1080p, but, as with SD, some source material (video) is hard to deinterlace.
So what's the point of a chubby 3cm TV when you can get one that's that thick with the wall mount attached, or 9.9mm without, and it's 1920x1080 resolution, 100Hz, etc.!
interlacing does not decrease resolution
The difference between interlaced and progressive scan is nothing to do with resolution, it is only a different refresh system.
So 1080i is NOT lower res than 720p.
Interlaced scan refreshes every other line @ 25Hz (30Hz in the USA).
Progressive scan refreshes every line @ 50Hz (60Hz in the USA).
So progressive scan gives a sharper, clearer, less flickery picture, but it does not increase resolution at all.
1080i is higher res than 720p.
720p is exactly the same res as 720i.
1080p is exactly the same res as 1080i; clearer, maybe, but the same resolution.
@ @ @ AC
Now there was me thinking that interlaced had the same number of lines as progressive, just half the refresh-rate - first cycle draws in the odd-numbered rows, next one the even-numbered rows. Unless I'm missing something...
Oh, and my lappy screen's only about 6mm. True, it has a box connected to it containing the rest of the useful components, but I'll ignore that for the purpose of statistics...
In pure bandwidth terms you'd still be wrong, unless it were an old-style analogue signal based on line numbers. BUT 1080i and 1080p both share a 1920-pixel horizontal resolution, so they have figures of 1Mpixel/frame and 2Mpixels/frame respectively. 720p (1280x720) only runs at 0.9Mpixels/frame.
Also, as the horizontal resolution of this TV is 1366, it must scale down the image of 1080i (1920x540 pixels per frame refresh).
Sorry to ruin your argument with facts.
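The frame-size figures quoted above are easy to sanity-check. A quick sketch (format dimensions only; the dictionary layout is just for illustration):

```python
# Rough pixel-count arithmetic for the formats under discussion.
formats = {
    "1080p frame": 1920 * 1080,  # full progressive frame
    "1080i field": 1920 * 540,   # one interlaced field (half the lines)
    "720p frame": 1280 * 720,
}

for name, pixels in formats.items():
    print(f"{name}: {pixels / 1e6:.2f} Mpixels")
# 1080p frame: 2.07 Mpixels
# 1080i field: 1.04 Mpixels
# 720p frame: 0.92 Mpixels
```

So the ~2M / ~1M / ~0.9M figures hold up.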
Actually, AC, you are incorrect. 1080i still displays a 1080-scan-line picture, same as 1080p. The difference is that it takes *two passes* of 540 lines to do so. Which is why you will notice blur on rapid movements or, if you convert a DVD ISO to AVI for example, you will see horizontal scan lines if you forget to hit the "de-interlace" option in your software.
So in terms of *resolution*, which you harp on, 1080i is better than 720p. In terms of *quality*, however...
I don't think you can argue that 1080i is only 540 lines. You could certainly argue that the effective refresh rate is halved (ie 1080p only updating at half the nominal screen refresh period of 50 or 60Hz), although it's not quite as simple as that.
1080i has a slightly higher pixel rate than 720p (~62 million pixels per second for 60Hz 1080i compared with ~55 million pixels per second for 60Hz 720p), so there is more information available per second, which should lead to a better image. The visual difference between the two is likely to be subjective and dependent on processing, however. I suspect that 720p will generally look better on a native 720-line panel and 1080i will look better on a native 1080-line panel, but again it's never quite that simple...
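The per-second arithmetic in that comment checks out. A minimal sketch (assuming 60 fields/frames per second, as the comment does):

```python
# Pixels delivered per second at 60 Hz.
rate_1080i = 1920 * 540 * 60  # 60 fields/s, each field 540 lines
rate_720p = 1280 * 720 * 60   # 60 full progressive frames/s

print(rate_1080i)  # 62208000 (~62 million)
print(rate_720p)   # 55296000 (~55 million)
```

Both match the ~62M and ~55M figures quoted above.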
math win, but tech fail all round! (except Simon Preston)
1080i has 540 lines per frame, true, but one frame is the odd lines and the next frame is the even lines. JUST LIKE PAL AND NTSC. I'd love to see a 1080i signal shown at 540p! Hmm, super widescreen, 1920x540.
frame one is like this
1 picture data
3 picture data
frame two is like this
2 picture data
4 picture data
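The odd/even layout sketched above amounts to a "weave" deinterlace: interleaving two 540-line fields back into one full frame. A toy illustration (line labels follow the comment's numbering; the function name is just for this sketch):

```python
# Weave two interlaced fields (odd lines, even lines) into one frame.
def weave(odd_field, even_field):
    frame = []
    for odd_line, even_line in zip(odd_field, even_field):
        frame.append(odd_line)   # lines 1, 3, 5, ...
        frame.append(even_line)  # lines 2, 4, 6, ...
    return frame

field_one = ["line 1 picture data", "line 3 picture data"]
field_two = ["line 2 picture data", "line 4 picture data"]
print(weave(field_one, field_two))
# ['line 1 picture data', 'line 2 picture data',
#  'line 3 picture data', 'line 4 picture data']
```

Note this only reconstructs the full frame cleanly when the two fields come from the same moment in time; with motion between fields you get the combing artefacts discussed elsewhere in the thread.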
"all fixed-panel display" are certainly NOT progressive scan only, but to do the job of interlaced display properly it's probably best to display line-for-line, as 1080 and 540 screens do. I have a Sharp "P" series TV at home that's perfectly happy to show interlaced content without deinterlacing (though it does have the option to do so if you want), and I have a JVC DT-V24 at work that does likewise. Either way, the original AC poster got his facts ALL wrong - anyone buying a screen that can't display without scaling is wasting their money. And can we stop all this 720 is better than 1080 bollocks? It isn't - the fallacy comes from the idea that all interlaced (ie 1080i) material needs to be vertically filtered to minimise flicker, but this vertical filtering is by no means consistently applied and - of course - progressive scan material such as most movie discs or transmissions neither need nor use it. 1920x1080 is the best format available, bar none.
John, your terminology is wrong.
"1080i has 540 lines per frame, true" INCORRECT. 1080i has two FIELDS per frame, each field being 1920x540, the temporal resolution of each field being double that of each frame. I'm not a fan of interlacing, but there seems to be an AWFUL lot of confusion about what it is and how it works out there in vulture land.
The shift from a TV 3 foot deep to one 6 inches deep served a practical purpose. This is hardly a revolution. I really don't see the point unless you've got more money than sense.
The market will eventually shift this way anyway, and then all TVs will progressively get thinner, so don't waste your money paying a premium for this (unless you for some reason need to stand a TV on your mantelpiece).
...many LCD and plasma TVs (and set-top boxes) have RS232 ports. It's often a service port for programming, diagnostics and updating firmware.
the 37" version is full HD
which probably makes sense
Now what sort of a word is that?
As another poster said, it's for servicing.
But it's got a USB port -- why not just that?
Who were the early adopters of flatscreens? Airports, with 486 Windows 95 or NT 3 displays -- pre-USB. Replacement screens for this sort of legacy system are still a potential market, so there will always be a need for RS232 in some models. As most units from a particular manufacturer use the same base PCBs, RS232 support is integrated; the only option is whether to put a socket on it or not. The socket costs pennies to the manufacturer and saves the hassle of multi-versioning their boards: retooling the assembly lines and managing parallel stocking. For this, their commercial customers can buy any model they like with minimal hassle. If it's hard to find a suitable model in your line, they'll go elsewhere. The cost of the RS232 is perfectly justifiable when you consider how many screens there are in your average airport, and how many airports there are in the world...