Good article. I know nothing about Wireless HDMI, though I've seen a few devices advertised here and there and never paid any mind to how they operate, or what the pros/cons are.
We’re now all used to HDMI, the digital equivalent of the once-ubiquitous Scart connector, the latest versions of which are capable of supporting multi-channel audio, 3D TV and a return channel, so you can feed the sound back from your TV’s tuner to a surround sound system. It has, to be honest, some rough edges here and there …
Good article but there is another manufacturer you didn't mention, namely brite-view... I use their AirSync HD product which is great (see: http://www.brite-view.com/air_synchd.php).
Uncompressed full 1080p, and with line of sight (LOS) you get around 50-60 metres of range; with walls it's down to around 20-30 depending on their composition, but that's still good IMO.
The transmitter has an HDMI pass-through which is fab... as an example, I have the Xbox in the living room running through it and can either play on the TV in there or go play in the bedroom, much to the relief of the other half.
Only problem is they're based in the US, and while their kit is EU certified they have no EU distribution just yet, so you'll have to get them to ship it to you... then you'll need UK-to-US plug adapters (or step-down transformers), but the PSUs supplied auto-switch between 110 and 230 V so socket adapters work fine.
Anyway, go wireless!
Urg, arguments based around 'uncompressed is better' make me angry.
The Brite-view product is based on the WHDI 'standard' (in inverted commas because it's the standard for a handful of manufacturers, while other manufacturers have adopted various other standards as described in the article) which is built on the Amimon technology. This is often touted as being able to stream uncompressed HD and people get very excited about it because of this. While this is technically true, it kind of misses the point.
Amimon actually have a pretty good whitepaper explaining their technology (http://www.amimon.com/PDF/tech_article%20final.pdf). It's a recommended read, but it helps if you know who Shannon was. The main thing to take away is that in all but the most perfect of radio environments, their system will introduce distortion to the picture. Admittedly the distortion will occur first on the least-noticeable parts of the sequence (i.e. the high-frequency information), but describing it as 'uncompressed' gives the impression that distortion will not be introduced, and this is why it's misleading.
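The Shannon point is easy to sanity-check with some back-of-the-envelope arithmetic. Here's a quick Python sketch; note the 40 MHz channel width and 20 dB SNR are my own illustrative assumptions, not figures taken from Amimon's whitepaper:

```python
import math

def shannon_capacity_bps(bandwidth_hz, snr_db):
    """Shannon-Hartley limit: C = B * log2(1 + SNR)."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

# Raw bit rate of uncompressed 1080p60 video at 24 bits per pixel.
raw_1080p60 = 1920 * 1080 * 60 * 24  # ~2.99 Gbit/s

# Assumed channel: 40 MHz wide, 20 dB signal-to-noise ratio.
capacity = shannon_capacity_bps(40e6, 20)  # ~266 Mbit/s

print(f"Uncompressed 1080p60: {raw_1080p60 / 1e9:.2f} Gbit/s")
print(f"Channel capacity:     {capacity / 1e6:.0f} Mbit/s")
# The raw stream exceeds this channel's capacity by an order of
# magnitude, so something has to give -- hence the graceful
# degradation of high-frequency detail rather than true losslessness.
```

However you tune the assumptions, a full uncompressed HD stream sits well above what a realistic radio channel can carry, which is exactly why "uncompressed" is the wrong mental model.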
Compression with H.264 also introduces distortion, of course, but again encoders are optimised to distort the least-noticeable parts of the sequence first. Bear in mind, though, that both Blu-ray and off-air sources are already compressed, to around 30Mbps in the first case and <20Mbps in the second. Recompressing this at 40Mbps+ with a good encoder then streaming over WiFi is going to introduce no perceptible distortion. The other issue is latency. The Amimon tech has almost no latency, but contrary to their literature, H.264 compression can be done with very low (in the order of 1 frame) latency.
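To put a number on "in the order of 1 frame": a one-frame encode delay is only a handful of milliseconds at common frame rates, as this trivial sketch shows:

```python
def frame_latency_ms(fps, frames=1):
    """Delay contributed by holding `frames` frames at `fps` frames/sec."""
    return 1000.0 * frames / fps

for fps in (24, 50, 60):
    print(f"{fps} fps: {frame_latency_ms(fps):.1f} ms per frame")
# At 60 fps a single frame is ~16.7 ms -- well under the ~100 ms
# ballpark usually quoted as noticeable input lag for gaming.
```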
Don't get me wrong -- the Amimon solution is very nice, and with line of sight it can deliver perceptually lossless video at imperceptible latency (particularly good for gaming), but their continued feeding of the myth that 'compressed is bad' really irritates me. I suppose it's the easiest way for their marketing people to summarise the above PDF for lay people, but that's no excuse!
OK, so output from HDMI I can get, but how does the receiver HDMI dongle get power? Does HDMI carry power like a USB dongle? Or are we talking about units with the tech built in?
For instance, if my TV doesn't have WHDI or whatever it is, how will the HDMI dongle get powered?
I ask as at the moment I have HDMI via a 15m cable from the PC in the corner of the lounge to my HDMI amp on the other side of the lounge. This was done initially to play games on the home cinema from the PC. I am looking at relocating the PC somewhere in the lounge that I cannot really cable to the amp from.
Since the Blu-ray or HD source is probably encoded anyway, doing a decent H.264 encode isn't going to affect quality enough to be noticeable to most people (apart from those who spend £100 on an HDMI cable, of course). That means you can send the signal over 802.11n, which is a much better solution. It also means much cheaper devices and much better range - and fewer problems with spectrum licensing worldwide. Latency could be an issue, but that's down to decent encoding more than anything else. Seen it done with 720p and it was very impressive.
Those two sets of devices don't rely on prerecorded material and generate their footage (especially gaming footage) on the fly. Not only that, some of them can be very timing-sensitive (to the point that many TVs now have a no-frills Gaming Mode to trim the display lag) or produce high-intensity scenes that may not compress very well.
You know they have PMP adaptors that transmit over some local FM radio channel. Why not just do the same thing with video? Put it in the same MPEG2 format that you would receive it in for broadcast purposes. Just use an unused local channel and "broadcast" in whatever the local TV standard is.
MPEG2 is plenty useful. MPEG2 broadcast streams are quite often better quality than h264 cable streams. While MPEG2 is kind of old and triassic, it also doesn't require supercomputing resources to deal with. If you can't be bothered with wires, then achieving the fullest possible potential of bluray is probably not a big priority.
According to HDTV specifications, the MPEG-2 broadcast stream is limited to 19Mbit/sec. That's only enough for a 1080i60/30 stream. You can handle a 1080p film (at 24Hz) OK, but anything faster than 30Hz is beyond what broadcast HDTV can carry. Not to mention the most likely candidate to break that limit, gaming, will probably involve timing issues (because it'll take time to encode MPEG2, even on the fly) that could affect the gaming experience.
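A quick sketch of why that broadcast cap is so tight: the compression ratios implied by it for a few source formats (raw rates assume 24 bits per pixel; 19.39 Mbit/s is the ATSC transport payload figure the ~19 Mbit/sec limit comes from):

```python
BROADCAST_CAP = 19.39e6  # bit/s, ATSC MPEG-2 transport payload

def raw_bitrate(width, height, fps, bpp=24):
    """Uncompressed video bit rate in bit/s."""
    return width * height * fps * bpp

for name, fps in (("1080p24", 24),
                  ("1080i60 (30 full frames/s)", 30),
                  ("1080p60", 60)):
    raw = raw_bitrate(1920, 1080, fps)
    print(f"{name}: {raw/1e9:.2f} Gbit/s raw, "
          f"{raw/BROADCAST_CAP:.0f}:1 compression needed")
# 1080p24 needs ~62:1; 1080p60 needs ~154:1 -- far beyond what
# MPEG-2 can deliver at watchable quality within that cap.
```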
I'm with Bracken. There are plenty of good reasons for wireless connectivity, but not being bothered to hook up a 3m HDMI cable between two completely static devices is not one of them. There isn't an infinite amount of wireless bandwidth available; I live in fear of a neighbour getting a wireless TV extender that lives in the same band as my existing 802.11 and will cripple my bandwidth through the wall. Even the 60GHz solution has the ability to interfere, depending on what your walls are made of.
If you're going to stream video wirelessly, better to transmit the original h.264 bitstream and decode it at the display device. Decoding it first and sending it raw wastes vast amounts of bandwidth; decoding it and recompressing it will introduce horrible artifacts (ever used a DVD recorder on the output of a digital receiver?). Better yet, just use some cables, then they won't interfere with (or be interfered with by) anything. Doing this wirelessly is a solution to a problem that doesn't exist nearly so much as the average consumer might think it does, and people have been trying to punt various versions of it for years - fortunately with limited success.
If you really want to get an uncompressed video from the back of the room to a display at the front, make your neighbours happy: buy a projector.
“If only HDMI didn’t need a pesky wire to connect things up.”
I don't know. The reviews of TVs now always complain that they don't have enough HDMI sockets (or they are on the back, not the best place to plug in your camera / laptop).
Wireless HDMI (in theory) could overcome the limits of how many HDMI devices you are allowed to own (more than one console, Blu-ray, AV processor, media tank...). It would also make impromptu demonstrations from a laptop or camera easier.
But the downside is that public displays will be hijacked like in Iron Man!
Of course, you'd still have to buy a wireless DisplayPort to wireless HDMI adaptor to interface the slightly different interpretation of wireless HDMI from an Apple MacBook to your TV :)
There are times when the TV is changed out before the amp. In which case, the TV supports HDMI while the receiver doesn't. Most TVs support an audio out of some form, which you can then rig to the amp. Not the best solution in the world, but when it's the picture you're concerned about more than the sound, it's better than nothing.
Our big telly sports three HDMI inputs, but the AV amp only switches component video. However, the TV has an optical audio output that I've fed back to the amp, and that actually works quite nicely. I have far fewer issues with audio sync than when I used to have the amp connected directly to the Sky HD box.
I can see two main uses for the technology and it's not hooking up the V+ to the TV given that they're within a metre of each other...
The first is to hook up devices that do move, i.e. your laptop or your HD camcorder. But I don't think that's a killer application.
The second is to hook up the V+ box to a projector. Clearly a projector isn't going to be just 1 metre from the V+ box (unless you only have a projector and designed the whole room around that). Trailing a cable across the room from box to projector (high up, wall or ceiling mounted) would be either difficult or aesthetically unpleasant (and a trip hazard). Doing it wirelessly would require less plastering and leave less chance of tripping.
As you say, both of these are potentially useful, but projectors aren't that common in the UK, certainly.
It will be interesting to see what comes of the Wireless Gigabit standards - they may potentially allow things like the laptop talking to the TV for display purposes with the same chip that's being used for wireless net access.
I've only used projectors for the last 7 or 8 years and found a fairly simple solution is to put everything at the back of the room. My projector, amp, games consoles, Freeview box, etc. are all housed in a unit behind the sofa, meaning only short cables are needed and I can use the entire opposite wall as a screen.
The "downside" is I can't use Kinect, but I don't really care. I use a wireless sensor bar for the Wii, but that's just a pair of IR torches. It doesn't emit EMR in a range where it'll affect anyone's data.
My concern with wireless HDMI (beyond not needing it) is that I'd expect the picture to degrade if there was any interference from surrounding flats, etc. I take it there's sod all you can do in that situation, apart from moving back to using cables. Is that right?
Can't wait to see how brilliant these things are in a high density environment. In an apartment block, if this ever took off, it would be a nightmare. It's bad enough with WiFi, and that's not transmitting all the time!
Also, as previously pointed out, it's dumb as hell to be re-encoding content and then transmitting it; just shuffle the encoded content over the air so quality isn't compromised.
I can see these things working if they join the WiFi fabric and stop trying to be an HDMI replacement - just make them a transport for encoded content! Could work well for laptop/projector use as well.
We'll go in reverse order.
Cons... it WILL give you cancer (pulsed microwave radiation is used to splice up DNA in PharmaTek Labs).
Pros... you will be able to see your cancer in high definition on TV :D (whilst you are left lying bedridden and dying in your bed).
Don't ever believe the crap the industry pumps out; it's the same misinformation as the tobacco industry used. The only difference is you don't even need to be in the same building to be affected.
"If, that is, you’re one of the few people who’s found a problem that can be solved by wireless HDMI."
While admittedly this does not happen often, I sometimes want to hook my laptop to the TV to show something to many people at once. At the moment, this involves a long, partially hidden HDMI cable which restricts the position of the laptop to the side table where the cable ends. (Movies/video clips are not a problem; I have a WDTV Live plugged into both the hi-fi stack and the home network.)
The same problem also arises if I want to show something from the PC.
And yes, the situation doesn't arise often - but often enough to be a pain in the a## when it does.
The Quicklink only supports 720p. Also, I am guessing it gets its power from the charging base, meaning it has to be unplugged from the HDMI and charged up regularly (or requires a battery change). Since a lot of HDTVs have USB connections these days, I don't see why they can't just use that to send/receive the signal. At least you can power the USB device from the port. It should be plenty fast enough for a bit of HD video.
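Whether USB is "plenty fast enough" depends entirely on whether the video stays compressed. A quick check, assuming USB 2.0's 480 Mbit/s signalling rate (real-world throughput is lower) against a generous 40 Mbit/s compressed HD stream and raw 1080p60:

```python
USB2_SIGNALLING = 480e6   # bit/s, USB 2.0 high-speed signalling rate
COMPRESSED_HD = 40e6      # bit/s, a generous H.264 HD stream (assumed)
RAW_1080P60 = 1920 * 1080 * 60 * 24  # ~2.99 Gbit/s uncompressed

print(f"Compressed HD fits over USB 2.0: {COMPRESSED_HD < USB2_SIGNALLING}")
print(f"Raw 1080p60 fits over USB 2.0:   {RAW_1080P60 < USB2_SIGNALLING}")
# Compressed: yes, with room to spare. Raw HDMI-style video: no,
# it's roughly six times the bus's theoretical maximum.
```

So a USB dongle works fine as long as it receives a compressed bitstream and decodes locally, which is exactly the "send the encoded content" approach argued for above.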
I was in Toronto a short while back staying over in an apartment on the 15th floor of a building. Scanning for service I found I had 71 WiFi signals detected using a TP-Link directional antenna. These included 5 coffee shops, two McBarf outlets (not the same one), and other commercial establishments along with all the domestic installations.
There are simply insufficient channels to accommodate all these co-located signals.
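The arithmetic behind "insufficient channels" is stark: the 2.4 GHz band has 11-13 numbered channels depending on region, but they overlap, and only three (1, 6 and 11) are fully non-overlapping. Using the count from that Toronto scan:

```python
networks_seen = 71            # WiFi networks detected from the flat
non_overlapping_channels = 3  # channels 1, 6 and 11 in the 2.4 GHz band

per_channel = networks_seen / non_overlapping_channels
print(f"~{per_channel:.0f} networks contending per clean channel")
# Roughly 24 networks fighting over each usable channel -- before any
# always-on wireless HDMI kit joins the party.
```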
Likewise, at a very nice hotel in the Far East, all the rooms had TV and audio equipment from the same manufacturer and every so often the programme would be interrupted by signals from an adjacent room.
Things are unlikely to improve from a housing-density point of view, so it is incumbent upon the standards people to enable systems that are in close proximity to operate without interference. Infra-red, possibly, is an answer.
Biting the hand that feeds IT © 1998–2020