HDMI cables used to connect a multitude of everyday home audio and video devices may become a thing of the past if a new standard by the HDBaseT Alliance catches on. "HDBaseT technology is poised to become the unrivaled next-generation home networking transport to meet the ever-changing trends in the digital media market," …
PoE+ is about 25W; some non-standard systems that use all the wires manage 50W, but 100W without using dangerous voltage levels?
So when do the special super-duper $200 6' cat6 monster cables come out?
Ask Denon, it will probably be $1,999 though...
over 100W is possible
I designed a PoE system for a major Japanese supplier which put over 100W down the cable, using 3 of the 4 pairs, the other pair was audio data.
All the voltages were ELV safe (i.e. touchable) and the current density was within NASA's safety limits.
It dissipated about one Watt per metre in the cable at full load, that's the problem, there's hardly any copper in CAT5 cable.
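The "one watt per metre" figure is easy to check as a simple I²R estimate. A minimal Python sketch, assuming a 24V ELV supply and standard 24 AWG Cat5 conductors at roughly 0.084 ohm/m each (both figures are my assumptions, not from the post):

```python
# Rough I^2 R check of the "one watt per metre" figure.
# Assumed (not from the original post): 24 V ELV supply, 24 AWG copper.

def cable_loss_per_metre(load_w: float, supply_v: float,
                         pairs: int = 3,
                         ohm_per_conductor_m: float = 0.084) -> float:
    """Watts dissipated per metre of cable feeding load_w at supply_v.

    The power pairs give `pairs` conductors out and `pairs` back,
    so loop resistance per metre is 2 * r / pairs.
    """
    current = load_w / supply_v                   # total DC current, amps
    loop_r_per_m = 2 * ohm_per_conductor_m / pairs
    return current ** 2 * loop_r_per_m

print(round(cable_loss_per_metre(100, 24), 2))    # ~0.97 W/m
```

At 24V the sketch lands almost exactly on 1 W/m for a 100W load over three pairs; doubling the supply voltage would cut the loss to a quarter, which is why there is so little headroom with the thin copper in CAT5.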
Getting all this and Gbit data will be a trickier prospect though. It would be most useful if they do it. Best of luck to them.
On the one hand, I'm sure they'll spend more time worrying over rights restrictions than making sure things actually work. On the other hand, HDMI is a fail and a half, with cables that can't hold themselves in and all that.
How are they going to persuade people to buy gold-plated oxygen-free Cat 5 cables for $100?
Oh wait, I know! The same way they do with HDMI: use a VCR for the "non-Monster-cable" TV :D
Won't be popular
Shops won't want to stock it; most people are familiar with Ethernet cables.
Why, Why, Why...
Didn't they do this originally? Look, HDMI cables are nice and all that, but why not have nice easy RJ45 (8-pin modular) connectors in the first place? It would have saved EVERYONE lots of grief with expensive connectors and difficult-to-place terminations.
Actually a better system might be to have a SINGLE coax connector for the video for this hi-def stuff (junk). Much easier to handle, just like the NTSC/PAL composite connector that was usually yellow and paired with left and right audio.
Great news: we finally have Gig-E to link to the cameras rather than $500 Camera Link cables and grabber cards, and now we can get away from multiple DVI-D to do 120Hz 1080p to the screen.
ps. The $500 ethernet cable is already here http://www.usa.denon.com/productdetails/3429.asp
I gotta get me some of that...
$500 (£350 or so if you Google it in Blighty) for a 1.5m patch cable. Now THAT is impressive.
Finally something decent
I'm sorry, but I have tried HDMI and it just won't work. I have a run of a mere 10 meters, but I already need a repeater, and even then I cannot do 1080p. It's really a shame.
Works for me
I also have a 10m HDMI cable. I think it was the cheapest one I could find, and was from somewhere like Dabs. It just works. No problem, even at 1080p.
However, I had problems with stupid Blu-ray playback and messages (in 1080p) telling me that I wasn't privileged enough to watch the Blu-ray film I stupidly paid for!! (It actually turns out I was privileged enough, I just needed some software on my laptop to read the disc directly and re-create the video without the stupid messages!)
But then Displayport was supposed to replace HDMI and hasn't.
Same connector, different use
This is such a bad idea I want to cry. Didn't we learn anything from the PS/2 ports? Ugh.
HDMI does work. This new standard is going to fail consumer-wise.
It will be very useful only in business environments, removing the need to buy multiple HDMI->Ethernet switchers/converters, although for years converting will still be needed to attach current HDMI-equipped hardware until HDBaseT ports are included too.
@Christian Berger: you should use a properly shielded HDMI cable certified for 1.3a or 1.3b. I have a 12-metre 1.3b Atlona flat HDMI cable for my 1080p Sony projector and it works flawlessly: no repeaters needed, no dropouts, and HDCP works smoothly with no disconnections.
2Kx4K standard (4096-by-2160)
is it just me - or is that name the wrong way round?
I have always thought so
I always called 1024x768 "1024 res", 800x600 "800 res", 640x480 "640 res", etc. (as did my friends).
When I first saw 1080p I assumed it would be ITRO 1080x800, but no, "they" count the lines as more important. Therefore this 2Kx4K is probably right for "them", and will likely become known as something like "2Ki"
Stupid, useless fecking connection standard
This is nowt more than some big players trying to get licensing rights on something so they make more money. They get this in, then everyone else has to start paying them a licence fee to put the new ports on the back of their products, and we'll all need to go out and buy new TVs, Blu-ray players, consoles etc. as our old 'uns will be incompatible.
Then, just as we get used to the new standard, having spent a fortune getting the new kit some other consortium will come up with something else and the process will start again.
Re-inventing the wheel ?
Isn't this "just an inverted HomePlug" protocol proposal ?
Put some UPnP love into each of the devices - connect to the home electricity supply...
So this is going to carry HDMI (which may be up to 10Gbit/s - they reckon they can scale to 20Gbit/s), plus Gbit network, plus 100W of power... all over a Cat5e cable? Huh; a 2-foot-long one, maybe. If you're lucky. This might work for short-distance interconnects (eg devices in a media center), but I just can't see this working for any distance. Cat6a needs all four pairs just to push 10Gbit. Umm.
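The per-pair rates implied here are easy to sketch. Assuming HDMI 1.3's peak TMDS throughput of roughly 10.2 Gbit/s (3 channels at 3.4 Gbit/s) and the four twisted pairs in a Cat5e/Cat6a cable (both figures assumed for illustration):

```python
# Back-of-envelope per-pair rate check.
# Assumed figures: HDMI 1.3 peak ~10.2 Gbit/s; 10GBASE-T carries
# 10 Gbit/s as ~2.5 Gbit/s on each of 4 pairs over Cat6a.

HDMI_GBPS = 10.2       # 3 TMDS channels x 3.4 Gbit/s
PAIRS = 4              # twisted pairs in the cable

per_pair = HDMI_GBPS / PAIRS
print(f"{per_pair:.2f} Gbit/s per pair")   # 2.55 - right at 10GBASE-T's limit

# The claimed 20 Gbit/s ceiling would need double that per pair:
print(f"{20 / PAIRS:.1f} Gbit/s per pair")  # 5.0
```

So even the 10 Gbit/s case sits at the edge of what 10GBASE-T manages over Cat6a, which is the basis for the scepticism above.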
Perfect for watching YouTube
Where is the need to send 21.6Gbps of video long distances? Not even in 10 years will there be a consumer device that can source that much REAL data. It makes more sense to send the video in its original storage format, which will remain 5 to 100Mbps for quite some time, and decompress to high bitrate pixels at the point of display.
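The gap between raw pixel rates and the compressed stream they were decoded from can be sketched with a few multiplications (frame sizes and the 40 Mbit/s Blu-ray peak are illustrative assumptions):

```python
# Sketch: uncompressed pixel rates vs the compressed source stream.

def raw_gbps(width, height, fps, bits_per_pixel=24):
    """Uncompressed video bandwidth in Gbit/s."""
    return width * height * fps * bits_per_pixel / 1e9

print(round(raw_gbps(1920, 1080, 60), 2))   # ~2.99 Gbit/s for 1080p60
print(round(raw_gbps(4096, 2160, 24), 2))   # ~5.1 Gbit/s for "2Kx4K" at 24fps

# A Blu-ray stream peaks around 0.04 Gbit/s (40 Mbit/s), so shipping
# the compressed stream and decoding at the display is far cheaper:
print(round(raw_gbps(1920, 1080, 60) / 0.04))  # ~75x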
100W over Cat 6 - it would take about 45V @ 1.2A on each of the four pairs to survive 100m. Fire icon, because that's what would happen if you had a coil of extra HDBaseT behind your TV.
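Those figures can be sanity-checked with Ohm's law. A minimal sketch, assuming 23 AWG Cat6 conductors at roughly 0.067 ohm/m each and the four pairs forming two out-and-back loops (the gauge and topology are my assumptions, not the commenter's):

```python
# Quick check on the 45 V / 1.2 A-per-pair figure over 100 m of Cat6.
# Assumed: 23 AWG conductors (~0.067 ohm/m), 4 pairs = 2 current loops.

R_CONDUCTOR = 0.067 * 100          # ohms per conductor over 100 m
R_PAIR = R_CONDUCTOR / 2           # two conductors in parallel per pair
R_LOOP = 2 * R_PAIR                # one pair out, one pair back
I_PER_LOOP = 1.2                   # amps per loop
V_SOURCE = 45.0
LOOPS = 2

drop = I_PER_LOOP * R_LOOP                     # volts lost along the cable
delivered = (V_SOURCE - drop) * I_PER_LOOP * LOOPS
wasted = I_PER_LOOP ** 2 * R_LOOP * LOOPS      # heat dumped into the cable

print(f"delivered ~{delivered:.0f} W, dissipated ~{wasted:.0f} W in the cable")
```

Under these assumptions roughly 19W ends up as heat spread along the run to deliver on the order of 90W, which is exactly why a tight coil of spare cable behind the TV earns the fire icon.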
100W not enough for a big tv??
Why would the power need to go toward the highest power-drain device in the network? Why not have the TV plugged into power and HDBaseT, and let the 100W originate at the TV? Then it can power the Blu-ray player / media centre / other low-power devices. Really, if your writing is earning you a living you ought to be able to figure out things like this.
One look at the major players (Samsung, a TV/Blu-ray maker; Sony, a TV/Blu-ray/PS3 maker) tells me this is going to happen. Other manufacturers already pay a premium for HDMI; if the premium is the same for HDBaseT then we have a winner. Even if the premium is a little higher, consumers will mop it up because the cables are so cheap. If you think this is not going to become the next standard you're nuttier than squirrel shit.