LTE can knock out cable TV, according to new analysis funded by Ofcom, which is bad news for anyone who likes to talk and watch at the same time. The study confirms suggestions made last year that LTE deployments could interfere with cable TV. Ofcom's research found this could indeed happen if the LTE handset is exactly the same …
Did this as a test with my handheld transmitter. It only kicks out 4 watts, but it killed the box until I de-keyed.
It was somewhere in the 400 MHz band.
Only 4 Watts?
25 dBm is 316 mW; you were pumping more than 10 TIMES the power into your STB. I don't see this as any meaningful datapoint.
It is exceedingly poor design on the STBs' side not to shield their equipment well enough to withstand 316 mW at 1 m. The cable companies have no one but themselves to blame.
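As a sanity check on the numbers in this sub-thread, dBm-to-milliwatt conversion is just 10^(dBm/10). A minimal sketch (the 25 dBm and 4 W figures come from the comments above):

```python
import math

def dbm_to_mw(dbm: float) -> float:
    """Convert a power level in dBm to milliwatts."""
    return 10 ** (dbm / 10)

def mw_to_dbm(mw: float) -> float:
    """Convert milliwatts to dBm."""
    return 10 * math.log10(mw)

lte_max_mw = dbm_to_mw(25)     # LTE handset maximum: ~316 mW
ham_rig_mw = 4 * 1000          # the 4 W handheld above, in mW

print(round(lte_max_mw))                   # 316
print(round(ham_rig_mw / lte_max_mw, 1))   # 12.6 -- over ten times the power
```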
I guess they could back-track!
Keep the bandwidth for TV, expand freeview HD and give the platform a future.
Sounds similar to the issue between Hams and homeplug users
although in this instance it's going from the wireless world to the wired.
Perhaps before we go too far, some new standards need to be introduced (and enforced) for all signal-related kit. It won't silence what is already in use, but after a little while it will stop the problem getting worse, and as the old stuff dies off things will get better.
Either that or fibre not just to the home, but all the way into the back of the TV!
"[a]ll of the STBs were found to have significant rectangular holes (apertures) in the metalwork that can allow unwanted frequencies to pass through"
Block up these holes and I can fry me supper while I watch the footie.
And power efficient too.
They could have easily solved the airflow problem and maintained a reasonable RF shield by the use of some metal mesh or grilles... as long as the gaps are much smaller than the wavelength of the 4G signal, then you have win!
I admit this probably does add to cost/complexity somewhat, but it's not rocket surgery.
I reckon it's more like brain science.
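The mesh/grille idea can be sanity-checked with a wavelength calculation. A common shielding rule of thumb (my assumption here, not from Ofcom's report) is that apertures should be no more than about a tenth of a wavelength; for the 800 MHz LTE band:

```python
C = 299_792_458  # speed of light in a vacuum, m/s

def wavelength_m(freq_hz: float) -> float:
    """Free-space wavelength for a given frequency."""
    return C / freq_hz

lam = wavelength_m(800e6)        # 800 MHz LTE band
max_gap_cm = lam / 10 * 100      # rule-of-thumb aperture limit, in cm

print(round(lam, 3))         # 0.375 m
print(round(max_gap_cm, 1))  # 3.7 cm -- easily enough room for airflow holes
```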
On a serious note, I'm curious to know whether the problem constitutes an EM susceptibility fail under CE-marking approvals, or whether the field strength is above CE test levels. If the former, blame the STB/cable modem vendors; if the latter, blame everyone else.
Cable viewers, please note
Don't sit less than a metre from the big screen TV you've got when on the blower. Simples.
"standards need to be introduced (and enforced)"
The important bit is *enforced*.
The standards (the CE EMC standards) are here today but there are significant vested interests who prefer not to see them enforced (importers of cheap Chinese junk, BT Vision, powerline networking in general, the list goes on).
A similar issue is DECT phones and satellite IF cable. They can knock out 5 to 10 channels. Very difficult to cure if you have the problem.
It can be the screening of the coax or a slightly "iffy" connector. Not just the box screening. http://www.techtir.ie/blog/watty/dividend-hurts-cable
Cable TV... Broadband?
So would this affect the broadband signal as well, which IIRC is squirted down the same line in the same format?
Surely Virgin are OK?
"Cable TV uses the same frequencies squeezed down a wire, but if the LTE signal is strong enough it can sneak into the wire too, generating interference."
<tongue in cheek>
I read some Virgin press-release / adverts that they use fibre, so they will be OK, no?
RE: Surely Virgin are OK
Most cable companies provide fiber to the house; the inside of the house still uses coax.
Actually it's Fibre To The Cabinet currently and coax to the house - it's still better than ADSL, though, unless you live in certain specific places.
Re: RE: Surely Virgin are OK
You felt the need to explain that even though the OP had both clearly stated it was meant "tongue in cheek" and used the "coat" icon to boot?
Here's another icon for you to ponder the meaning of....
Colour me surprised...
Although good luck finding that felt tip pen.
DECT phones have been known to interfere with Sky boxes and whatnot... surely a simple mitigation would be some form of Faraday shield? Wrap it in tinfoil, perhaps?
Shabby Cable equipment
Stuff leaks noise everywhere.
Defective cable TV wiring can even block air traffic control frequencies
In some countries cable TV systems use signals in the aeronautical RF band, and if a cable defect, such as a shallow-buried drop to a house getting 'shaved' by a lawn mower, is in the vicinity of an airport, this interference can be quite severe because of the decreased ground/aircraft separation.
Guard bands have a purpose, and selling them off in the interest of enriching a country's treasury may prove to be a dangerous/expensive thing.
The easy solution, IPTV.
This was an issue with analog cable service when it used the same frequency as the broadcast channel. You would get a ghost picture, and the cure, besides going digital, was to use a different frequency so that the ghost would not appear.
Just skimmed the report...
They *only* tested at 1 m separation, but the required power to cause the same interference should change with 20log(d), where d is the separation in metres (that's just free-space path loss expressed in dB). No idea what that means in practical terms.
When they are analysing the STB and CM design (page 45), they say that when they talk about "slots" and "apertures" they mean rectangular openings (maybe containing plastic connectors), NOT "grids of small cooling holes". It seems that the main problem is holes for connectors where the connector itself isn't surrounded by shielding.
They refer to the different boxes by codes throughout. They don't (AFAICT) say which box is which in any identifiable way. They do describe the internal design in reasonable detail in Appendix C, though. I guess someone dedicated enough could work it out.
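The 20log(d) term a couple of comments up is free-space path loss: field strength falls as 1/d, so power falls as 1/d², which works out to 6 dB per doubling of distance. A quick sketch of what the report's 1 m test figure implies at other separations (the 3 m example is my own illustration):

```python
import math

def extra_power_db(d_metres: float, ref_metres: float = 1.0) -> float:
    """Extra transmit power (dB) needed at distance d to produce the same
    field strength as at the reference distance, assuming free space."""
    return 20 * math.log10(d_metres / ref_metres)

print(round(extra_power_db(2.0), 1))  # 6.0 -- doubling distance costs 6 dB
print(round(extra_power_db(3.0), 1))  # 9.5 -- at 3 m versus the 1 m test
```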
This is the sort of bureaucratic twaddle that pisses people off.
Bureaucrats are allegedly paid to work these details out BEFORE making decisions to sell spectrum, not sell the spectrum and see where the crap flies.
What's the problem?
First, this is a problem at 1 m distance, when the phone is at maximum power. I can tell you that my phone (CDMA, admittedly) typically operates at -20 to 0 dBm (1/100th of a mW to 1 mW). I would guess you'll find there are very few locations where a phone would EVER reach max power; I've seen it at one rural location (i.e. in a place like this you probably wouldn't have cable anyway). In the case of thick buildings, basements, elevator shafts, etc., I've found my phone loses reception WELL before it's commanded to use maximum transmit power (I'm sure this is due to the cell site's larger antenna, as well as it having a much more expensive and better receiver and amplifier than any phone could afford to have in it). And even then, just don't set the phone on your TV. Simple!
Second, wouldn't this ALREADY be a problem with the current GSM mobiles? I know here in the states, between analog TV, digital TV, and cable internet, it tends to use everything from 0 to at least 1 GHz. I would think if they pulse so hard you can hear that "blap-blap-blap-buzzzzz" from speakers and such, it'd really interfere with nearby cable boxes too.
Regarding the fiber -- albeit a joke, it could help. A shorter coax run would mean you're more likely to be getting higher signal strength, so interference would have to be stronger to, well, interfere. Also, it could mean a lower bit error rate compared to longer coax runs, multiple chained amps, or the like -- if less error correction capability of the digital cable video stream is "used up" correcting errors already in the received signal, it can be applied instead to correct errors from minor interference.
"I'm sure this is due to the cell site's larger antenna"
That antenna is used both to transmit and receive - so if it results in receiving signals further inside the building then the commanding signal from the base station antenna will also get further into the building!!! The transmit signal of the device may also increase to try and overcome interference - it's not only a signal strength issue. So there will be places where maximum TX power is used, especially indoors where most Cable TV boxes sit the last time I looked! That said most handhelds struggle to get within 3dB of the maximum power stated in standards due to crap design, so that issue may save the day.
Well designed networks are balanced or only have 1 or 2 dB imbalance between uplink and downlink - to do otherwise would be a waste of money/resources. Essentially bigger transmit power (than the mobile) from the base-station is balanced by better reception (than the mobile) at the base station by using more (not bigger) antennas and other techniques.