The US Federal Communications Commission says it wants to reclaim "over" 150 MHz of spectrum from television broadcasters as part of its plan to beef up the country's wireless broadband. The FCC is scheduled to deliver its national broadband plan to Congress and the president in mid-February, and the man in charge of the plan - …
How does this work then?
Who's going to explain how you use the broadcast TV frequencies for broadband then?
The reason they were used for broadcast TV is they have decent range. The area covered by each transmitter covers hundreds of square miles, and quite possibly millions of people (or, in the middle of nowhere, a few dozen people). You can tinker with lower powers and directional aerials and stuff like that but the fundamentals remain unchanged.
If you manage a decent bits-per-Hz ratio on the kit you may get a few hundred Mbit/s of bandwidth (total, per transmitter) downstream. That works OK shared among a few dozen active customers, not so well among thousands of active customers per transmitter. The more customers per transmitter, the more profitable the service. The fewer customers per transmitter, the better the throughput. Hmmm.
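A back-of-the-envelope sketch of that sharing arithmetic; the spectral efficiency and user counts below are illustrative assumptions, not figures from the article:

```python
# Per-user throughput when one transmitter's capacity is shared among
# all active users. All figures are illustrative assumptions.

def per_user_mbps(spectrum_mhz, bits_per_hz, active_users):
    """Total transmitter capacity split evenly among active users."""
    total_mbps = spectrum_mhz * bits_per_hz  # MHz * bit/s/Hz = Mbit/s
    return total_mbps / active_users

# 150 MHz at an optimistic 3 bit/s/Hz is ~450 Mbit/s per transmitter:
print(per_user_mbps(150, 3, 50))    # a few dozen active users -> 9 Mbit/s each
print(per_user_mbps(150, 3, 5000))  # thousands of active users -> a trickle
```

The even-split assumption is the simplest possible model; real schedulers are more complicated, but the total-capacity ceiling is the same.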
And how does upstream work?
References most welcome. After all, nobody would propose a scheme like this without having thought it through, just to get a pile of VC money for the yachts and things, would they?
most != best
just because america has the most tv in the world I doubt that it classifies as best... Anyone seen fox news?
Paris, because well, who the fuck cares!?
Fox news is not Broadcast TV
FYI: Fox News in the United States is a cable only channel and does not use any broadcast spectrum.
You seem to be mistaking the network topology for the frequency.
900 MHz (the top of the UHF TV band) seems to work quite well for a cellular network - or the 800+ networks have got it badly wrong. The reason TV goes so far in the UHF band is that the masts (towers) are 200 m+ (1,000 ft) high and the transmitters are in the megawatt ERP range. So for dense broadband networks, something closer to the cellular model - 15 m towers, 100 W transmit - should give decent results.
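A rough link-budget comparison of the two models the poster describes, both at 900 MHz; the ERP values, distances, and the free-space-only propagation assumption are all illustrative:

```python
import math

# Free-space path loss comparison: high-power TV mast vs a cellular-style
# site. ERP and distance figures are assumptions for illustration only;
# real terrestrial propagation is worse than free space.

def fspl_db(distance_km, freq_mhz):
    """Free-space path loss in dB (distance in km, frequency in MHz)."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.45

def rx_dbm(erp_watts, distance_km, freq_mhz):
    """Received power in dBm, assuming free-space propagation."""
    tx_dbm = 10 * math.log10(erp_watts * 1000)  # watts -> dBm
    return tx_dbm - fspl_db(distance_km, freq_mhz)

# 1 MW ERP TV transmitter serving a 50 km radius vs a 100 W cell at 2 km:
print(round(rx_dbm(1_000_000, 50, 900), 1))  # TV: strong signal at long range
print(round(rx_dbm(100, 2, 900), 1))         # cell: comparable at short range
```

The point the comparison makes is that a dense grid of low, low-power sites can deliver similar received signal levels while reusing the same spectrum in every cell.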
Buy back? Who were the assholes that sold a perpetual license in the first place? To a public resource none the less. Now taxpayers have to buy it back? Meet the new king, same as the old. I wonder if the US government is embarrassed by their impotence, or they are just too enamored with big business to notice.
This just continues to prove what a joke the FCC - and its auction idea - was in the first place. There's no shared-bandwidth technology because there's no market incentive for it. No need to share what the government gave you a monopoly on (and will apparently buy back).
I thought that we just went through a nightmare of converting Analog to Digital broadcasting to free up spectrum for other uses?
No, don't be silly
It was done to line the pockets of the usual suspects. It's the same with this taxpayer funded boondoggle really.
How is another 150MHz of spectrum going to give everybody fast broadband? Even if mankind invented a holographic electromagnetic modulation that improved efficiency 1000x... Well, high performance fiber optics are here now and the telcos aren't rushing to deliver that. Levine and Kirjner saying that we shouldn't rely on new tech says a lot about US bandwidth plans.
Why steal the bandwidth from broadcast TV? Because everyone uses cable TV instead? Cable TV coax has far more bandwidth yet customers receive a few megabits and threats of disconnection for abusive utilization.
The way the FCC does its job, "the country that has the best broadcast TV in the world" is too lofty a target. All of those $40 DTV upgrade coupons exempted high-definition tuners, thereby ensuring that antiquated MPEG-2 streams will encumber our airwaves for as long as the old copper wires trickle data to our homes.
Indeed, going digital freed up a lot of spectrum. Now the FCC wants to use that spectrum for something other than TV, but the TV companies still own it. To make sure they don't fill it with more TV or something else, the FCC has to get it back and assign it to the use it believes is most important. In a democratic, capitalist economy, the right way to do that is to buy it back from the owners. In a socialist tyranny, the authorities would of course just seize it.
Hey, I just re-wrote the story you just read!
If there is excess bandwidth...
Let's see more tv channels.
My local (las vegas) TV environment has over 20 channels, some in spanish, one in korean. Two of them are shopping channels, a couple run great old movies, two others show classic TV shows.
So the government wants us to PAY for TV also? We need more TV channels, not less.
@vodoo trucker: the "Free" airwaves were assigned to radio and TV broadcasters based on a premise that they would meet certain standards of free entertainment and news as well as public service requirements. When bandwidth is auctioned off, it becomes a license to steal.
FCC eyes 150 MHz of TV spectrum
I am curious as to where they expect to get that spectrum. Originally, TV channels went out to 83. The upper channels then went to 800 MHz cellular and mobile use, which made the upper TV channel 69. And now, with the DTV conversion and last year's FCC auction, the upper TV channel is 52. So where is that spectrum supposed to come from?
Broadcast engineers have already recognized that the low VHF channels (2-6) are not DTV friendly, and that even the other VHF channels (7-13) have some issues. It seems that DTV is at its best when it is on a UHF channel. I personally think that the FCC should have recognized that DTV was not going to work on low VHF channels, and assign channels 2, 3 and 4 for low power analog translators. And then, there are a few people that would like to see channels 5 and 6 used to expand the FM band. That opportunity has been lost.
Now, as I understand it, in the US a DTV channel can broadcast multiple (supposedly up to 6) separate programs. Not all of that capability is utilized, as my own local area exemplifies.
For example, in my area there are 14 separate OTA (over-the-air) stations, two of which I have difficulty getting a stable signal from, and that is a consequence of my antenna limitations (cheap bastard landlord). So, of those stations that I can receive (12):
3 stations (two of them PBS by the way) deliver 4 distinct programs on their associated channel;
1 station delivers 3 distinct programs over its channel;
3 stations deliver two distinct programs on each of their channels;
2* stations that deliver the same program (one HD and one SD stream) on their channel;
and 3 stations deliver only ONE program over their channel.
Does the FCC plan to make some of these broadcasters either offer additional programming, or force a station delivering a single program to vacate its channel and share one of the other channels? (I can just hear the 'wailing and gnashing of teeth' if some GM gets a letter that says he HAS to accommodate A COMPETITOR ON HIS CHANNEL.) Do they expect to move analog translators out of the UHF band and move stations around? I know the broadcasters will scream like hell if they have to spend any more money out of their pockets to make changes to their transmitters and antennas AGAIN. So, again, I must ask: where does the FCC expect to get this spectrum?
* One of these stations recently shut off its SD stream, and now joins the ranks of a single program per channel.
It's not quite that simple, because we also need to eventually stop using SD resolution for TV
Depending on whether those subchannels are broadcast in 480i, 720p, or 1080i, the number of subchannels available varies. Basically, you can put 6 480i channels in one 6 MHz channel, the bandwidth assigned to one DTV license. In that same space you can fit 1 720p + 4 480i, 1 1080i + 3 480i, or 2 720p (the only way to get 2 "HD" subchannels in 6 MHz with ATSC).
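That multiplex arithmetic can be sketched directly. The 19.39 Mbit/s figure is the real ATSC 1.0 payload rate; the per-format MPEG-2 bitrates below are ballpark assumptions chosen to illustrate the combinations described above, not standardized values:

```python
# Which subchannel mixes fit in one 6 MHz ATSC channel?
# ATSC_PAYLOAD_MBPS is the real ATSC 1.0 payload; the per-format
# bitrates are illustrative assumptions.

ATSC_PAYLOAD_MBPS = 19.39
TYPICAL_RATE_MBPS = {"1080i": 11.0, "720p": 9.0, "480i": 2.5}

def fits(subchannels):
    """True if the listed subchannels fit in one 6 MHz ATSC multiplex."""
    return sum(TYPICAL_RATE_MBPS[s] for s in subchannels) <= ATSC_PAYLOAD_MBPS

print(fits(["480i"] * 6))              # six SD subchannels
print(fits(["720p"] + ["480i"] * 4))   # 1 720p + 4 SD
print(fits(["1080i"] + ["480i"] * 3))  # 1 1080i + 3 SD
print(fits(["720p", "720p"]))          # the only two-HD combination
print(fits(["1080i", "1080i"]))        # two 1080i streams won't fit
```

In practice broadcasters also use statistical multiplexing, so the per-subchannel rates vary moment to moment, but the total payload ceiling is fixed.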
Getting the spectrum back...
Yea, this is a REALLY bad idea. Have any of these fools taken the time to look through the MOUNTAINS of responses from the AUDIO INDUSTRY about the analog-to-digital swap that BUTCHERED our available spectrum for wireless microphones?
I just attended a huge conference on lighting for the theatre and film industries. The newest and shiniest kit was ALL designed to use wireless data signaling for its various functions. That's all well and good... But what if this wireless-internet-for-all shite happens? All that kit may be useless.
I DO think I know how the fools at the FCC intend to 'recover' that spectrum which they sold for 1995 dollars ($$) and will now buy back for 2010 dollars ($$$$$$$$$$$): compression. If the different TV stations of a given region all broadcast from the same tower, the signals could all be sent on the same digital frequency, and the individual channels could be digitally compressed so they would all fit within the current 6 MHz band of a single TV channel.
That is a great idea... IF you don't give a rip about the quality of the picture, or that the individual stations are ALREADY putting multiple signals down an existing channel for 'extra services' like 15 languages of the same program.
This just proves that the FCC has no real idea of how the tech is being used... And they are simply being dazzled by the shiny money waved at them from the AT&T group that wants even more monopoly of the air. But listen to the howling from the public when they won't be able to hear their favorite show or even their preacher on Sunday b/c the wireless microphone no longer works. It's all been sold to AT&T for the internet.
But the bandwidth-sharing optimists run afoul of the physicists. According to the latter, there is a fundamental ceiling to the amount of data one can transmit on a given radio band, based on the minimum amount of energy needed to transmit a 1 or 0 and have it reliably picked up at the other end. And the physicists have the data to back up their theories.
I clearly remember when modem transmission speeds over telephone lines were considered to be limited by physics to about 2400 baud, based on the amount of data one could encode into an audio frequency of around 3 to 4 kilocycles per second, the maximum phone lines were designed to handle. Nevertheless, ways were discovered to increase this to 56,000 bps (using phase shifting I believe), even before new digital techniques were found to deliver megabits per second over the same wires (our current landline broadband). That required new equipment at the exchanges and between exchanges, but the wire between the exchange and the subscriber is often the same decades-old copper.
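The physics ceiling being invoked here is the Shannon-Hartley theorem, and it is worth seeing what it actually predicts for a voice channel. The bandwidth and SNR figures below are typical textbook assumptions, not measurements:

```python
import math

# Shannon-Hartley capacity of an idealized analogue phone channel:
# C = B * log2(1 + S/N). Bandwidth and SNR are textbook assumptions.

def shannon_capacity_bps(bandwidth_hz, snr_db):
    """Channel capacity in bit/s for the given bandwidth and SNR."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

# A ~3.1 kHz voice channel at ~35 dB SNR tops out in the mid-30-kbit/s
# range, which is why V.34 stopped at 33.6 kbit/s:
print(round(shannon_capacity_bps(3100, 35)))
```

So the theorem was never "wrong"; 2400 baud was a limit of the era's signal processing, not of the channel, and 56k modems sidestepped the analogue model entirely by exploiting the digital trunk.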
Oh my it was wrong once!
Physics must never be trusted again!
2400 baud is still about the rate you'd be able to push in the 3-4 kHz range with the signal processing capabilities available to consumer electronics of the time. Consider that ISDN could do 128 kbit/s during roughly the same period.
And since those days a fair amount of money has been poured into putting data over analogue waves of various sorts, and it is rather safe to say that no surprising huge advances in data per Hz are going to crop up. Also keep in mind that speed always costs more. Do you think they didn't know about the technology that goes into 802.11n back when they released the first drafts of 802.11b? They knew all about the physics, but there was no demand, and the gear would have been far more expensive, so they went with a slower connection because it made sense economically.
Actually, no it wasn't wrong
The physicists NEVER said "You can't get more than 2400 bps through a phoneline!". The engineers never said that either. However, what has been technically feasible within economic reason has always been a limit.
The limits of what you can and can't get through a phoneline are down to SNR and frequency response.
When the ISDN standard was written, it was decided that 8 bits per sample at 8 kHz was satisfactory to provide decent phonecall quality, reproducing tones up to the ~4 kHz ceiling experienced on typical analogue phonecalls (i.e. fulfilling the Nyquist criterion). They never said that the wire was the limiting factor.
The physicists have always known you can get more than 2400 bps through a phoneline, it's always been the technology at the nodes that have been the limiting factor.
No, 33,600 bps was the highest standard to run analog signaling. What made 56,000 bps possible was the fact that phone lines are not actually analog anymore, but in fact 8-bit @ 8 kHz (64 kbps) digital, with A-law or u-law sampling. If you know that's how the line works, you can take advantage of that to get more out of it. If you treat it as a purely unknown analog channel capable of up to 4 kHz sounds with about 8 bits' worth of sampling levels, then 33,600 bps is really pushing the limits.
They didn't break the Nyquist-Shannon limit; they simply realized that for modern phone lines the theorem didn't apply the same way anymore, since they could treat the line as digital rather than analog.
56,000 bps was of course downstream only, while 33,600 was both ways. It was the beginning of asymmetric internet access for consumers (not counting the 1200/75 bps modems of ancient history, or the HST modems with their odd speeds).
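The arithmetic behind those numbers is simple once you know the trunk format. The DS0 figures are real; the "7 usable bits" is the V.90 downstream scheme, which avoids the least significant bit because robbed-bit signaling on some trunks makes it unreliable:

```python
# The digital-trunk arithmetic behind the 56k trick: a DS0 voice channel
# carries 8-bit PCM samples at 8 kHz. V.90 uses 7 of those 8 bits per
# sample downstream, which is where the 56,000 bps figure comes from.

SAMPLE_RATE_HZ = 8000
BITS_PER_SAMPLE = 8
V90_USABLE_BITS = 7  # LSB avoided due to robbed-bit signaling

ds0_bps = SAMPLE_RATE_HZ * BITS_PER_SAMPLE        # 64,000 bps digital channel
v90_downstream_bps = SAMPLE_RATE_HZ * V90_USABLE_BITS  # 56,000 bps downstream

print(ds0_bps, v90_downstream_bps)
```

Upstream still had to cross a genuine analogue-to-digital conversion at the exchange, which is why it stayed at the 33.6 kbit/s analogue limit.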