You say hectar, I say hectare - let's call the whole thing off
That is all.
It's ten days until the FCC will tell us the hows and whos of white space spectrum, but Microsoft has already switched on its campus-wide white-space network and is expecting great things. The Microsoft network was demonstrated a month ago, as part of the company's propaganda war to convince the FCC to allocate the white …
A lot of spectrum regulation dates from a long time ago, back in technical pre-history. Back then radio transmitters broadcast a continuous signal on a fixed frequency and you tuned your receiver to get the broadcast. Regulation of transmitters was designed to stop transmitters from interfering with each other. (In the UK at least, the regulations also seemed to be designed to make it difficult for spies to communicate secretly with the Central Powers....)
This type of regulation, like the technology it's designed to manage, is very wasteful of resources. Modern radios are so different from those early transmitters that it is going to take some time for the regulatory bodies to catch up. I'm not a MSFT fanboy, but I do appreciate their efforts to push things along. They were, after all, primarily responsible for making WiFi usable, and they did this by deploying the technology campus-wide (dragging the various IEEE groups along in the process).
The role of national regulators like the FCC, Ofcom, Bundesnetzagentur etc should be to use the white frequencies for economic broadband internet access.
This requires lots of nitty-gritty regulation and planning, but that is exactly what I expect from thousands of government employees living off tax.
Directed WLAN links (using Yagi antennas), and also free-space optical links, alone could help bring fast internet (5 Mbit/s on average) to 100% of the population. Careful planning of wireless links is much cheaper than digging up the road.
10Mbit optical over 1400 meters (just under a mile):
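As a rough illustration of the link planning mentioned above, here is a minimal free-space path-loss sketch using the standard Friis formula. The transmit power, antenna gains, distance and frequency below are made-up example numbers, not figures from the comment:

```python
import math

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss in dB: 20*log10(d_km) + 20*log10(f_MHz) + 32.45."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.45

def rx_power_dbm(tx_dbm: float, tx_gain_dbi: float, rx_gain_dbi: float,
                 distance_km: float, freq_mhz: float) -> float:
    """Simple link budget: transmit power plus antenna gains minus path loss."""
    return tx_dbm + tx_gain_dbi + rx_gain_dbi - fspl_db(distance_km, freq_mhz)

# Hypothetical point-to-point link: 20 dBm transmitter, a 15 dBi Yagi at
# each end, 10 km at 2400 MHz.
print(f"path loss: {fspl_db(10, 2400):.1f} dB")                     # ≈ 120.1 dB
print(f"received:  {rx_power_dbm(20, 15, 15, 10, 2400):.1f} dBm")   # ≈ -70.1 dBm
```

A received level around -70 dBm is comfortably workable for a low-rate link, which is why careful antenna pointing beats raw transmit power for rural coverage.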
Remember: the great Shannon figured

BR = BW · log2(1 + SNR)

with BR = bit rate [bit/s], SNR = signal-to-noise ratio (linear), BW = bandwidth [Hz].

So if one could use just 40 MHz of "whitespace" and had an SNR of 3 (that's not much), you'd have a channel with 80 Mbit/s capacity. 40 MHz is just a small fraction of the terrestrial spectrum from 1 MHz to about 2000 MHz.
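The back-of-the-envelope figure above follows directly from the Shannon-Hartley formula; a tiny sketch with the same illustrative numbers:

```python
import math

def shannon_capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley channel capacity: C = BW * log2(1 + SNR)."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# 40 MHz of white space with a linear SNR of 3 (about 4.8 dB):
capacity = shannon_capacity_bps(40e6, 3)
print(f"{capacity / 1e6:.0f} Mbit/s")  # 80 Mbit/s
```

Note this is the theoretical ceiling; real modems with coding overhead and fading land well below it.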
Of course there must be a good architecture for different types of links/channels and differently populated areas, much as is the case with water, sewage and electrical grids.
Radiated power could be pretty high in sparsely populated areas, but must be quite low in more densely populated regions. Standards for long-distance links must be drawn up. All of that must be simulated using geomodels and registered in a proper database.
Also, signalling provisions should mandate some sort of caching and broadcasting of otherwise dedicated links. If you download a youtube video over the air somebody else might be interested in caching that to save bandwidth in the future. Storage is now dirt-cheap, but network access is still a huge problem in large portions of populated areas.
All of that is somewhat similar to mobile phone networks, but also somewhat different, as long-haul, microwave and optical free-space links would be involved.
I hope that the FCC is aggressive in terms of limiting what they consider "white space." I have to pull in channels from ~75 miles away; for several of the channels I get, I am considered a little outside the service area. The last thing I need is some dick Microsoft product firing up over my TV broadcasts and killing reception.
"This type of regulation, like the technology it's designed to manage, is very wasteful of resources. Modern radios are so different from those early transmitters that it is going to take some time for the regulatory bodies to catch up."
It's not that wasteful in this instance; it's there to avoid harmful interference. And, in fact, modern radios are different, but RF propagation, interference, and so on are still just as big a problem now as they ever have been. The FCC gave these devices a chance, several times, to detect signals and avoid interfering with them, and they completely failed. Microsoft's excuse was that the units they provided were faulty (twice!). Well, so what? If a unit that fails is going to blast out interference every time, that's a big problem. The fact of the matter is, detect-and-avoid would not work: the device could fail to detect signals because of a local dead spot, yet still produce plenty of signal to interfere with the neighbors' reception of that signal.
Frankly, there are easy solutions to this. There's a large block of the 700 MHz band unused; there were auction terms where a company would get some spectrum for its own use if and only if it built out a public safety network with the rest of the spectrum, but the build-out terms were pretty untenable, so there were zero bids. In addition, the gov't has 225-420 MHz blocked off for basically military use, and reportedly an awful lot of this is NEVER used. That's almost 200 MHz!
This sounds like the same old story: technology pushing forward with engineers being creative, while the standards committees spend a year and a day shuffling their papers, agreeing on what type of coffee to have at the meeting, and deciding what letterhead to use for the invites to delegates.
I say use the traditional USA cowboy mentality to get the whole thing started, then rein it in once it's established. Starting off with a standards-focused approach usually stifles development. Look at WiMAX, 3G and other mobile technologies, and the massive delay in HD video and broadcasting: it took so long to agree on standards that the technology was nearly obsolete before it was implemented, and consumers will probably stop caring.
Biting the hand that feeds IT © 1998–2020