The lack of radio spectrum is probably the most debated subject amongst the delegates at the ITU's annual talking shop, with the military, emergency services and broadcast TV all in the firing line as punters want more data. Mobile data consumption is rising, and the popular airwaves are getting full, but strategies for …
There's a reason for spare capacity in the emergency services
and I strongly suspect that this applies to other users such as the military, utilities, etc.
Routine radio traffic on routine days is at a routine level.
If there's a major incident it goes up at a huge rate, and to be frank the current systems struggle (and fail) to cope even with very busy periods that aren't especially large emergencies. I was on Airwave (OK, I understand that spectrum in this case isn't a direct relationship) in London for New Year's Eve, and the system was regularly overloaded, to the extent that we went silent for up to ninety minutes at a time. There was more than one prolonged outage. I've heard an interesting rumour as to why, but don't know the truth of that particular one.
Not everything can be done with the data messages the sets are capable of, and given that our backup is in effect the GSM system, which was, as usual for this event, also overloaded, it's all a bit creaky.
I can recall the days when everything was analogue, and can confirm that exactly the same problem applied there; for those users still on such sets, the same problems persist.
There comes a point where you have to decide that some "resources" should be ring-fenced from commercial considerations in the public interest.
There is no shortage.
Basically there _is_ only a limited amount of spectrum, but what the doomsayers don't like to think about is that capacity scales as spectrum/((cell size)**2). I don't mean only cell phones - a cell is just the range of a radio. So if you have a cell that's 1 km across, shrinking it to 100 m not only cuts transmission power by roughly 99% (assuming inverse-square propagation), it also increases capacity 100 times over through frequency reuse. In fact, in the tall-tower parts of cities the scaling approaches (cell size) cubed, as you can stack cells vertically.
If you make devices smarter, so they only talk to the closest device and exploit the multipath reflections that make 802.11n work, etc. - then as you add devices, bandwidth naturally goes up, power required goes down, and everyone is happy.
The trick is the smart devices. But 'they' are getting closer on that front. It will soon be really, really fast everywhere, then faster still as more devices pile on.
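The scaling argument in this comment can be sketched numerically. This is a toy model under the commenter's own assumptions (fixed total spectrum, full frequency reuse in every cell, inverse-square path loss); the function names and numbers are illustrative, not drawn from any real deployment:

```python
# Toy model of the cell-splitting argument above.
# Assumptions (hypothetical, for illustration only): fixed total spectrum,
# every cell reuses the full allocation, and free-space (inverse-square)
# path loss governs transmit power.

def capacity_gain(old_cell_m: float, new_cell_m: float) -> float:
    """Factor by which area capacity grows when cell diameter shrinks.

    Capacity per unit area ~ spectrum / (cell size)^2, so halving the
    cell size quadruples capacity over the same area.
    """
    return (old_cell_m / new_cell_m) ** 2

def power_saving(old_cell_m: float, new_cell_m: float,
                 path_loss_exponent: float = 2.0) -> float:
    """Fraction of per-link transmit power saved by shrinking the cell.

    With exponent 2 (free space), a 10x shorter link needs 100x less
    power; urban environments often have higher exponents (3-4),
    which saves even more.
    """
    return 1.0 - (new_cell_m / old_cell_m) ** path_loss_exponent

# The comment's example: 1 km cells shrunk to 100 m cells.
gain = capacity_gain(1000, 100)
saving = power_saving(1000, 100)

print(f"capacity gain: {gain:.0f}x")   # 100x
print(f"power saved:   {saving:.0%}")  # 99%
```

The "cubed" remark for tall-tower districts would correspond to adding a third (vertical) reuse dimension to `capacity_gain`; the quadratic version above is the flat, single-layer case.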
Economic incentives for backhaul providers.
Sharing your own WiFi and getting access to others' only takes you so far in this direction (e.g. with Fon), as the provision and use of your own and others' bandwidth allocation isn't a binary one-or-zero. Smaller cells require as many cell providers as possible, but people will only participate optimally if the rewards for providing backhaul are tailored to the frequency, volume, quality and extent of provision.
well said ac
the septics learnt this lesson (for the nth time) over Katrina.
You reserve space for the emergency services; otherwise, when the excrement hits the rotator, the emergency services are also screwed. And while you're at it, you ensure your service contract for said emergency services covers everything 24/7, with a decently quick guaranteed response time.
Chuffing stupid politicians.