802.11 ay?
So, we've already had 802.11a, ah, ai and now ay. I look forward to seeing 802.11aye, i, eye, eh, r, ar, arr, and argh.
Not trying to confuse, eh?
Forged nearly 20 years ago, the 802.11 wireless networking standard was responsible for cutting the cord and letting us roam. During that time, 802.11 has evolved as devices using it have both proliferated and got smaller – while the data they swallow has grown in quantity and in size. In March the IEEE OK’d the latest chapter …
"Just run an ethernet cable ffs!"
Because we don't want Ethernet cables trailing all over the house. We use HomePlug because the wifi in our area is slow due to so many people using it, plus hackers are less likely to try to intercept the signals coming off Ethernet carried along mains cables.
Anyway, no offence, but your hobby is a bit niche and you do have quite a number of bands to choose from which won't be affected, so sorry if the 21st century is impinging on it, but that's just kind of tough luck really. We all have to make sacrifices for progress at some point. I'd love to see a blue sky and silence once in a while rather than the contrail-polluted mess and jet engine noise over where I live, but it's not going to happen any time soon.
To all the ham acts who modded me down – just bear in mind that plenty of ham enthusiasts in the past didn't give a rat's backside about the interference they caused to old analogue TVs, radios and VCRs when running high power. They were working within the law, so "tough luck, put up with it", etc. etc., was their usual response.
Well, what goes around comes around, and if they now have to put up with a few milliwatts of leakage from next door's wiring when using AM on shortwave, then too damned bad.
> To all the ham acts who modded me down - just bear in mind that plenty of ham enthusiasts in the past didn't give a rat's backside about the interference they caused to old analogue TVs, radios and VCRs when running high power.
I'm not a ham (but I do know a couple). Your attitude is like someone who wants to hear their radio at the end of the garden, so turns up the sound system in the house to 11 – and bugger the neighbours who might be trying to enjoy their own garden and perhaps listen to Gardeners' Question Time (quietly) on the portable radio.
> They were working within the law so tough luck, put up with it etc etc was their usual response.
Actually, if that had been their response, they would not have got much support from other hams. The usual response would have been "your equipment is at fault, take it up with whoever you bought it from" – which you probably misinterpreted as "I don't care". Proper hams do care, but they are not responsible for people buying cheap sh*t that's not fit for purpose.
The reason I say that is because in almost all cases it is the equipment at fault, for having cheap and nasty tuners that are swamped by radio signals any competent design would cope with. In the era I suspect you are talking about, the "good" designs were suddenly being outsold by cheap new tat from Taiwan and China, with tuner front ends designed by cretins which took a wideband signal – including the (not very close, in frequency terms) ham bands – and amplified it along with the TV signal. Result: the tuner is swamped and doesn't work – something a competent designer with a clue would have avoided.
There are a few cases where the ham's equipment has a problem. None of them that I know would have any issue being told about that and doing something to fix it. But that's not the same as their legal use interfering with your faulty equipment.
And BTW – it's not just hams that are bothered by these blatantly illegal devices. Everyone (except the poor users duped into buying them) knows they are outright illegal – but those who should be doing something have put an awful lot of energy into trying to pass this hot potato onto someone else. They are not, and for fundamental reasons of physics cannot be, legal. They only "pass" the tests they have to pass by deliberately fudging them – a bit like passing a noise test for a car or motorbike with no exhaust silencer by testing it with the engine off!
The BBC are bothered, as they interfere with broadcast radio – but that's OK, why should you care if people round about can't get Radio 4? The aviation authorities are bothered – but I assume you don't fly, so aren't bothered about flight safety. People using ADSL should be bothered – it's known to interfere with that. I could go on, but I suspect you just don't care – you can't see why turning the in-house radio up to 11 so you can hear it in the garden is in any way wrong.
@Simon hobson
"You attitude is like someone who wants to hear their radio at the end of the garden"
Err, no, that would be the hams running a couple of kW through antennas the size of their back gardens. I'm someone who's simply using a legal, low-power networking technology.
"he reason I say that is because in almost all cases, it is the equipment at fault for having cheap and nasty tuners that are swamped by radio signals that any competent design would cope with"
Oh riiiiight, so it's not the fault of the hams causing interference, it's the fault of the people who bought cheap equipment that wasn't cantankerous-old-man-with-a-transceiver proof! Got it! These plebs should all be rich electronics experts who can shell out for Meridian kit or sort out their own RF front ends! But of course!
Can you actually hear yourself? Do you know how stupid, arrogant and out of touch you sound?
"And BTW - it's not just hams that are bothered by these blatantly illegal devices. "
Err, no, they're not illegal, however much you'd like otherwise; they all have CE certification and are legal for sale and use in the UK.
"The BBC are bothered as they interfere with broadcast radio - but that's OK, why should you bother if people round about can't get radio 4."
Oh, give it a rest with the hysterics. I've got an FM radio six foot from mine and have never had a problem.
To sum up – wall warts aren't going away. Ham radio is a hobby; your mates' radios are just big, expensive toys. Sorry if modern life is causing a *tiny* amount of interference, but unfortunately toys have to take a back seat to more adult requirements in life.
Originally 802.11a and 802.11b used different modulation schemes. 802.11a used OFDM, which offered the promise of higher bandwidth, but the trade-off was that it used more of the band (it uses a number of parallel carriers). Once the FCC allowed OFDM to be used at 2.4GHz, that became 802.11g.
There is a notion that you don't get something for nothing in this world. This is especially true of WiFi. The cartons containing 802.11g gizmos advertised "54Mbit/s", but this was the coding rate for packet data, conveniently ignoring the packet preamble (coded at 6Mbit/s) and other overhead – because however you slice and/or dice it, the symbol rate never changed; all that changed was the amount of data packed into each symbol. With shorter data frames, the preamble/payload ratio got to the point where returns were more than diminishing, so – surprise! – that 54Mbit/s kit would only do 26Mbit/s, and that was on a good day with only one or two users on the access point.
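Back-of-the-envelope, the overhead argument looks like this. The timing constants below are rough illustrations of fixed per-frame costs (preamble, interframe gaps, ACK), not exact 802.11g figures, and contention/backoff – which make real life worse still – are ignored:

```python
# Why "54Mbit/s" kit never delivers 54Mbit/s: fixed per-frame overhead.
# Timing numbers are illustrative approximations, not spec-exact values.

PREAMBLE_US = 20.0                     # OFDM preamble + SIGNAL field, sent at base rate
SIFS_US     = 10.0                     # short interframe space before the ACK
ACK_US      = 20.0 + 14 * 8 / 24.0     # ACK preamble + 14-byte ACK at 24 Mbit/s
DIFS_US     = 28.0                     # gap before the next frame may start

def goodput_mbps(payload_bytes, rate_mbps=54.0):
    """Effective throughput once the fixed overheads are paid per frame."""
    data_us = payload_bytes * 8 / rate_mbps
    total_us = DIFS_US + PREAMBLE_US + data_us + SIFS_US + ACK_US
    return payload_bytes * 8 / total_us   # bits per microsecond == Mbit/s

for size in (256, 1500):
    print(f"{size:5d}-byte frames: ~{goodput_mbps(size):.1f} Mbit/s")
```

Even with full-size frames and zero contention, the fixed costs eat a big slice of the headline rate; with short frames the overhead dominates, which is exactly the "more than diminishing returns" effect described above.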
The push to higher speeds means increasing the coding density further, which means needing a better signal-to-noise ratio. That has to come from somewhere, and the 'somewhere' is starting to show up as phenomena like people flying model aircraft in what they thought was uncluttered spectrum finding their planes suddenly crashing for no reason at all.
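The SNR cost of denser coding can be seen from the Shannon bound (a theoretical limit, not a Wi-Fi spec number): the minimum SNR needed to hit a target spectral efficiency grows exponentially with that target.

```python
import math

# Minimum SNR (Shannon bound) to achieve a given spectral efficiency.
# Illustrative only: real modulation/coding schemes need a few dB more.

def snr_db_needed(bits_per_hz):
    snr = 2 ** bits_per_hz - 1          # invert C/B = log2(1 + SNR)
    return 10 * math.log10(snr)

for bph in (1, 2, 4, 6, 8):
    print(f"{bph} bit/s/Hz needs at least {snr_db_needed(bph):5.1f} dB SNR")
```

Each extra bit per symbol costs several more dB of link quality, which is why squeezing more speed out of the same band leans so hard on clean spectrum.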
WiFi is an interesting and complex subject. It's also political. We've had innumerable users crammed into a useless slice of spectrum – the ISM band – because that band was deemed useless (it's absorbed by water, hence its use by microwave ovens). We desperately need the politicians to free up more spectrum. WiFi has shown that even the useless can be turned into gold; it's proved its worth beyond anyone's imagination, so now it's time to liberate the technology.
Does wifi, in its n and ac versions, still allow a single weak user to hog the band for ages?
One thing I liked about WiMAX was that it allocated a limited timeslot to each user wishing to transfer data, and if the signal was so weak the user only got a single packet through, then so be it – on to the next person in line to use the band.
As for spectrum, I'm not sure freeing up more will help much with congestion; people will just find more crap to remove wires from.
AT&T still sell/rent you 802.11g kit. And since they use the router for network authentication, you can't BYOD. Then they disable bridging, so you are truly stuck with double-NAT or 802.11g. But they have no problem selling you 20Mbit internet. Unless you live in the middle of Kansas, there is no way you can get 20Mbit over 802.11g.
"Does wifi still in n, and ac versions, allow a single weak user to hog the band for ages?"
Yup! 802.11 had a "Point Coordination Function" (PCF) option, under which the access point essentially assigned timeslots to clients, but it was virtually never implemented and I've never heard of an AP or client that supported it. Channel access is still down to the AP and clients "playing nice" with 802.11n for sure, and ac as well as far as I know. I think an option where the AP has tighter control over channel access would be really good for heavily used access points, but there doesn't seem to be one.
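For the curious, the PCF idea boils down to AP-driven polling: during the contention-free period the AP polls stations one at a time, so a slow station gets at most one frame per poll instead of squatting on the channel. A toy sketch (station names and queue contents invented for illustration; real PCF is far more involved):

```python
from collections import deque

# Toy contention-free period: the AP polls stations round-robin and each
# delivers at most one queued frame per poll. No station can hog the air.

def contention_free_period(stations):
    """stations: {name: deque of queued frames}; yields (name, frame) per poll."""
    order = deque(stations)
    while any(stations.values()):
        name = order[0]
        order.rotate(-1)                 # advance the round-robin polling order
        if stations[name]:
            yield name, stations[name].popleft()

queues = {"fast": deque(["f1", "f2"]), "slow": deque(["s1", "s2", "s3"])}
for station, frame in contention_free_period(queues):
    print(station, frame)
```

Contrast this with the DCF free-for-all actually shipped in consumer kit, where a station that wins the channel keeps it for the (possibly very slow) duration of its frame.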
"What's your objection to double-NAT? Just asking."
The problem I've seen, at least with the DSL modems CenturyLink has for rent, is that even though I would have thought NAT had been a solved problem for decades, each and every one of these DSL modem models has repeated reports of randomly locking up and crashing until people disable NAT and turn the DSL modem into a dumb modem – i.e. run it into a pfSense box or DD-WRT access point or something and have *it* do the DSL login/password and so on. Or (with ADSL2) buy an aftermarket modem. Unfortunately with VDSL2, the only two companies using it with the North American banding are AT&T and CenturyLink, so the only available DSL modems are used ones "for" AT&T or CenturyLink, unlike ADSL2 where there are dozens of aftermarket models available... although in CenturyLink's case, the VDSL2 modems apparently also support a "dumb DSL modem" passthrough mode.
The problem with this on AT&T is that they have evilly replaced the standard DSL login/password with some kind of key-based authentication using keys they burn into the DSL modem firmware, so no aftermarket DSL modem will work... and even if the DSL modem *does* support a passthrough "dumb DSL modem" mode, it won't work, due to the non-standard authentication.
To be fair, there are vendors out there trying to mitigate the effect of poorly connected (or old-standard) clients hogging airtime. Mostly referred to as 'airtime fairness', in my experience it can work quite well (depending on the vendor...).
It used to be the preserve of enterprise-grade managed wifi (where I spend most of my working life); I now notice from YouTube that both DrayTek and Netgear are offering the function.
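A crude model of why this matters: compare aggregate goodput when a fast and a slow client simply alternate frames (so the slow one hogs the air) against giving each an equal share of airtime. Rates and frame size below are invented for illustration:

```python
# Airtime fairness vs. per-frame fairness, with one 54 Mbit/s client and
# one 1 Mbit/s straggler. Illustrative numbers; overheads ignored.

FRAME_BYTES = 1500

def frame_fair(rates_mbps, seconds=1.0):
    """Clients alternate frames; each frame occupies the air for bytes/rate."""
    cycle_s = sum(FRAME_BYTES * 8 / (r * 1e6) for r in rates_mbps)
    cycles = seconds / cycle_s
    return cycles * FRAME_BYTES * len(rates_mbps)   # total bytes moved

def airtime_fair(rates_mbps, seconds=1.0):
    """Each client gets an equal slice of the second, at its own rate."""
    share = seconds / len(rates_mbps)
    return sum(r * 1e6 * share / 8 for r in rates_mbps)

clients = [54, 1]
print(f"frame-fair  : {frame_fair(clients) * 8 / 1e6:5.1f} Mbit moved per second")
print(f"airtime-fair: {airtime_fair(clients) * 8 / 1e6:5.1f} Mbit moved per second")
```

Under per-frame fairness the slow client's long transmissions dominate the channel and drag everyone down towards its rate; under airtime fairness the slow client gets fewer bytes, not more seconds, and the fast client stays fast.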
Can someone explain to me how MU-MIMO "will solve the poor phone performance problem"? As I see it, the phone is still stuck using only one antenna, although the AP will now better utilise its multiple antennas, allowing it to communicate concurrently with devices other than the phone.
However, the phone itself will still perform as poorly as it would have done without the AP having MU-MIMO, so how exactly does MU-MIMO "solve the poor phone performance problem"? It doesn't do anything of the sort, at least not based on the description in this article... unless the article meant to say the phone is a performance lead weight on AP throughput?
Yep, generally they will be just as shit as they are now, because they still have a single rubbish antenna surrounded by lots of other stuff.
The writer says "In our three-antenna scenario, it would allow the access point to use each antenna for a different device, or, perhaps one antenna for a phone and two antennas for a laptop." This is fundamentally incorrect when it comes to MU-MIMO. Think it through. In MU-MIMO the access point wishes to send different data to different devices at the same time. But in a typical access point all the antennas are omnidirectional and mounted in close proximity, so different signals sent from different antennas will normally interfere at the receiver. What is required to deliver MU-MIMO is beamforming – where the access point sends each stream from multiple antennas in such a way as to create a zone of constructive interference (for the wanted signal) and destructive interference (for the unwanted signal) at the receiving antenna. You need a minimum of two antennas to beamform. So an access point intended to serve two single-antenna phones with MU-MIMO will need a minimum of four antennas, not three.
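The precoding step can be sketched numerically. Below is a minimal zero-forcing example in plain Python – a four-antenna AP sending one symbol to each of two single-antenna phones at once. The channel matrix H is invented for illustration (a real AP estimates it via channel sounding, and practical precoders are more sophisticated):

```python
# Toy zero-forcing precoder: choose transmit weights W so that H @ W is
# the identity, i.e. each phone's antenna sees only its own stream and
# the cross-talk cancels. Channel matrix H is made up for illustration.

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def conj_T(A):  # conjugate transpose
    return [[A[r][c].conjugate() for r in range(len(A))] for c in range(len(A[0]))]

def inv2(M):    # inverse of a 2x2 complex matrix
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

H = [[1 + 2j, 0.5 - 1j, -1 + 0j, 2 + 1j],     # phone 1's channel from 4 antennas
     [0 - 1j, 1 + 1j, 2 - 0.5j, -0.5 + 2j]]   # phone 2's channel from 4 antennas

W = matmul(conj_T(H), inv2(matmul(H, conj_T(H))))  # zero-forcing: H @ W == I
s = [[1 + 0j], [-1 + 0j]]                          # one symbol per phone
x = matmul(W, s)                                   # what the 4 antennas transmit
y = matmul(H, x)                                   # what each phone receives

for i, (rx,) in enumerate(y, 1):
    print(f"phone {i} receives {rx.real:+.3f}{rx.imag:+.3f}j")
```

Each phone recovers its own symbol with the other stream nulled out at its antenna – the "destructive interference for the unwanted signal" described above, expressed as linear algebra.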
In the real world we will have additional challenges to deliver the theoretical benefits of MU-MIMO:
* Any residual unwanted signal will reduce the signal-to-noise ratio and hence lower the achievable data rate
* The access point will need to match up devices which have the necessary physical separation to obtain the benefits of beamforming
* The access point will need to match up devices that have simultaneous traffic flows
* MU-MIMO only works on the downlink (from access point to device)
* By splitting the transmit energy between multiple destinations you are effectively reducing the signal-to-noise ratio at individual devices. Again this will lower the achievable data rate
So MU-MIMO is probably only going to provide significant improvements when (a) we have access points supporting a large number of antennas and (b) the access point is supporting a large number of simultaneous devices.
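Putting rough numbers on the power-split point: using the Shannon bound with an invented 20 dB link, halving per-user power costs each user about 3 dB (and hence rate), yet the two parallel streams can still beat one stream in aggregate – which is the whole bet MU-MIMO is making:

```python
import math

# Shannon-bound sketch of the transmit power split. The 20 dB link budget
# is made up for illustration; real systems sit below this bound.

def rate_bits_per_hz(snr_db):
    return math.log2(1 + 10 ** (snr_db / 10))

single = rate_bits_per_hz(20)           # all transmit power to one device
split  = 2 * rate_bits_per_hz(20 - 3)   # ~3 dB less per user, two parallel streams

print(f"one stream : {single:.2f} bit/s/Hz")
print(f"two streams: {split:.2f} bit/s/Hz aggregate")
```

Per-user rate drops, aggregate rate rises – but only if the beamforming actually nulls the cross-talk; any residual interference eats into that margin, as the bullet points above note.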
"Other extensions coming down the pipe include 802.11mc, which will enhance device triangulation indoors between wireless access points, enabling precision indoor location tracking. Quite what that’ll do to technologies like Apple’s iBeacon remains to be seen."
Marauder's map! But, better (if done well) is precision positioning for your Roomba.