All well and good.
But what will happen to the latency? How much time needs to pass before a smaller frame is sent out?
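One way to see how that latency could be bounded: a toy sketch of aggregation with a max-wait timer. This is my own assumed model for illustration, not the researchers' actual scheme, and all the numbers are made up:

```python
# Toy model of frame aggregation with a latency bound.
# MAX_AGGREGATE and MAX_WAIT_MS are assumptions, not figures from the paper.

MAX_AGGREGATE = 8    # frames collected into one burst (assumed)
MAX_WAIT_MS = 5.0    # timer that caps how long a frame can sit queued (assumed)

def worst_case_added_latency_ms(arrival_gap_ms: float) -> float:
    """Extra delay seen by the first buffered frame: either the buffer
    fills with later arrivals, or the timer fires, whichever is sooner."""
    time_to_fill = (MAX_AGGREGATE - 1) * arrival_gap_ms
    return min(time_to_fill, MAX_WAIT_MS)

# Busy network: frames arrive 0.5 ms apart, so the buffer fills in 3.5 ms.
print(worst_case_added_latency_ms(0.5))    # 3.5
# Quiet network: the next frame is 100 ms away, so the timer caps the wait.
print(worst_case_added_latency_ms(100.0))  # 5.0
```

The point being that a scheme like this only trades latency for energy up to whatever bound the timer sets; a quiet network pays the full timer value, a busy one much less.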
A group of researchers from Greece's University of Thessaly and the Centre for Research and Technology Hellas believe there's scope for energy-consumption reductions of as much as 75 percent if 802.11n's energy-saving extensions are combined with frame aggregation techniques. The reason this is important? Because, as wireless …
If so, large packets won't help. Turning on a lightbulb needs 1 bit; perhaps 8 bits for dimming.
The majority of monitoring applications only need a tiny amount of data. What matters, though, is latency. You don't want to flick a switch and wait 10 seconds while a single useful bit makes its way through the NSA and friends.
In the main, yes, which raises the question: whilst the use of WiFi is reasonable for a proof of concept, its use in the field is largely inappropriate. This is for two main reasons.

Firstly, the one you mention: many of these new applications don't require anything more than a very low-speed (sub-9.6kbps) serial connection, and something designed to carry ~1.5KB packets at 600Mbps isn't going to be a very efficient carrier.

Secondly, WiFi, particularly 'n', is designed to transmit up to 70 metres indoors (approximately 250 metres outdoors). In the typical home, I suggest the ability to transmit up to 20 metres is more than sufficient for many of these devices.

There is a third reason: a WiFi device can't be passive like an RFID tag. It needs to consume power to maintain a listening process and to transmit an "I'm alive" beacon back to the AP to maintain its live status in the network.

We should also remember the research work pre-dates widespread IPv6 deployment, which will increase the per-packet overheads and hence is likely to further increase power consumption.
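A back-of-the-envelope illustration of the overhead point (the header byte counts below are rough assumed round numbers, not measured values):

```python
# Fraction of on-air bytes that are actual payload when a tiny command
# rides in a full WiFi/IP frame. Overhead figures are assumptions.

MAC_LLC_OVERHEAD = 36      # 802.11 MAC header + LLC/SNAP, assumed
IPV4_UDP_OVERHEAD = 28     # IPv4 (20 B) + UDP (8 B) headers
IPV6_UDP_OVERHEAD = 48     # IPv6 (40 B) + UDP (8 B) headers

def payload_efficiency(payload_bytes: int, overhead_bytes: int) -> float:
    """Share of transmitted bytes that carry the application's data."""
    return payload_bytes / (payload_bytes + overhead_bytes)

one_byte = 1  # a single "switch on" command
print(f"{payload_efficiency(one_byte, MAC_LLC_OVERHEAD + IPV4_UDP_OVERHEAD):.1%}")  # ~1.5%
print(f"{payload_efficiency(one_byte, MAC_LLC_OVERHEAD + IPV6_UDP_OVERHEAD):.1%}")  # ~1.2%
```

In other words, well over 98 percent of what goes over the air for a one-byte command is framing, and IPv6's larger fixed header only pushes that further.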
Quote
As the number of end user devices skyrockets, so will the amount of energy consumed in the access network to serve them.
Do we really have to connect everything up?
It is not rocket science to say that the more devices you have connected, the more power you use.
Then the inverse square law comes in with regard to the distance and transmit power of connections.
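For illustration, the free-space inverse-square arithmetic (idealised; real indoor propagation decays faster than this). The 70 m and 20 m figures echo the ranges mentioned earlier in this thread:

```python
# Free-space inverse square law: for the same received signal strength,
# required transmit power scales with the square of the distance.

def relative_tx_power(new_dist_m: float, old_dist_m: float) -> float:
    """Transmit power needed at new_dist_m relative to old_dist_m (free space)."""
    return (new_dist_m / old_dist_m) ** 2

ratio = relative_tx_power(20, 70)   # 20 m home vs 70 m WiFi indoor range
print(f"{ratio:.3f}")               # 0.082 -> roughly 12x less transmit power
```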
Pah,
We really don't have to connect everything up do we?
There is an OFF switch, use it.
@ Steve Davies
Try telling that to my young neighbours. We share a wireless network so if things slow down I can check who has wireless turned on.
Answer: all of them: phones, pads, laptops. Most stay on even when they're out of the house.
When I've visited, the TV is on as background. When they went on hols I had the keys to water the plants, and went from room to room unplugging idle power-adapter cubes.
Is it just the immediate postwar gen like myself who had economy drummed into us by parents who'd experienced rationing? All I know is that my phone goes into flight mode after midnight, and I only turn its wireless on, or my computer on, as needed.
Some of us in the generation after had it drummed in to us as well; my father's proudest moment was visiting me after I'd moved out of home, whereupon to show him around I had to move a light globe from the kitchen to the laundry and back. I had spares, but found it easier to encourage my housemates not to waste electricity by not providing the means.
The design philosophy behind BLE is to keep airtime down to a minimum, so this kind of change would have little to no effect on it. As it stands a Bluetooth 4 LE module will run for more than a year from the energy in a button cell, so I doubt there's much need to improve on that.
No mention by Richard of .11n's successor, which, among other things, substantially reduces device-side power requirements.
The IT industry doesn't do the looking-back thing well, so there's unlikely to be much effort put into making .11n more efficient when Cisco et al can sell you shiny new .11ac things.