So what's the difference between this and Zigbee?
Are they different layers of the same underlying standard?
Zigbee is a networking standard that defines all of the layers. 802.15.4e is a wireless standard that only defines the lower two layers; you could run TCP/IP over it to create your network.
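To make the layering concrete, here's a minimal sketch (demonstrated over IPv6 loopback; the `lowpan0`-style interface mentioned in the comments is an assumption about a typical Linux deployment, not anything from the article): once an adaptation layer like 6LoWPAN puts IP onto 802.15.4 frames, the application just uses an ordinary socket and never sees the MAC/PHY below.

```python
import socket

# With 6LoWPAN adapting IPv6 onto 802.15.4 frames, the application sees
# an ordinary IPv6 socket; the 802.15.4 MAC/PHY layers below are invisible.
# (Shown over loopback here; on a real node the kernel would route this
# out a lowpan0-style interface instead -- that part is an assumption.)
rx = socket.socket(socket.AF_INET6, socket.SOCK_DGRAM)
rx.bind(("::1", 0))              # receiver on an ephemeral port
port = rx.getsockname()[1]

tx = socket.socket(socket.AF_INET6, socket.SOCK_DGRAM)
tx.sendto(b"sensor reading", ("::1", port))

data, _ = rx.recvfrom(1024)      # the payload arrives unchanged
rx.close()
tx.close()
```

The point is that Zigbee replaces everything above the radio with its own stack, whereas with bare 802.15.4 you pick the upper layers yourself.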
Thanks. That clarification was what the article was missing.
That's about a 20 µA current draw.
Of course the questions (as always) are: a) how many updates per second (minute, hour, day, week, year) do you need, and b) how much data is going to be transferred.
"Give me as much as possible as often as possible" is not helpful
But let's see what they can do.
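Those two questions drive the duty cycle, which in turn dominates the average draw. A back-of-the-envelope sketch (all the currents, timings, and the 225 mAh coin-cell capacity below are illustrative assumptions, not figures from the article):

```python
def average_current_ua(sleep_ua, active_ma, active_ms, interval_s):
    """Average current for a node that wakes every interval_s seconds,
    transmits for active_ms milliseconds, and sleeps the rest."""
    duty = (active_ms / 1000.0) / interval_s
    return sleep_ua + active_ma * 1000.0 * duty  # active_ma -> uA

# Hypothetical node: 1 uA asleep, 10 mA while transmitting for 5 ms.
for interval in (1, 60, 3600):  # one update per second / minute / hour
    avg_ua = average_current_ua(1.0, 10.0, 5.0, interval)
    life_h = 225.0 / (avg_ua / 1000.0)  # 225 mAh coin cell
    print(f"every {interval:>4}s: {avg_ua:6.2f} uA avg, ~{life_h / 24:.0f} days")
```

Going from one update per second to one per minute cuts the average draw by more than an order of magnitude, which is exactly why "as much as possible as often as possible" is the wrong requirement.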
Nice overview for non-networky types.
"the low-rate communications in sensor networks run between 20 Kbps and 400 Kbps"
The control systems and sensor networks I was involved with in the '80s did remarkably well with 9.6 kbps serial communications. Likewise many geographically distributed sensor networks use 9.6 kbps over GSM...
This isn't to say that the data rates wouldn't be welcome, but I'm a little surprised at the relatively high threshold rate.
Well, this is a radio network and those are signalling rates.
So by the same logic your GSM sensor network would have around 270 kbit/s.
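To put numbers on that (the 270.833 kbit/s gross air-interface rate and the 9.6 kbit/s TCH/F9.6 user rate are standard GSM figures; the split across 8 TDMA timeslots is likewise from the GSM spec), most of the signalling rate disappears into timeslot sharing, framing, and channel coding before any user data gets through:

```python
gross_kbps = 270.833           # GSM gross air-interface signalling rate
per_slot_kbps = gross_kbps / 8  # one of 8 TDMA timeslots
user_kbps = 9.6                 # TCH/F9.6 full-rate data channel
overhead = 1 - user_kbps / per_slot_kbps
print(f"{per_slot_kbps:.1f} kbps per slot, {overhead:.0%} lost to framing/coding")
```

The same gap between signalling rate and usable throughput applies to the 20-400 kbps figures quoted for 802.15.4 sensor networks.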
Whilst it is a neat solution for the 'internet of things', where hundreds of devices could end up in range of a home WiFi hotspot, having an addressable chirp would make spying devices incredibly cheap and virtually undetectable. Baking in a ping-chirp that would wake up and identify all devices would minimise spying.
Spying here is not the Snowden/NSA kerfuffle, but the likes of the Barclays/Santander criminal keyboard-USB intercept to pick up passwords... when devices are cheap enough, every internet café/shared PC will get key-tracking USB sheaths.
To offset the spying risks, the standard really needs a 'ping-chirp' plug-and-play stack.
"The protocol is fully acknowledged"
If the underlying protocol is fully acknowledged, wouldn't that make running TCP over it (where each packet is also acknowledged) be slightly redundant?
Honest question, I'm out beyond the limits of my knowledge here.
>If the underlying protocol is fully acknowledged, wouldn't that make running TCP over it (where each packet is also acknowledged) be slightly redundant?
Correct; however, full acknowledgement is only confirmation of delivery across each link, not guaranteed end-to-end delivery.
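A toy simulation of that distinction (the hop count, loss rate, and retry limit are made-up illustrative numbers): each link ACKs and retries locally, but once a frame is past a hop, that hop forgets it, so the original sender gets no end-to-end confirmation. Closing that gap is what TCP is for.

```python
import random

def send_over_path(hops, p_loss, link_retries=3):
    """Hop-by-hop acknowledgement: each link retries locally until it
    gets an ACK, but a failure at a later hop is invisible to the sender."""
    for hop in range(hops):
        for _attempt in range(link_retries + 1):
            if random.random() > p_loss:
                break            # link-layer ACK received, forward to next hop
        else:
            return False         # retries exhausted mid-path; sender never knows
    return True

random.seed(1)
trials = 10_000
delivered = sum(send_over_path(hops=5, p_loss=0.3) for _ in range(trials))
print(f"{delivered / trials:.1%} delivered end-to-end despite per-link ACKs")
```

Even with every single link acknowledged, a few percent of frames die somewhere along the path, which is why an end-to-end layer on top is not redundant.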
From the available information it would seem that one of the intended operating modes is to support application-to-application communications directly over the MAC service. This tries to strike a balance between the energy-efficiency design goal and real-time communications. [Aside: the MAP Enhanced Performance Architecture specified something similar for real-time factory-floor communications back in the 1980s, but used the 'off-the-shelf' IEEE 802.2 LLC Type 3 service over 802.4.]