8051
Intel's take on a PIC.
PIC architecture isn't pretty, but I hated the 8051 in 1983.
The Internet of Things (IoT) is growing an estimated five times more quickly than the overall embedded processing market, so it's no wonder chip suppliers are flocking to fit out connected cars, home gateways, wearables and streetlights as quickly as they can. However, the sector is so new that there is considerable …
The 8051 is estimated to be a $1.5 billion market annually. Most of the sales are in SE Asia, as it's an open-source core perfect for the countless low-end applications out there that don't need more than 4 MHz to run.
If I told you how many $$$ of 8051s Atmel sells each year, the number is so high you wouldn't believe me. Silicon Labs' 8051 sales are even higher.
The 'sector' doesn't really exist, and because they have no idea what the projected 30 billion things on the internet are going to be doing, they have no idea how much processing power those things will need.
Same as the recent moaning about the lack of standards for communication between things. Hard to write a specification when your requirement is little more than a wet finger stuck in the air.
"Will this force usage of IPv6 soon?"
No.
When people speak of the IoT they don't really mean that these devices are connected directly to the internet via IP.
What they mean is that these would be accessible via the internet, i.e. you might have a "home controller" on the actual internet, but that uses Bluetooth etc. to communicate with the hundreds or thousands of devices in your house (if you believe the hype).
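To make that concrete, here's a minimal C sketch of the gateway pattern, purely an illustration: the device table, the address format and the radio_send() stub are all hypothetical stand-ins for a real Bluetooth/ZigBee stack. The point is that only the controller speaks IP; the devices behind it never see a packet.

```c
/* Hypothetical sketch of the "home controller" pattern: one
 * IP-reachable gateway fronting many non-IP devices. */
#include <stdio.h>
#include <stdint.h>
#include <string.h>

/* A device on the local (non-IP) link, e.g. a 48-bit radio ID. */
typedef struct {
    const char *name;     /* name exposed on the internet side */
    uint8_t     addr[6];  /* address on the local radio link   */
} device_t;

/* The gateway's table of devices it fronts (made-up entries). */
static const device_t devices[] = {
    { "hall-light", {0x00,0x1A,0x7D,0x11,0x22,0x33} },
    { "thermostat", {0x00,0x1A,0x7D,0x44,0x55,0x66} },
};

/* Stub for the local radio; a real gateway would call into a
 * Bluetooth/ZigBee/etc. stack here. */
static void radio_send(const uint8_t addr[6], const char *cmd)
{
    printf("radio -> %02X:%02X:%02X:%02X:%02X:%02X : %s\n",
           addr[0], addr[1], addr[2], addr[3], addr[4], addr[5], cmd);
}

/* Entry point for a command arriving from the internet side:
 * look up the local address and forward over the local link. */
static int gateway_dispatch(const char *name, const char *cmd)
{
    for (size_t i = 0; i < sizeof devices / sizeof devices[0]; i++) {
        if (strcmp(devices[i].name, name) == 0) {
            radio_send(devices[i].addr, cmd);
            return 0;
        }
    }
    return -1; /* unknown device */
}

int main(void)
{
    gateway_dispatch("hall-light", "on");  /* one public address...  */
    gateway_dispatch("thermostat", "21C"); /* ...many local devices  */
    return 0;
}
```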
All this means that Quark may be a strong challenger to some ARM-based embedded processors…
Where's the evidence for this assertion? Is the Quark being used anywhere in volume? And if, as the article goes on to argue, even cheaper open source hardware is starting to appear, how does Intel stack up there?
I thought Quark was supposed to be Intel's gateway drug for embedded. Its power consumption is still well above that of the M-series, so it waves the x86 instruction set to attract attention. Personally, I think ARM has an increasingly attractive argument about the same toolchain across devices.
Milliseconds?
Most of the lights in my house are compact fluorescents - they seem to have a delay of getting on for a second or so before they do much anyway!
I could rub two boy scouts together and get a fire going quicker than some CFs come on ------------------------------------------------------>
"The M7 supports 64-bit data transfer and can execute two instructions in parallel, which will be important for markets where instantaneous response to changing conditions is required, such as lighting management."
I actually laughed out loud at this. I hope this is just copy/pasted from a press release and not the result of any serious or considered analysis.
The whole "article" is just cut&paste out of variouse press releases. The Quark stuff directly from Intel, the 8051 directly from Silicon Labs.
Only ARM covers all the bases, from the Cortex-M0 (with devices starting at 28c that run for a year or more on a coin cell) to multicore application processors running Linux/whatever.
The Cortex-M7 mostly adds DSP instructions over the Cortex-M4. Even that was no slouch. For digital I/O and simple control tasks even a Cortex-M0 is plenty. You only really need an M4 or even an M7 if you want to process data streams like audio.
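For a sense of what "processing data streams" actually demands, here's a rough C sketch of the kind of inner loop where the M4/M7 multiply-accumulate and SIMD hardware earns its keep. The Q15 coefficients and block size are made up for illustration:

```c
/* Fixed-point FIR filter over an audio block: the classic DSP
 * inner loop. Taps and block size are hypothetical. */
#include <stdint.h>
#include <stdio.h>

#define NUM_TAPS 8
#define BLOCK    16

/* Q15 coefficients of a made-up low-pass filter (sum ~= 1.0). */
static const int16_t coeff[NUM_TAPS] = {
    983, 2621, 5243, 7537, 7537, 5243, 2621, 983
};

/* out[n] = sum over k of coeff[k] * in[n-k], in Q15 arithmetic. */
static void fir_q15(const int16_t *in, int16_t *out, int n)
{
    for (int i = 0; i < n; i++) {
        int32_t acc = 0;                /* 32-bit accumulator */
        for (int k = 0; k < NUM_TAPS; k++) {
            int idx = i - k;
            acc += (int32_t)coeff[k] * (idx >= 0 ? in[idx] : 0);
        }
        out[i] = (int16_t)(acc >> 15);  /* scale back to Q15 */
    }
}

int main(void)
{
    int16_t in[BLOCK], out[BLOCK];
    for (int i = 0; i < BLOCK; i++)
        in[i] = (i % 2) ? 16000 : -16000;  /* crude square wave */
    fir_q15(in, out, BLOCK);
    for (int i = 0; i < BLOCK; i++)
        printf("%d ", out[i]);
    printf("\n");
    return 0;
}
```

On an M4 or M7 the compiler can collapse that multiply-accumulate into single-cycle MAC instructions; an M0 has no MAC and typically grinds through the same loop several times slower per tap, which is exactly why you don't bother with the bigger cores for toggling GPIO pins.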
Yes, the M7 narrows the gap between microcontrollers and "real" processors. But the bottom end of the "processor" market, with CPUs like the Broadcom SoC (Cortex-A based) in the Raspberry Pi, is defined by what runs Linux.
Nobody will dump their Linux-based designs for a Cortex-M, and very, very few microcontroller designs need the additional processing power, so this looks like a very narrow niche.