If you polish a Turd...
... it's still a Turd.
Windows CE might seem like a bad dream from this distance, but an IoT developer speaking at Microsoft's Ignite gabfest in Australia has told The Register it represents a starting point that now positions Redmond well to respond to the Internet of Things. Mitch Denny, CTO of Readify and a .NET developer for several years, is …
... it's still a Turd.
"... it's still a Turd."
Presumably why Linux is still at ~1% PC market share...
"Presumably why Linux is still at ~1% PC market share..."
Presumably you are only counting PCs sold through retail channels.
1%. Don't believe that!
If you count operating systems running on things around the home, you will find Unix/Linux destroys Windows.
I just did this. 17 *nix devices. 1 windows PC
Several Android phones and tablets
Sony a6000 camera (Android)
LIFX bulbs, Linux
It's like Android doesn't even exist for you, or smart TVs (the new PC), or Chrome. The only thing I got from this is that the same root hacks will be applied to their OS in every single case to provide them with backdoor access, including keyloggers, storage access and network monitoring. Nothing coming out of M$ can be trusted anymore, not one program.
> Nothing coming out of M$ can be trusted anymore, not one program.
Which is a shame because flat ass millennial web 3.0 look aside Windows 10 is actually really decent (check my post history to see how rarely I give M$ even faint praise). Figures they finally get their shit together and then decide to ruin it by going full Google with the spyware. Fuck their SaaS only CEO. Makes you almost miss monkey boy Ballmer.
Running the same code, or even code written in the same language, on different devices is not very important. In fact, if you do so you will most likely run into problems:
1. Your protocols and data formats will be rather badly defined: you might even just copy blocks of memory around (in languages like C) or use the dangerous serialisation and de-serialisation methods your language provides (for example in Java). Even in the less bad cases, your format will be defined by code, not by a specification you wrote. That's usually a bad idea.
2. You will hit hardware limits. While it may be possible to write firmware for a microcontroller in your unified language of choice, you will run out of resources much earlier. Storing even a 10-kilobyte runtime on your controller means choosing one with at least 16 kilobytes of flash when one with 2 kilobytes would otherwise have done. That's many cents per controller.
3. You don't actually gain much, as all the cool features you like from your favourite language won't be there in the down sized embedded version. Or if they are available they consume more memory than you have.
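Point 1 is easy to demonstrate. Here is a minimal sketch (my own illustration, not from the post): a format defined by code versus a format defined by a one-line spec. The field names are made up.

```python
# Illustration of "format defined by code, not by a specification".

import pickle

reading = {"sensor": "freezer-1", "temp_c": "-18.5"}

# Code-defined: the wire format is whatever pickle happens to emit for
# this Python version -- opaque, language-specific, and unsafe to load
# from untrusted peers.
blob = pickle.dumps(reading)

# Spec-defined: "one name:value pair per line, values are text".
# Any language can produce or parse this from that one-line spec alone.
def encode(pairs):
    return "".join(f"{k}:{v}\n" for k, v in pairs.items())

def decode(text):
    return dict(line.split(":", 1) for line in text.splitlines() if line)

wire = encode(reading)
assert decode(wire) == reading
```

The round-trip works in any language with string splitting; the pickle blob only round-trips through Python.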
However, I can see how such an idea might look appealing to people in the Windows world. There you often have binary file formats or XML. Getting data from one program to another is already hard if you use the same programming language; inter-language communication between programs is largely unknown. Windows people never grew up in a world of simple reusable tools which take simple text formats as input and produce similar formats as output.
In a way, standardizing on one language/codebase is like standardizing on one graph.
What about the Christmas ham trade?
What about the Christmas ham trade?
If it's a proper ham, it doesn't need refrigeration. Same for bacon. Smoked and salted....
How to make an easy job for hackers even easier. Write once, hack anywhere.
Don't worry about the hackers- it's the Crackers that are important here...
They don't need to; Windows 10 already extracts all the 'telemetry' you own, like it or not.
Windows 10 is just the starting place for collection. Everything is now fair game, and not just with MS either. They're a little late off the starting block, but trying to catch up with Google, currently the leader.
I can see the use for telemetry in situations where there is a strong desire to find out what is going on in an embedded or isolated system - eg 'How does it perform?', 'WTF happened there, then?'.
In such situations I would expect the telemetry to be targeted at how the system behaves and to be fairly light in terms of performance/data-flow impact.
My concern is that 'Telemetry' is becoming a catch-all for sucking out as much debug information as possible. That level of information is way higher than what I would call 'true telemetry' and, apart from the obvious performance hit that it implies, it opens up the possibility of being too revealing.
And of course, it allows developers to forgo the pleasure of ensuring that their embedded software is damned-near perfect, in favour of debugging after deployment. I really hate that vision of the future.
The problem I see is this, and it's one underestimated by people higher up the food chain than the devices: different devices produce very different telemetry.
At a simple level an IoT freezer might report power consumption, compressor vibration and temperature. An IoT central heating system might report a whole lot of statistics, including water flow rate and carbon monoxide level. A printer is going to report consumable levels, perhaps the fuser temperature, internal warnings, and information about what is being printed - with security concerns there. These are all different kinds of data in different formats. A simple protocol of name:value tagged pairs can have significant disadvantages because some data will be in the form of arrays, and this results in unnecessary bloat.
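The array bloat is easy to see with a toy example (my own sketch, with made-up field names): array-valued telemetry forced into flat name:value pairs repeats the name once per element.

```python
# Sketch of the name:value bloat problem for array-valued telemetry.
# 256 vibration samples, illustrative values only.
samples = [0.12, 0.13, 0.11, 0.12] * 64

# Flat name:value pairs: the field name is repeated 256 times.
flat = "".join(f"vibration:{v}\n" for v in samples)

# A single array-valued pair carries the name once.
packed = "vibration:" + ",".join(str(v) for v in samples) + "\n"

print(len(flat), len(packed))  # the flat form is several times larger
```

On a link metered in bytes (or a battery metered in transmit time), that factor matters.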
You see this issue in frameworks like Tivoli, if anybody is still using it, or in Xerox's attempt to remotely collect data on leased copiers, which last time I looked involved a humongous WSDL in order to put data straight into a backend database.
Microsoft is seemingly trying to do something that others have tried and not succeeded very well at doing on a restricted problem set. I suspect that they will start with something straightforward and then everything will turn out to be a corner case and the usual bloat will ensue. I could say "like Windows" but that would be snarky, and besides I think W10 is an improvement. But only over W8.
I see where this is going:
"Hello, Support? My central heating is saying 'PC LOAD LETTER'!"
There is no perfect data transfer format that works for all situations. Different devices are very different in their required interface and complexity. It sounds as if MS thinks everything should run Winblows when in many cases something much simpler and lighter is the correct solution.
"a starting point that now positions Redmond well to respond to the Internet of Things."
"Well" isn't a word I associate with the IoT unless it's a deep hole in the ground into which to drop them.
You don't use a sledgehammer to put a screw into a piece of wood. Unless, y'know, it's right next to you and the screwdriver is still in the toolbox...
It sounds like we're going back to the "write once, run anywhere" ethos that Java once enthused about, and it's likely to encounter the same issues that Java did: the levels of abstraction needed to get the same code running on devices A and B mean that you need more physical storage, more run-time memory, more processing power, and more electrickery to keep things ticking over. And for the IoT ecology, all of these - especially the electricity - are generally in short supply. It's the age old "cheap, powerful, efficient: pick two" dilemma, and in a commodity market, cheapness is generally mandatory.
Also, there's a question about what's going to be done with all the data spewing from these devices. Is someone really going to gather all the stats needed to monitor a fridge compressor - and even if they do, are they going to be able to put together a realtime monitoring mechanism *and* have some way of exposing it securely for customers to access? That sort of thing costs time and money, and unless there's some sort of high-value support contract in place, there's little or no reason to provide it. Especially since in a few years' time there'll probably be a new model of the compressor and the entire thing will have to start again...
(To be fair, there is a case to be made for having a widget sat atop the freezer that monitors for pre-defined, short-term issues - a change in the compressor's RPM or power usage, a prolonged change in temperature, etc - and punts out an alert via email to the butcher and/or the company which provides the support contract for the freezer. But that's very different to the kind of real-time monitoring/tuning/statistical analysis that Microsoft are talking about, and requires far fewer resources to implement)
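That widget really is a few lines of logic. A hedged sketch of the idea (thresholds, sample counts and names are all invented for illustration): alert only on a *sustained* excursion, not on every blip.

```python
# Local-only freezer widget: pre-defined threshold, sustained-excursion alert.
# All constants are illustrative, not from any real product.

TEMP_LIMIT_C = -15.0     # freezer should stay below this
SUSTAIN_SAMPLES = 3      # consecutive bad readings before we alert

def check(readings, limit=TEMP_LIMIT_C, sustain=SUSTAIN_SAMPLES):
    """Return True if `sustain` consecutive readings exceed the limit."""
    streak = 0
    for t in readings:
        streak = streak + 1 if t > limit else 0
        if streak >= sustain:
            return True
    return False

assert check([-18, -14, -13, -12]) is True    # sustained excursion -> alert
assert check([-18, -14, -18, -14]) is False   # brief blips -> no alert
```

Wire the `True` case to an email and you have the support-contract alert, with no cloud, no Azure, and no telemetry leaving the shop.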
"You don't use a sledgehammer to put a screw into a piece of wood. Unless, y'know, it's right next to you and the screwdriver is still in the toolbox..."
Tell that to a Brummy...
> Also, there's a question about what's going to be done with all the data spewing from these devices.
"""Denny said the longer-term value is what he's trying to develop for: that the behaviour of 30,000 compressors becomes a data set that goes a long way to forecasting what's going on."""
It seems that what MS is aiming for is to have billions of IoT devices all sending their data through Azure where they can scrape it and sell it. The 'forecasting' is of no use to the butcher, it won't tell him the date and time his compressor will fail, he only needs to know, quickly, that it has failed when it does. The 'forecasting' would be done by collecting the failures and selling this data to the manufacturer (and meanwhile charging the butcher for this service).
The '10' in Windows IoT is the increase in cost (10 times) that a Windows IoT solution will have over a rational solution.
>>The 'forecasting' is of no use to the butcher
That would be my thought too, that monitoring and disaster recovery are local processes that are already in place.
Personally I like having sensors on critical devices with the ability to log and monitor remotely.
I'm not too excited about having data broadcast and collected by 3rd parties.. unless I get something in return.
It seems that what MS is aiming for is to have billions of IoT devices all sending their data through Azure where they can scrape it and sell it.
Well, given Intel's WindRiver pitch recently reported in El Reg, MS are obviously keen that the OS and VMs that run on these devices are Windows...
The 'forecasting' is of no use to the butcher, ... selling this data to the manufacturer (and meanwhile charging the butcher for this service).
Yes, preventative maintenance is really the only benefit arising. When you have large numbers of devices you can start to determine real-world MTBF etc., and so send an engineer out to replace a device before it reaches the high-risk, near-certain-to-fail part of its lifecycle.
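The fleet-MTBF bookkeeping behind that is trivially simple. A rough sketch with invented numbers (nothing here comes from the article): average the observed failure ages, then flag units past some fraction of that average.

```python
# Fleet MTBF sketch: illustrative failure ages for one compressor model.
failure_ages_hours = [8200, 9100, 7800, 8900, 8600]
mtbf = sum(failure_ages_hours) / len(failure_ages_hours)

def due_for_service(age_hours, mtbf, margin=0.8):
    """Flag a unit once it has run `margin` of the observed MTBF."""
    return age_hours >= margin * mtbf

print(round(mtbf))                      # mean of the observed ages
assert due_for_service(7000, mtbf)      # inside the high-risk window
assert not due_for_service(5000, mtbf)  # still comfortably young
```

The hard part isn't the arithmetic, it's getting 30,000 devices to report honestly - which is exactly the data MS wants flowing through Azure.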
How has he been a .NET developer for longer than .NET has been around? He's out by a few years...
"How has he been a .NET developer for longer than .NET has been around? He's out by a few years..."
.Net 1.0 beta was released in 2000.
I'm not sure that counts
It does. My company (at the time) were writing live software with .NET beta. I think back now and it seemed so easy but dear Kibo, it was clunky and badly unfinished.
Still. Not everything changes. Windows Forms still suck buttock.
Not even slightly worried
"2. Type and press ENTER"
"3. Type and Setup and press ENTER"
What sort of OS uses broken English like this?
I don't wish to be picky (OK, I do) but for the first time I thought "ha, a good pointless image on a reg article" ... but then I looked.
Disk 1 of 2079?
The latest build on MSDN for 64-bit en-us (you surely don't run 32-bit?) would be approximately 2776 disks on a standard 1.44MB (1.38MB formatted) floppy.
2079 disks gets you to just over 2869MB
Which version of Windows 10 was that the floppy disk for...?
I will not consider Microsoft a serious contender in the IoT field until they are able to keep up with the latest standards and support them. For example, OS X, Linux, iOS and Android all support Bluetooth Smart 4.2; however, Windows does not: https://msdn.microsoft.com/en-us/library/windows/hardware/dn133849(v=vs.85).aspx Bluetooth Smart 4.2 has been known about for a while now and was officially released over a year ago. Why can all the other major OSes support the standard but not Microsoft?
Why can all other major OS's support the standard but not microsoft?
Because Microsoft didn't write it. It's the same thing as the old browser wars, the inclusion of certain format changes in the CD ISO, the inclusion of Microsoft code in some Unixen, and so on. If they don't have a stake in it, they aren't interested, and the users that are affected can go fuck themselves.
Microsoft aren't quite the epitome of a corrupt, self-serving Merkan corporate, but not for want of trying!
Those of us who have to make Bluetooth run under Linux would not recognise your picture.
The trouble is the gap between these imaginary problems and reality. Devices often don't fail gradually in some mensurable way that suits the IoT sales scenario. Even if the compressor flashes a red light (which doesn't need IoT) to warn of a forthcoming failure, the butcher will do what we all do: he'll wait, hoping it will work until he has time to fix it, a time which will likely never come. It won't be fixed until it breaks altogether, whether that's this Christmas or next.
They /should/ have done this a decade and a half ago, instead of relentlessly inventing and abandoning APIs and development tools.
Biting the hand that feeds IT © 1998–2018