This is probably...
The most serious WTF moment I had this year.
What the FCUK is this useful for?
The question is serious.
Microsoft plans to emulate x86 instructions on ARM chips, throwing a compatibility lifeline to future Windows tablets and phones. The Cobalt project should bear fruit within the year, when the "Redstone 3" release of Windows 10 is due to arrive, Mary Jo Foley reports. Ideally, Microsoft wants everyone to convert their old …
Lots of companies have legacy apps that will never get ported to ARM.
Microsoft is hoping everyone will buy Windows phones and tablets, which may use ARM chips.
This allows your company Windows phone to still run your legacy x86 app on an ARM chip.
Probably no one is going to buy Windows ARM phones or tablets for their whole company, but this is Microsoft trying to make it slightly more attractive.
Also, if ARM based servers suddenly become popular this could be useful, but that's not looking very likely.
MJF covers "all things Microsoft". All the action is in mobile these days, and she would like to stay relevant, but Microsoft is dead in mobile. So rumors like this give her hope. That is what it's good for. She thinks it might bring Microsoft Windows to mobile.
It won't. Emulation is slow. It's not gonna happen. She fell for this once before, when reporting on the release of the original ARM based Surface.
How you bring Windows to the mobile ARM platform is simple: you don't ever stop developing it for multiple platforms. Unfortunately for Microsoft, they crossed that bridge and burned it over 20 years ago.
"It won't. Emulation is slow. It's not gonna happen. She fell for this once before, when reporting on the release of the original ARM based Surface."
Emulation isn't necessarily "slow" these days - and hasn't been for decades: FX!32 ran PKZIP 1.5-2x quicker despite the host processor having a 30% clock-speed deficit. Typical desktop apps make a lot of API calls, which can run natively, and code tends to spend most of its time in loops, so caching translated sequences of instructions yields big benefits. Folks running code on JVMs benefit from those same tricks every day - it's not rocket science any more.
Of course it's better not to require emulation in the first place - but old x86 binaries developed for 300MHz P3s should run just dandy on a 2GHz ARM these days.
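The caching trick described above - translate a block of guest code once, then reuse the translated version every time a hot loop revisits it - can be sketched in a few lines. This is an illustrative toy (a fake IR and a Python dict standing in for real codegen and a code cache), not how FX!32 or any real translator is implemented:

```python
# Toy sketch of the block-translation cache idea behind FX!32-style
# binary translators. Real translators work on machine code; here a
# "guest block" is just a list of fake IR ops.

translation_cache = {}  # guest block address -> "compiled" host function

def translate_block(guest_ir):
    """Pretend-expensive codegen: done once per block, result cached."""
    ops = list(guest_ir)
    def run(state):
        for op, arg in ops:
            if op == "add":
                state["acc"] += arg
            elif op == "mul":
                state["acc"] *= arg
        return state
    return run

def execute(pc, program, state):
    """Fetch-or-translate: only the first visit to a block pays the
    translation cost; later iterations run the cached host code."""
    fn = translation_cache.get(pc)
    if fn is None:
        fn = translate_block(program[pc])
        translation_cache[pc] = fn
    return fn(state)

program = {0: [("add", 2), ("mul", 3)]}
state = {"acc": 1}
for _ in range(1000):   # a hot loop: block 0 is translated exactly once
    state = execute(0, program, state)
```

The point is the economics: the hot loop pays the translation cost once, on the first visit, and every subsequent iteration runs cached "host" code - the same reason JIT-compiled JVM code gets fast after warm-up.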
Google for Apple Rosetta to get a better idea of what and why. This is the Microsoft equivalent, except it's for Windows apps, and for going from Intel to ARM.
I think it's a good idea. It will be even better if, unlike Rosetta, it's supported for more than just a couple years and then dropped.
Plus there are many high-performance applications that work close to the metal. They squeeze out every last drop of performance using less-documented features and tricks and so on. In fact, many of these programs turn out to not be forward-compatible because newer CPUs drop or change the features and make them crash.
Yes, very sensible, except for licensing. Do you add a license cost to each dock in which case it may as well be a PC with a USB cable to access to phone data? If not then there is nothing stopping you buying a cheap windows phone and running Exchange Pro-Annihilator on the dock with no license payable.
The dock used with Continuum has no brains. It's just a hub to allow wired connections to screen, keyboard and mouse.
I assume the limit of only allowing x86 with Continuum is due to the problems of trying to drive a mouse and keyboard on a phone - never a clever idea.
I'm not that great with licenses, but why not flip the license around, and license the dataset? The processor it runs on in this case is irrelevant to the data that is being worked on. License the data, and then the user gets the ability to either underpower or overpower the system according to their own needs in order to use the data for their work. If they want to just slowly churn through an emulated Outlook on ARM, then when back in the office dock it into a full PC, the same dataset is being used.
That's only valid when you're using a dock. Continuum works wirelessly too.
Also, you're not thinking about the bigger picture. If Microsoft can get x86 applications running on ARM, that also opens up the door for ARM based laptop and desktop machines too. Which would really get Intel worried.
I don't pretend to understand the low-level technical concepts of what they want to do here, but the gist does lead me to wonder: Microsoft has had a very good CPU-independent program execution engine for at least 16 years: the Common Language Runtime. I continue to be surprised that they have only ever considered it a platform to run applications on. Had Microsoft invested in making parts of the operating system *itself* run on the CLR (or maybe some special version of it), we'd already be years into developing CPU-agnostic applications that could run effortlessly regardless of the underlying CPU, ARM or Intel. As application consumers we simply wouldn't care. Sure, some CPUs would be better than others, but CPU manufacturers would have developed new devices specifically to target the environment, maybe running the CLR instruction set (or a subset) as native on-silicon instructions.
I think Microsoft are 10 years behind where they should/could be on this issue. Instead, they're faffing around changing the user interface (flat GUI, I'm looking at you) with each operating system release, polishing the same turd over and over.
I think the obsession with always-connected, mobile computing has pushed progress (not necessarily innovation, but certainly progress) back significantly.
But you need something CPU-specific to load the CPU-agnostic code. Or, to put it another way, some sort of CPU-specific system that can be used by someone operating the computer.
You could possibly call it an operating system...
We'd already be years into developing CPU agnostic applications that could run effortlessly regardless of the underlying CPU
You mean something that we could "write once, and run anywhere"? We've had Java for the last 20 years. But even then, you still need a CPU and operating-system specific runtime. Oracle currently lists seven JREs for Linux (x86 and x64), Mac OS X (x64), Solaris (SPARC and x64) and Windows (x86 and x64).
They tried and succeeded, actually:
"The Singularity research codebase and design evolved to become the Midori advanced-development OS project. While never reaching commercial release, at one time Midori powered all of Microsoft’s natural language search service for the West Coast and Asia."
LISP machines compiled to machine code.
The problem was that the LISP CPU was never followed up. It was actually a dual-CPU design: one M680xx just for garbage collection, the other microcoded to be the LISP interpreter.
It worked - and fairly well, but with no family of CPUs planned the company died before being able to come up with a successor.
Remember Longhorn? The project that got restarted after 4 years of development and eventually became Vista? Its first iteration used the CLR extensively for OS features, but it just didn't work well enough. Some pieces (WPF) got released separately, but MS won't make the same mistake again.
It's hard to say whether the new cross-platform (hardware platforms) strategy will work. I don't see the demand for Windows on phones and tablets, at all.
>> I don't see the demand for Windows on phones and tablets, at all.
I'll get more interested when/if I can get a docking phone with multiple monitors, inputs, outputs, etc... as powerful as my desktop. For now I'm still holding on to my flip phone and a Windows 7 desktop.
The problem is that there are plenty of people out there in the desktop world who demand performance in their applications. Businesses, for one, who have to keep an eye on throughput, as well as specialists like 3D artists and gamers, both of whom push the envelope in their need for better performance.
Now ask yourself why no one writes a high-performance program in something like Java. Simple, it's like trying to hitch a low-friction wagon to a mule or kitting a race car with a Mini engine. All the performance gets lost in translation.
Your argument doesn't hold water.
All the examples you give are distributed across multiple servers - i.e. scale-out architectures.
This doesn't make them high-performance applications.
You are trying (unsuccessfully) to compare them to a single compiled executable that runs on a single host.
Some of them, sure. Lucene is a single process kind of library though, and it is plenty fast enough.
I am/was being a bit disingenuous however - the one thing all these high-performance Java applications have in common is that they have no GUI. Java GUIs suck very very badly in performance terms, which makes ignorant people think that Java sucks very very badly in performance terms.
I don't mind; it's dead handy seeing who the ignorant people are.
Now ask yourself why no one writes a high-performance program in something like Java....All the performance gets lost in translation.
Now ask yourself why a quick Google for "Java algo trading" returns 173,000 results.
It's a shame you're posting anonymously. There are probably lots of financial institutions that would love to contact you and tap into your extensive knowledge of high-performance software.
"Now ask yourself why a quick Google for "Java algo trading" returns 173,000 results."
Actually that's falling foul of the old 'claim a search term has X thousand results when in fact it doesn't' trick.
I get 138,000 results, but if I click the 10 at the bottom of the first page to go to page 10, and then click 14 to go to page 14, what happens?
You get the "In order to show you the most relevant results, we have omitted some entries very similar to the 139 already displayed. If you like, you can repeat the search with the omitted results included."
In your case I'm guessing page 17 or thereabouts will show the same message. So there aren't 178 *thousand* results at all.
Yes, yes, in 2003 they swore everything would be written in .NET, including the OS. It never happened, despite their efforts at crippling Windows by using .NET where it wasn't required (just open the Event Viewer in XP and then in 7 to feel the whole difference...)
They were and are just a different flavour of Java. Windows Phone 8.x, meanwhile, was technically good and not resource hungry (unlike Android) exactly because it got rid of all the Silverlight/.NET stuff and ran native apps.
Actually they did try that. I don't know about now, but at the time, the CLR was so significantly slower than running native instructions that they ended up having to abandon the entire concept and rewrite from scratch. Could you imagine trying to emulate physical hardware on not just a foreign architecture, but another virtual machine? It would be downright painful.
but at the time, the CLR was so significantly slower than running native instructions that they ended up having to abandon the entire concept and rewrite from scratch
Makes me wonder why they didn't try again. Ngen has been around since 2005, and it compiles CLR into native x86/x64 images.
I don't think C# nativized would be much worse than a good C++ compiler.
> Had Microsoft invested in making parts of the operating system *itself* run on the CLR (or maybe some special version of it)
Allegedly they tried that, according to developers. The reason Vista came 6 years after XP and was hurriedly thrown together* was that the plan was to release a CLR-based OS in 2004 or so, supposedly to run on x86 and PowerPC. It appears they also wanted to build an XPC, similar to the Xbox 360, so that they could take over some of the OEMs' business to increase revenue and become more like Apple.
The problem was that they couldn't get it to work, and what did work was really slow.
* Enterprise contracts ran for a 3-year period. They signed up a lot of them around the release of XP. There was no update in 2003/4, but renewals were signed anyway. If there was nothing in 2006/7 then a lot of contracts would not have been renewed, so they had to get _something_ out, and Vista was the result.
Microsoft should have required Intel mobile chips in their Continuum phones. It is painfully obvious that a Windows device that can't run the majority of Windows software is a terrible idea. This was amply demonstrated with WinRT, so how could they be dumb enough to make the same mistake twice?
Perhaps in time they could have offered ARM devices with emulation, or perhaps laid the foundation for universal binaries built against Win32 that compiled to whatever native platform they were executed on. Or both options.
But sending Continuum out to die was just a bad idea.
I bought a Linx 8-inch tablet 2 Christmases ago with an Atom Z3735G that cost me £80 and runs Windows 10 (upgraded from 8). While I mostly use it like a tablet, I can and have used the desktop just fine. It's not going to win a prize for speed but it's enough to run desktop apps like word processors and similar.
While Intel was late to the mobile market, their chips are competitive with ARM chips. Performance / battery benchmarks show that time and again. And in a device which is supposed to turn into a Windows desktop it seems a no-brainer to at least start off with an x86 compatible device.
I doubt emulating x86 on ARM is going to lead to a good experience at all. I suppose Microsoft could machine translate DLLs to native ARM instructions on first load but even that is going to be suboptimal.
Like DrXym I've got a couple of Atom Z3735 devices and the Linx 7 isn't much bigger than a Phablet. After the failure of RT and with the new Atom x3/5/7 out I can't help but think Microsoft missed a trick not going with x64/x86 phones. It would have then been a relative doddle to put ARM emulation on top of that.
This too is doomed.
Only larger tablets are suitable for "legacy" x86 apps, and even then emulation gives rotten performance (a VM makes it worse, though more secure).
I was disappointed. I thought they were adding a better x86 compatibility layer for Windows 10 DESKTOP x64, that would avoid RAM and licence overhead of a full VM with XP 32bit (or earlier).
Doing this for ARM phones is pointless.
The idea of Continuum was you had a Windows phone in your pocket, but if you docked it in some kind of port replicator or big tablet frame you could use it as a desktop. So the expectation was not to use the desktop on a small screen, but to allow a desktop on a big screen.
But it kind of sucks to have your desktop and discover there are no apps for it since it's running ARM...
> Perhaps in time they could have offered ARM devices with emulation, or perhaps laid the foundation for universal binaries built against Win32 that compiled to whatever native platform they were executed on.
And for reference, Apple did this successfully: migrating their entire platform and user base from PowerPC to Intel. This was using a combination of emulating PowerPC in software, and getting developers to build "fat binaries" compiled to run natively on both platforms.
What this would require on the OS side is a full-fat Windows built for ARM (not cut-down Windows RT).
Perhaps if MS were to threaten to migrate their whole user base to ARM (Apple-style), Intel would sit up and take notice?
Apple did it twice even. Moto 68k (the original Macs) to PowerPC with 'fat binaries' and then PowerPC to Intel with 'universal binaries'. This has the added built-in marketing feature that when the old platform is finally deprecated they can claim to free up disk space.
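For reference, the 'fat'/'universal' binary mechanism mentioned above is just a small header listing one architecture slice per CPU type; the loader picks the slice matching the running CPU. A minimal sketch of reading that header, following the layout documented in Apple's mach-o/fat.h (big-endian magic 0xCAFEBABE, a slice count, then five 32-bit fields per slice) - the example data below is fabricated:

```python
# Read the architecture slices out of a Mach-O fat (universal) binary
# header. Format per Apple's mach-o/fat.h: all fields big-endian.
import struct

FAT_MAGIC = 0xCAFEBABE

def list_slices(data: bytes):
    magic, nfat = struct.unpack_from(">II", data, 0)
    if magic != FAT_MAGIC:
        raise ValueError("not a fat binary")
    slices = []
    for i in range(nfat):
        # cputype, cpusubtype, file offset, size, alignment (as log2)
        cputype, cpusub, offset, size, align = struct.unpack_from(
            ">IIIII", data, 8 + i * 20)
        slices.append({"cputype": cputype, "offset": offset, "size": size})
    return slices

# Build a fake two-slice header (cputype 7 = x86, 18 = PowerPC)
header = struct.pack(">II", FAT_MAGIC, 2)
header += struct.pack(">IIIII", 7, 3, 4096, 1000, 12)
header += struct.pack(">IIIII", 18, 0, 8192, 1200, 12)
print([s["cputype"] for s in list_slices(header)])  # [7, 18]
```

Everything outside the header is just the two (or more) complete native binaries laid end to end, which is why deleting the unused slices "frees up disk space".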
Apple's library was noticeably smaller than Microsoft's, thus making it easier to convert. Also, Apple controlled the hardware chain, so there were very predictable hardware specs for the devs to work with. Even during the brief stint with third-party PPC Macs, the specs were still pretty strict.
>Apple did this twice
But they were going from slower to faster CPUs.
Going the other way is madness, doubling down on disappointment: first with a slower CPU and then with emulation.
If you want to run ARM and x86 together, find some way to add an ARM core to an x86 chip package, and find some way to freeze/thaw data when you power each chipset on/off so you can share data quickly, or not at all.
"Apple did it twice even. Moto 68k (the original Macs) to PowerPC with 'fat binaries' and then PowerPC to Intel with 'universal binaries'."
Yes, but they went to more powerful platforms, so the overhead could be absorbed.
I don't think ARM is faster than Intel; it just runs much better optimised stuff most of the time due to its roots.
PPC CPUs were faster than the available M68K CPUs, and x86 CPUs were faster than the available PPC CPUs.
With ARM they go the wrong direction. What would be the incentive for people to "upgrade" to ARM Windows on an ARM PC for less performance? People would keep buying x86 CPUs and stick with an x86 version of Windows.
It's like when Intel hoped to force the PC market to Itanium to get 64 bits. They thought that by withholding 64-bit CPUs, people who needed 64 bits would go Itanium, and as memory sizes increased eventually even desktop PCs would go Itanium (i.e. the last dregs of x86 CPUs would be sold in low-end PCs today). But they didn't figure on AMD successfully creating their own 64-bit extension to x86, and Microsoft accepting it.
This is the one flaw in the idea about Apple transitioning the Mac to ARM. Yes, Apple's SoCs are by far the fastest ARMs around, and compare favorably on a performance per watt basis with Intel's x86 CPUs. But they are still only half the performance of Intel's highest end x86, and while a design targeted at using more power could bump that up somewhat, it would still mean a transition that costs performance - a hard sell for Apple's professional creative market where a performance drop of 5-10% might be tolerated, but not 30-40%.
Now there's no reason to think that it is impossible for Apple to design an ARM SoC that matches Intel's performance, and if Apple can do it Microsoft could (eventually, once they had a competent team like Apple does) do it. But it hasn't been done yet, and simply matching performance just gives you a migration for migration's sake. Apple could get away with it (especially if it meant bringing Continuum-like capability to the iPhone that could really run all Mac apps) but Microsoft has almost no userbase to amortize those development costs with - though I guess they have proven over the past 15 years that they're not averse to throwing billions down a black hole so who knows?
> Apple's SoCs are by far the fastest ARMs around, and compare favorably on a performance per watt basis with Intel's x86 CPUs.
While Apple's may be the fastest _mobile_ SoC, it is not the fastest ARM chip. For example see X-Gene 3 for servers. https://www.apm.com/products/data-center/x-gene-family/
ARMs _trounce_ Intel on performance per watt, and also on performance per dollar in most cases.
They have the same clock rate, and Apple's designs are very aggressive in terms of IPC - nearly at Intel levels - so I think it is actually not very likely XGene's per core performance is better than Apple's. Now obviously per chip it will blow away the A10, because it has so many more cores, but that's a different argument.
As for performance per watt, the less power you use the better your performance per watt. Since most ARM designs are using 1 to 2 watts, of course they blow Intel away. But Intel's own CPUs show that cherry picking the best CPUs and calling them 'U' series instead of 'K' series makes them about 5x better on performance per watt. If they downclocked/downvolted them further to 2 watts they'd do even better. Still not as good as ARM designs, but Intel is designing their cores to work best in the 20-50 watt range, while Apple and Qualcomm design theirs to work best in the 1-2 watt range - with appropriately adjusted pipeline sizes in terms of FO4 stages, etc. Modern CPUs have so much going on that the more complicated decoder required for x86 is simply lost in the noise in a multi billion transistor chip.
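The downclock/downvolt point is easy to see from the usual dynamic-power approximation P ≈ C·V²·f: performance scales roughly with clock, so performance per watt goes as 1/V², and a lower-voltage operating point wins superlinearly. The figures below are made up purely to show the shape of the argument, not real chip data:

```python
# Back-of-envelope only: dynamic CPU power ~ C * V^2 * f, and we assume
# performance ~ f. Then perf/watt = f / (C * V^2 * f) = 1 / (C * V^2),
# so voltage dominates. All numbers here are illustrative, not measured.

def perf_per_watt(freq_ghz, volts, capacitance=1.0):
    power = capacitance * volts ** 2 * freq_ghz  # relative units
    perf = freq_ghz                              # assume perf tracks clock
    return perf / power

desktop = perf_per_watt(4.0, 1.2)  # a high-clock, high-voltage point
mobile = perf_per_watt(2.0, 0.8)   # downclocked and downvolted

print(mobile / desktop)  # (1.2/0.8)^2 = roughly 2.25x better perf/watt
```

So halving performance at the low-voltage point still comes out well ahead on perf/watt, which is why parts binned and tuned for 1-2W look so much better on that metric than the same microarchitecture run at 20-50W.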
Have an upvote DougS:
"Modern CPUs have so much going on that the more complicated decoder required for x86 is simply lost in the noise in a multi billion transistor chip."
I'd still rather that chips did not have complex decoders, simply because it makes design verification difficult, and that is important because I want bug-free chips... I want bug-free chips because it's hard to replace a chip soldered onto a board - or buried inside a rack. The x86 errata sheets & documentation around the various permissions mechanisms tend to be byzantine - and Intel have a rather ugly track record of pretending security vulnerabilities in their chips don't matter. :(
The thing that ARM chips typically lack in comparison to their Intel competition is memory bandwidth - and that requires you to drive a lot of wires at high frequencies - which burns a lot of juice. My suspicion is that ARM chips with equivalent STREAM bench figures may actually burn a comparable amount of power to an Intel chip...
Apple won't do it for the high-performance machines but you should expect to see MacBooks with A-xx CPUs in the not-too-distant future.
I expect that by now Apple is capable of building a Rosetta-like tool, but even if not - who do you think owns the Rosetta IP these days? IBM, that's who. And has anyone failed to notice the cozy relationship between IBM and Apple? Not if you're this side of dead.
x86 on ARM will happen. The fat lady has sung.
Microsoft should have required Intel mobile chips in their Continuum phones.
The whole thing started with the WinRT fiasco. They could, and possibly should, have included something like Rosetta, possibly backed with something like Transmeta's emulation microcode on the hardware, maybe throw in some ART style JIT cache; depends a bit on which resources are available. But they were worried about cannibalising their existing market and pissing off Intel.
For the last few years all mobile chips have been beefy enough for this kind of thing, which Intel demonstrated with their ARM emulation code for x86 Atoms: nearly all apps ran fine, at the cost of battery use.
My guess is that they're getting ready for the long-heralded but Zarquonesque arrival of ARM on servers and possibly desktops. Many of us expected that to be from Apple this year but they were too busy raising margins and adding fluff. Still, can't be too long before we see some impressive reference designs.
Around the world are many small software houses, supplying specialist and sometimes bespoke software to a moderate number of customers.
Probably late into the world of Windows, with a small number of programmers, often self-taught, but doing software no one else does.
Now they have their x86 software running on Windows using Win32, sometimes built with unusual tools (lots of these people used XBase in the 80s and 90s and moved to something unusual to keep their code base).
Anyway, people have to buy what they sell, and on Windows that's no problem. But they are worried now about MS abandoning the Windows market, as a total rewrite would take, say, 5 years.
To them Windows means x86 or similar.
It does not mean some new language which may not be really suitable for specialised data entry screens with large client server setups.
"Only .net adds some level of CPU independence."
and does it POORLY at the expense of application performance, and hauling around that monolithic dead-man-on-the-back known as the ".Not" runtime...
And cross-compilers aren't THAT hard to use...
I don't think being able to run legacy x86 code on phones is that important. Those legacy line-of-business applications were designed for desktop PCs; nevermind binary compatibility, they won't work on a phone because they weren't designed for phones. I don't think that many people have *real* use cases for this.
>nevermind binary compatibility, they won't work on a phone because they weren't designed for phones.
That is very true, but MS are pursuing the idea of letting people plug their phone into a monitor, mouse and keyboard. Ubuntu were making similar noises, but Apple have taken a different approach (documents and draft emails on your phone are handed over to your Mac, presumably through iCloud or somesuch).
I'm not sure why - SoCs are so cheap these days you might as well just have a second PC instead of a dumb dock.
The only advantage to MS's approach I can think is security - you'd be using your own personal device instead of running your software on an untrusted PC. You'd still have to have trust that the keyboard wasn't logging keystrokes and the monitor wasn't grabbing screen shots, though this wouldn't leave you as wide open as running your software on an untrusted PC.
Still, if working away from an office is your thing, just use a laptop.
I can't understand why more tablet makers don't let them be used as second monitors. You could have an ARM tablet that acts as a screen for a headless x86 box for when you need it. Shit, we could have the x86 box built into the keyboard, a form-factor I've not seen since the Amiga :)
"Still, if working away from an office is your thing, just use a laptop."
But laptops are increasingly being seen as security risks: being too big to really be portable, they become easier to lose. A case of bigger NOT being better. At least phones are usually kept in pockets, clips, or holsters: basically, on the person, and usually with locks. They're considered more personal so more attention tends to be paid to them.
Those legacy line-of-business applications were designed for desktop PCs; nevermind binary compatibility, they won't work on a phone because they weren't designed for phones.
Reminds me of a pocket computer I had running Windows CE. It was rather useless - Microsoft assumed you'd use a start button with hierarchical menus to get at programs. Oh yes, you could make icons, but it wasn't easy on a small device.
As you said, MS set the bar that apps had to be ported to their UWP platform and sold that way, but nobody really bothered, as they could see the downsides.
We were told it was there for lots of good marketing BS reasons about security and cross platform etc - yea, nobody bought that either.
Now we're going to have emulated x86 on ARM - can't see that performing very well. I wonder how they are going to do things like emulate the keyboard and mouse, given that they apparently had to re-write the whole OS to make that work - or has someone just swept this little issue / marketing BS under the carpet too?
I expect that many are already writing their apps on other platforms as they can see the writing on the wall for Windows.
yes, and UWP also shoves their 2D FLUGLY crippled "universally dumbed down" user interface up everyone's backside, to make us accept inferior computer performance and all of the adware and spyware and tracking and micro-shaft logins and yotta yotta yotta aka "normalize it" and only runs on Win-10-nic... not a good selling point. I'm not buying.
Microsoft was there once before with Windows NT, mostly through no fault or ingenuity of their own.
DEC was desperate enough to sell their Alpha-based workstations that they developed FX!32. This was a binary translator rather than a pure emulator, and it did achieve some very impressive speeds - I distinctly remember my Alpha outperforming native x86 workstations when running x86 code.
Nonetheless, that wasn't enough to save Alpha in the long run :(
Finally a reason to run Windows for ARM over Android. Now they only need to find a way to automatically adapt desktop UIs to make them usable on small touchscreen devices. Alternatively they could introduce a phone with stylus and keyboard.
The point is that they cannot out-iPhone the iPhone. If they want to succeed, they need to build on something only they can provide... in the case of Microsoft that's running legacy Win32 and Win16 code.
I completely understand what Microsoft are trying to do from a strategic perspective and if they can pull it off it would be great. But I can't help think that they are bringing with them so much extra baggage compared to other OS providers that they won't be able to achieve their goals.
Chrome OS supports Android apps and developers seem to be happy creating new Android apps instead of converting legacy Windows apps - Google seems to be getting in a better position than Microsoft.
"Chrome OS supports Android apps and developers seem to be happy creating new Android apps instead of converting legacy Windows apps - Google seems to be getting in a better position than Microsoft."
Except most Android apps aren't business-oriented, built around internal networks, keyboards, mice, and so on. IOW, there's a general dearth of productivity software. High-performance gaming is also notably absent from the Android lineup because the specs basically demand a PC with plenty of RAM (not just storage) and graphics capability that would melt most tablets.
"Except most Android apps aren't business-oriented, built around internal networks, keyboards, mice, and so on"
You're right, they probably aren't, but I bet the number of business-oriented mobile apps is increasing as companies look at "mobile first" strategies. Once you have the programming logic in place, designing the GUI for different interfaces isn't that difficult; it's been done for websites for years. I agree with you on the lack of productivity apps, but you're talking about today, I'm talking about the next 3-4 years.
I agree on the high performance gaming, but for slightly different reasons. There has been a dark cloud hanging over PC gaming for years, console gaming seems to be taking the market share (Just to be clear I prefer PC gaming and don't own a console). But now Windows 10 and Xbox one share a common platform I'm hoping this will change. High performance gaming developers have no interest in Android, only consoles.
"I agree on the high performance gaming, but for slightly different reasons. There has been a dark cloud hanging over PC gaming for years, console gaming seems to be taking the market share (Just to be clear I prefer PC gaming and don't own a console). But now Windows 10 and Xbox one share a common platform I'm hoping this will change. High performance gaming developers have no interest in Android, only consoles."
But you still have games that REQUIRE a keyboard, mouse, and good eyes to play. WoW's still the 800-lb gorilla of online gaming, and while Overwatch will expand Blizzard's reach, they know where their real money lies. Unless and until WoW becomes playable on a PS4 the same way you can with a PC, PC gaming and the like aren't going anywhere soon.
How many people need to run "legacy" apps? Most people use a browser, office, skype..... and not much more.
CAD? Better off with a dedicated machine - a tablet/phone won't have the graphics and cpu horsepower and storage for anything other than minor stuff.
There will be a market. But I'll guess that it isn't that large of a percent of the total market. So this will be a nice feature but no game changer.
"How many people need to run "legacy" apps?"
a) an application I purchased in the early noughties should STILL work on a new computer.
b) purchased games that are still entertaining, 15 years later. And they might even perform a bit better on 'modern' hardware.
c) I don't want to shell out $$$ for a VERSION UPGRADE for my favorite software package every time Microshaft excretes another OS/change-for-the-sake-of-change.
I doubt I'm alone on this.
For us it is me doing 90% of the development; going from DOS to Windows was a huge move, took us FIVE years.
I am nervous: how long will Win32 be available? Or will we have to make our users put 7 on their new machines in 10 years' time?
I have started investigating WINE, but some C modules do not work; my dev IDE does, however.
I look at the new GUIs coming out, absolutely shit for data entry.
"How many people need to run "legacy" apps? Most people use a browser, Office, Skype... and not much more."
At home, maybe, but in companies legacy code is essential. For example, we use WS_FTP95 at my current company as the only allowed FTP client. At a company I was at before (from 2008) we were using Protel98, an electronic CAD package from 1998, with no plans to ever upgrade.
There's plenty of software running in companies that will never be updated, because it runs and because Win32 used to be more or less stable. In fact there are even many software packages, like Praxident, which were maintained over two decades but assimilated all the old technologies that seemed hip at the time. Those packages use everything from OCX components and OLE Automation, through direct access to printers, up to .NET.
The business market is still important for Microsoft. Office is one of their most profitable products, and companies are likely to purchase profitable service contracts.
The consumer market is long lost to Android anyhow.
I mean, if you look at Microsoft, the only thing that's consistently worked for the last 20 years was the Win32 API on x86. If you wrote a program in 1996 using only that, it's very likely it'll still work perfectly fine today. If you were smart, it won't even need any kind of installation or framework.
Now Microsoft is finally trying to do what they do best: running Win32 code.
To succeed you need to find something you can do well. For the iPhone this was shiny design; for Android it was the (broken) promise of an open system. For Microsoft it has always been running legacy code from previous decades.
And it's always been that way: even if you look at the famous Windows/386 commercial, you'll notice that they are mostly running DOS software in their shiny new Windows/386.
Even well after the year 2000, people often ran DOS software for some applications.
FPGAs are for desktops, niches, or prototypes - horribly power-hungry and expensive compared to an x86 hardware subsystem.
Legacy x86 software needs a decent screen, so it's really only usable on a decent tablet (the Surface Pro is x86-family anyway?) or docked, simply using the phone as portable storage. Put the x86 in the dock.
Well, actually, back when that Windows software was written, 640x480 was an OK resolution, with 1024x768 being about the maximum you could get.
If Microsoft were to either find a way to rearrange GUIs so they fit on those tiny screens, or bring out a mobile device with a keyboard and pen, those software packages would be useful again.
Also, there are a lot of Win32 software packages around which are still maintained. They could adapt the GUI without changing the rest of the system. This would give those applications a bridge.
Furthermore there's also quite some Win32 stuff, like VPN clients, which do not really need a GUI.
I wrote an x86 & PC emulator for Windows CE around 1998; it interpreted the instruction set and the standard peripherals of a PC such as the PIC, but also paravirtualised many of the interrupt calls. We didn't call it paravirtualisation back then, though. It ran at about the speed of an 8MHz PC AT on a 33MHz MIPS processor.
ISTR Linux had DOSEMU at about that time too.
You discover weird stuff undergoing such an exercise when you can't get DOS to boot: one or two undocumented processor features were used for example.
The market? As the article states, there are shedloads of vertical market niche applications out there that will never be re-coded in this month's hottest language/framework.
It was both a shame and a stunning marketing fail that Microsoft deliberately crippled Windows RT to Modern UI-only apps from third parties; that pretty much killed it off before it was born. Maybe they did learn something.
you know, there's such a PATHETIC number of people actually USING Win-10-nic phones, that it's really just a WASTE! OF! TIME! even TARGETING this kind of thing.
Micro-shaft is busy walking over gold bars to pick up PENNIES. They've completely screwed the pooch and dropped the ball for existing desktop users, and also alienated their "developers developers developers developers" by INSISTING on this path towards self-destruction for everyone ELSE, too. I guess FAIL likes to take others with them so that they don't "feel so bad" about being FAILURES.
UWP isn't selling Win-10-nic. Developers aren't buying it. Adding x86 emulation for a PHONE is just a waste of time. Has anyone else out there tried using qemu in emulation mode? Performance STINKS when you emulate! Unless that x86 application was written for MS-DOS back in the early 1980's, it's not going to perform very well on a PHONE, and you probably won't even be able to USE it properly.
SO many levels of FAIL here.
They kinda did sort their shit out back in 2008/2009. They showed off Win 7 + Office recompiled for ARM at a trade show, and everyone thought "MS is really getting to grips with ARM". And then instead they did Windows RT and Windows 8, and it all turned into a debacle...
Windows has traditionally been reasonably retargetable; they just failed to spot where things might go with ARM. They could do it right now, but with UWP targeting the non-existent mobile platform (Windows 10 Mobile) they're still missing the point.
"How to run normal x86 Windows apps on your Windows RT"
"How to Jailbreak Your Windows RT"
As far as I know, Windows RT was the ARM version of Windows 8. Unfortunately and obviously, only some exes (probably simple applications) can run. http://forum.xda-developers.com/showthread.php?t=2092348
As an aside, Microsoft should probably have done this themselves, like before 2012, with their 'Mobile First' motto.
I am not sure I fully understand the ramifications here, but it sounds like Microsoft is fighting Android. If Microsoft makes it easy to run x86 apps on Android, why buy Windows? If they do not... why buy Windows? Or have I missed the point? Quite probably, seeing as I am not interested in apps, I do not tweet, and I can see no point in Facebook. I use a 'phone to talk to my friends. You know, the spoken word, using whole sentences, unlike Ronald Chump.
...well, kind of. In the late '80s Acorn were selling an MS-DOS emulator for the Archimedes, the machine for which the first ARM chip was created. As you'd expect of an emulation, in most use-cases it was a bit slower than a PC, but still pretty good, and IIRC a few apps ran faster on the ARM.
Better solution: ditch Windows and just go GNU/Linux.
If that must-have legacy app is one you develop, or you have the source, you can use winelib to port it more easily.
If that must-have legacy app is closed-source, maybe you can get it to run on ARM GNU/Linux with Wine's ARM support. https://wiki.winehq.org/ARM
Of course, really, you just finally bite the bullet and replace that legacy app.
Biting the hand that feeds IT © 1998–2019