It's as if MS are back where Apple were in the late '90s
Only they've taken the blue pill instead of the red one.
They know the OS needs an overhaul, but they're bodging that overhaul as well instead of re-architecting the OS from the ground up.
While chief technology officer Ray Ozzie was away in the clouds at Microsoft's Professional Developer Conference, technical fellow Mark Russinovich got down and dirty with the true heart of Windows - the kernel. He presented a two-hour session on changes made to the kernel used by both Windows 7 and Server 2008 R2, shedding …
Most of this sounds good, particularly MinWin and the layering. Microsoft have to get there from where they are now, not from some ideal starting point, and at least they are finally making the right moves. As a middleware designer I know a bit about carrying lots of legacy code bases around; I can't imagine the problems they face.
Dumping synchronous RPC would help to eliminate those nasty dependencies though, so ditching OLE and its bastard children would be a good place to start.
I've a lot of respect for Russinovich, I suspect the rather good resource monitor in Windows 7 has a lot to do with him.
Bit nervous about the OS trying to disguise bad apps though.
And this comes from a person who knows more about Windows than anyone on Earth. Oh, and this is about the latest version of Windows. Does this mean Microsoft people do not know what they're doing? My God, it's like flying on a plane where the pilot does not know exactly what some of the flight instruments are doing!
It's good to see Microsoft focusing more on the developer's problems. That's where the company's strengths lie: Tools and technologies for developers.
The .VHD thing sounds a lot like the .DMG (virtual disk) container files used to distribute apps on OS X. Interesting move, though it'll be interesting to see if it takes off. (Windows' architecture makes OS X-style drag-drop installs unlikely without some major development policy changes, but it's still a good way to distribute software over the 'net.)
Re. DLL Hell:
OS X applications are actually folders which the GUI treats as a single entity. You can open them up (right-click and choose "Show Package Contents") to get at the actual application code, supporting libraries and other app-specific resources. This approach removes the "DLL Hell" issue at the expense of requiring more storage space. (The latter is a lot cheaper now than it once was.) An advantage of this approach is that even many non-trivial applications can be installed by simply dragging and dropping them onto the OS X Applications folder. MS Office 2008 does this; the individual apps install any shared libraries and resources the suite needs on first use.
This approach also means far fewer interdependencies, which is why installing OS X apps and updates to the OS itself require far fewer restarts. (It's still not "none at all" though.) Non-trivial Windows apps generally rely on a more complex installation process. It's not really more difficult for end users, but it's nowhere near as elegant and all that shared cruft means you tend to see more "Restart Now" buttons after an installation or update.
I've noticed that Windows Vista and 7 seem to be trying to reduce this interdependency problem. The present architecture makes this inherently more difficult to take to its ultimate level, so it'll be interesting to see how far they take MinWin and whether they can capitalise on it to refactor their OS.
(NOTE TO THE FLAMEBOIS: I'm not a fan of *any* OS, computer system or manufacturer. There is no such thing as a One True Way, or a "best" operating system, any more than there's such a thing as a "best length of string". I've been using—and programming—computers since the days of the ZX81 and have seen umpteen operating systems, marketing approaches and "paradigms" come and go. Your opinion matters not one jot to me and will be ignored: I don't make decisions on which tools I use based on the foam-flecked rantings of some random, insecure little tit posting on the internet.)
"It amounts to around 150 binaries, and requires 25MB disk space ..."
A little way to go yet, Microsoft. I remember a QNX demo FLOPPY a few years back that had a windowing system, TCP/IP, a web browser, etc. On a fricking floppy.
I always like playing around with minimal installations. I've wasted countless hours doing trial and error Solaris minimal installs. Just for fun. I wouldn't know where to start with Windows.
> Russinovich admitted: "We don't really understand those dependencies".
> MinWin is a first step in making Windows layered, maintainable and understandable.
Am I the only person worried about this sort of thing?
It's no wonder that they try and stick rigidly to the "Software's different" line and "so we can't possibly be expected to be liable for product quality"
> Microsoft observed that 15 per cent of all user-mode crashes
User mode crashes?
How the hell do you crash an OS from user mode?
Crashing is a kernel mode function. Anything that crashes in user mode should be able to just die and leave the kernel running - haven't they heard of privilege levels or some such.
Microsoft's business strategy has been a series of opportunistic scams since the day they sold QDOS to IBM as an operating system. That strategy was always doomed long term; even Bernie Madoff got caught out eventually. The problem MSFT has now is threefold: first, how to back out of their architectural dead end; second, how to keep their existing base of users while doing it; and third, how to lock out competing OSes from being able to run Windows applications better than Windows does. That third concern is much of what drove Win95's design. Windows 3.11 applications ran better on OS/2 than on Windows. Win95's abandonment of any modularity or layering put paid to OS/2, but at great cost. If MSFT implements a rational OS with distinct and well-defined interfaces, they open themselves up to that level of competition again. If they don't fix their OS, they continue to have a defective product that fewer and fewer people want to buy. The inertia of monopoly can only take you so far.
>The problem is that the operating system is full of internal dependencies, and as Russinovich admitted: "We don't really understand those dependencies".
Yeah, I have heard rumors that Windows is so full of circular dependencies that M$ has to try to build it 200 times in order to get one successful build (it takes a room full of computers several days to do so). It's obvious that the decision to go against 50 years of computer science and not build an OS in a modular fashion was made by the marketing suits and not by the software architects, and it has bitten M$ in the ass in a major way, from a legal as well as a security standpoint (IE: the great malware portal right into the heart of the OS). The first step to getting out of a very deep hole is to stop digging, and by admitting some of these mistakes it sounds like the new generation at M$ might finally be getting it. Let's just see if they can get it done before M$ becomes the irrelevant company (à la Digital) of this generation. WPA and all the DRM that slows down everything points otherwise.
I don't understand why MS remains hell bent on providing legacy compatibility at the expense of building a truly great OS. We already have a superb solution for running old apps - it's called Windows XP, and it runs just dandy in a virtual machine. In their OS9 > OSX transition, Apple (hisssss!) demonstrated that people WILL abandon ship IF you give them something worth jumping for.
Microsoft need to stop dicking about taking the entire world on, and focus on their core competencies - namely OS and productivity apps. Innovation FTW!!
The author speaks as though only the Windows kernel is full of workarounds... give me a break! Look around you: anyone who has worked on the Linux source code will know that it is full of workarounds and magic numbers and all that jazz.
Workarounds are the way software is written so as not to sacrifice backward compatibility. It's a necessary evil, albeit one that could be avoided with a very well planned design. Get used to it!
1) "FTH will over-allocate memory, and keep a copy of freed memory so that attempts to re-read it will succeed." [I'm guessing this was done primarily for Microsoft and HP programs.]
2) So what is the point of UAC? "UAC is not an anti-malware solution. It is about one thing, which is about getting you guys to write your code so that it runs well as standard user."
Is there any way the above two "features" are not opposites? The first says "We encourage crap programming", while the second says "We demand quality programming from you, but, no, sorry, it's not actually going to help things because ... [I don't know why, I guess because programming is too hard compared to holding press conferences]"?
Free ProTip: When your employees go out of their way to "fix" security and stability issues by intentionally doing the exact opposite of what OpenBSD does, you should probably encourage them to recareer themselves.
Good to see some positive stuff out of Redmond.
I most liked the bit about getting the 3rd parties to write stuff that works on a restricted user account - full-access-requiring softjunk is the bane of my day job. In *NIX land (where I live when not being paid) such programming gets laughed at.
Microsoft can start whittling away the dependencies by stripping out the DRM, hidden Internet Explorer hooks and publishing "all" the APIs.
Mark is a very smart man, I attribute a good portion of the improvements of 7 over Vista to him. His SysInternals utilities are an invaluable asset to any IT professional that has to work on Windows machines.
Well, this MinWin thing has been batted around for years. I remember, a long time ago, being told that it would be included in the version of Windows after Vista; instead we get Vista SP2 masquerading as Win 7.
Don't get me wrong, I'm using Win 7 as the OS on top of which I've installed FF3.5 and it does work, but I have not noticed any speedups when clicking on the Start menu; in fact, I think it is quite sluggish to respond to clicks on the Start menu. Vista, on the same hardware, actually 'seems' faster.
Of course, I'm just about to reboot to perform an upgrade to the real operating system on this computer, if the servers have the capacity that is...
Andy (F12 here I come!)
What's up? This article has been online for at least 4 hours, and still no bile-fuelled spew from commentards who don't know shit from shinola, but they can tell you that Windows is crap, because only stupid people use it.
I'd love to get the chance to hear Russinovich's talk - there's a guy who knows his stuff.
"The problem is that the operating system is full of internal dependencies, and as Russinovich admitted: "We don't really understand those dependencies"."
The bigger problem is that the NT kernel source was poorly documented and had inadequate coding standards and management .. all code is "dependent" on how it's programmed to work with other code .. and back then programmers often had different approaches. It's tough indeed to look at older source and see the intent.
Kudos to MS for reaching this deep .. almost to the point of 'design' change, IMO
Re: "... Its solution was a feature called the Fault Tolerant Heap (FTH)."
So the "permanent fix" is not to create a solution but rather a built-in work around that simply covers up the problem (after a few tries) and makes it looks like all is well? I'd be ashamed to admit that.
Cover-ups are like that; yes they are.
QUOTE : The problem is that the operating system is full of internal dependencies, and as Russinovich admitted: "We don't really understand those dependencies".
At least they're honest about this. To be fair, they have kept the API reasonably stable. Simple win2k stuff still runs.
Thumbs up for MinWin. Reorganize, it's high time. Please do leave us detailed err details...
Whatever you do when you finally reorganize, just don't break win32. I have kind of grown used to it. Well ok if you must, give us a better C api. Some of us just don't like that C#, managed C++ shite.
Well F@@# that...
Pointers are a bitch, no matter how good one is, one will inevitably stumble, but honestly, this is not a solution. In fact, this is possibly going to make things more of a bitch to debug.
And, who are they covering anyways, THEIR sloppy coding or ours?
I'm just saying... If they are actively encouraging people to use things like C# which have auto GC, then whose fault is it when invalid pointers are accessed?
I wonder if they are saying their foundation is that shaky :) LOL
Even though they seem finally to be grasping the benefits and competitive advantages of a modular architecture, I can't imagine MS will ever eliminate the underlying dependencies without ultimately compounding and obfuscating the architecture, if only to prevent the possibility that a third party may develop a much more efficient and reliable modular component.
The "memory game" workaround is most concerning, and I can only deduce that memory creep will again be a major issue with Win7, not to mention the remote possibility of disastrous results as program A, which would normally crash on access to previously unallocated memory, now obtains whatever value may have been there and uses it as, say, a disk write address. One never knows whether that bit of code that didn't get executed before is OK to run now with "assistance". If the developer is unable to properly manage memory, it is quite reasonable to expect that there may be other issues within the code. It appears to me that attempting to make an app more reliable has the potential to make the OS less reliable.
Still, appreciate very much Mark's straightforwardness. Perhaps this is a new era for Microsoft.
>Windows 7 is version 6.1, not because it is a minor release,
>but for compatibility with applications that check the major
>number and would not run if it said 7.
Right - so it's a Major Release, but *sooooooo* similar to 6 that the only thing stopping older stuff from working on it is whether the version number says 7?
That's quite similar then. You know, like it was a point release or something. Except it isn't a point release. No. Not at all. Really. Glad we got that cleared up then.
Considering how many bad programmers are hard at work creating bad software for Windows, it's not surprising that MS have had to put a lot of workarounds into their code to avoid the egg-on-face scenario when new versions roll out. Yes, I'm looking at you. No names, but I see you in the audience and you know who you are. Now stop creating heap corruption, you worms! It's 90% of programmers that give the rest a bad reputation...
As for Mr. Russinovich, he is and always has been a Good Guy. Doesn't mean he always has to be nice though. I liked the way he reframed UAC and made the developers of badly behaved programs responsible for user pain.
I'm running Seven on same hardware as my WinXP system (2.66 dual core Xeon/2 GB RAM) and, while better than Vista, still not XP. And I'm not running high end apps; just a ticketing system, Office 07, and some SMS tools.
Am getting ready to butt heads with management over my going back to XP again. It just works better.
This is yet more evidence that NT4 really was the pinnacle. They watered it down in an attempt to create a Win2000 that could merge the NT and 9x families and allow games and such to disregard proper design and security. Supposedly NT4 was the clean rewrite that left legacy concerns behind. Then they corrupted it, and now they're attempting to do it AGAIN.
GO (around and around and around...)
Doug Glass writes:
So the "permanent fix" is not to create a solution but rather a built-in work around that simply covers up the problem (after a few tries) and makes it looks like all is well? I'd be ashamed to admit that.
- If you read the "Old New Thing" blog linked by another comment, then you'll get an idea of what MS is up against; they don't want to get blamed when crappy software breaks after MS changes how an interface is implemented. An example seen on more than one OS is that if you change memory management to trap buffer overruns, it'll break badly written code. [*cough* Mozilla on OpenBSD *cough*] Who gets blamed, and who is supposed to fix the problem? If MS is the one getting blamed because "it worked just fine under the previous version of Windows", and even better if the program is not being actively maintained by the ISV, then naturally MS will try to reduce the incidence of such failures.
They could of course take a principled stance; maybe throw up a dialog saying "sorry, this crappy program died because it was written by clueless programmers; you really ought to update it or give it the flick"... with a single button labelled "oh, OK then" to dismiss the dialog. How happy do you think the users, or indeed the ISVs would be with that approach? You have to keep all those "developers, developers, developers" writing stuff for your platform, even if some of them are not very good at coding.
I personally feel that virtualizing old environments for crappy software to run in is a more attractive option than keeping all the workarounds in the code base of the current OS, but maybe that's unrealistic...
especially @AC: Can't you troll elsewhere?
Only kids complain about their OS. If you don't like it, change it. There's no need to 'defend Microsoft' if someone doesn't like Windows.
Drama queens are upset no one is crying? Please give us a break! There should be moderation on these comments.
Err, wasn't the poorly written (and unintentional) "fault tolerant" heap in the Win9x codebase what got MS into such trouble when they decided we plebes might actually try using the Internet? Now they think they can do it right just because they're doing it intentionally? The hax0rs must be slobbering their keyboards over this...
I wish MS had a "device driver fault tolerant" feature in Windows 7. I have never seen so many Blue Screens as I have in the last few weeks trying to get Win7 up and running.
BTW it is very common for windows application programmers to ignore return codes from windows api calls. This is commonly found even in sample code from Microsoft. Most api failures don't express themselves well when explained to users, which is why many programs simply don't work for no apparent reason, or issue a helpful "Critical Error" message or just crash.
It's a reasonable point, but it's also a trap. Layering compatibility kludges on top of other compatibility kludges that were built on top of someone's really bad idea from a 1987 Friday afternoon special is how Microsoft wind up in the situation where they don't know what their own dependencies are and have to implement virtual DLLs (I mean...really?) in the first place. And it's also an excellent way to make your developer base so confused about how the hell everything's supposed to work (am I supposed to link this against the virtual DLL or the real DLL or the really-real DLL that lives in MinWin? Where the hell did I put my whisky?) that you can be assured of having lots more crappy code you have to hack stuff up to be 'compatible' with in the future. It's a never-ending, guaranteed-to-keep-getting-wackier cycle, in other words.
Of course, other models aren't perfect either. The Linux world prefers to keep its developers in a state of bafflement by making sure everything gets rewritten every two weeks just to keep everyone on their toes. But you really do have to question Microsoft's approach when they're getting to the points of frankly surreal behaviour documented here and on that excellent MSDN blog someone linked to. The longer they delay killing off the crazy, the more it's going to hurt when they have to.
"... into the same DLL for performance reasons"
As well as making it impossible to disentangle apps from the core, like MS Explorer. And forcing any 3rd-party apps to duplicate non-core functions if they want to replace functionality they might be able to do better. Poor structure coupled with a regularly changed API makes a very effective way to limit competition.
"we don't really understand these dependancies"
Am I the only person here whose jaw dropped at that?
You have *all* the API calls, and their parameters, whether or not you share them with outside developers.
You have the source to the compilers which compile it, so if needed you can hack them.
You have a team of tool builders who can write tools to process this source, or develop specific trace tools to write an API call map in real time.
So what is he saying?
They *can't* figure out what calls what when the system is running.
Or they *can't* figure out how to de-circularise this calling.
This would truly meet the definition for FUBAR (correct spelling)
The Windows kernel is *not* some multi-headed beast whose behaviour is unknown and unknowable and attacks without warning (except to users).
It's large and complex and badly structured by this description.
Saying "we don't understand it" reminds me of the decades long protestations of tobacco companies that "We don't know if Nicotine is addidictive or carcinogenic. Honest." While they developed treatments to dial the level up or down in processing.
He scores points for honesty. No doubt this will be followed (in about 12 months) by the usual "Windows 8 is a major improvement on Windows 7" line.
...Tim Anderson for the best read on the Reg in months. As soon as I saw the name Mark Russinovich I knew I would be in for a good read.
...Microsoft for employing Mark, putting him in a position where he can do good work and giving him the authority to do so.
...Microsoft again for grasping the nettle.
...Mark Russinovich for telling it like it is. No bullshit, no spin.
...Sean Timarco Baggaley for the best commentard flame of the year.
It's just occurred to me that I'm actually interested to see an MS OS for the first time (in a VM naturally) . And it's not even at SP1 yet. I need a lie down.
I personally have no intention of migrating my huge IT estate to Windows 7. It's just pure bloatware, with an install footprint of 2.3GB!
It's good to see that HP still know what corporate customers want. All HP business desktops still come with an XP Pro image as the default install. I hope this continues.
Take a look at some companies with massive PC estates, such as Delta Airlines or BA, for example. What do they use on every unit worldwide?
NT4 SP6!
A 310MB install footprint and 80MB RAM consumption at idle make for a rapid and smooth OS on modern equipment, which is why check-in kiosks run so nicely.
Windows is fast becoming a victim of its own success. It is everywhere now; we are so dependent on it that there are hundreds of people working on "emulators" like Wine, CeDega and CrossOver, and on VMs that will allow any OS to run Windows apps and Windows base installs. This is the perfect time for MS to stop fannying about pulling Windows into an ever-deepening dead end. MS could side with one of these groups and develop something that will support old Windows apps. MS could then start doing an Apple (dump OS9, all hail OS X): turn around and say that in three years' time Windows as you know it will be dead, we will start again with something new and radical, we have a rock-solid VM layer in there for your apps, but you need to start coding to this new OS standard. They won't; the potential upset to the businesses that feed the MS machine would be colossal, possibly fatal.
MS are starting to step out on to shaky ground if they continue to maintain this ancient codebase without some major, radical overhaul.
Sorry to say it, but Windows 7, with its 15 year-old codebase, is still a pig even when plastered with make-up, very expensive make-up!
Finally, someone with a sense of perspective; the worst software I've come across tends to be in-house applications that were thrown together by some office junior while on work placement five years ago, that inexplicably become vital to the operation of the company (although not critical enough to employ anyone to code it properly).
If this application breaks because of a new operating system, guess who gets the blame? The office junior? No, Microsoft.