It's as if MS are back where Apple were in the late 90's
Only they've taken the blue pill instead of the red one.
They know the OS needs an overhaul, but are bodging the overhaul as well instead of re-architecturing the OS from the ground up.
While chief technology officer Ray Ozzie was away in the clouds at Microsoft's Professional Developer Conference, technical fellow Mark Russinovich got down and dirty with the true heart of Windows - the kernel. He presented a two-hour session on changes made to the kernel used by both Windows 7 and Server 2008 R2, shedding …
Most of this sounds good, particularly MinWin and layering. Microsoft have to start from where they are now, not from some ideal, and at least they are finally making the right moves. As a middleware designer I know a bit about carrying lots of legacy code around; I can't imagine the problems they face.
Dumping synchronous RPC would help to eliminate those nasty dependencies though, so ditching OLE and its bastard children would be a good place to start.
I've a lot of respect for Russinovich, I suspect the rather good resource monitor in Windows 7 has a lot to do with him.
Bit nervous about the OS trying to disguise bad apps though.
And this comes from a person who knows more about Windows than anyone on Earth. Oh, and this is about the latest version of Windows. Does this mean Microsoft people do not know what they're doing? My God, it's like flying on a plane where the pilot doesn't know exactly what some of the flight instruments are doing!
It's good to see Microsoft focusing more on the developer's problems. That's where the company's strengths lie: Tools and technologies for developers.
The .VHD thing sounds a lot like the .DMG (virtual disk) container files used to distribute apps on OS X. Interesting move, though it'll be interesting to see if it takes off. (Windows' architecture makes OS X-style drag-drop installs unlikely without some major development policy changes, but it's still a good way to distribute software over the 'net.)
Re. DLL Hell:
OS X applications are actually folders which the GUI treats as a single entity. You can open them up (right-click and choose "Show Package Contents") to get at the actual application code, supporting libraries and other app-specific resources. This approach removes the "DLL Hell" issue at the expense of requiring more storage space. (The latter is a lot cheaper now than it once was.) An advantage of this approach is that even many non-trivial applications can be installed by simply dragging and dropping them onto the OS X Applications folder. MS Office 2008 does this; the individual apps install any shared libraries and resources the suite needs on first use.
This approach also means far fewer interdependencies, which is why installing OS X apps and updates to the OS itself require far fewer restarts. (It's still not "none at all" though.) Non-trivial Windows apps generally rely on a more complex installation process. It's not really more difficult for end users, but it's nowhere near as elegant and all that shared cruft means you tend to see more "Restart Now" buttons after an installation or update.
I've noticed that Windows Vista and 7 seem to be trying to reduce this interdependency problem. The present architecture makes this inherently more difficult to take to its ultimate level, so it'll be interesting to see how far they take MinWin and whether they can capitalise on it to refactor their OS.
(NOTE TO THE FLAMEBOIS: I'm not a fan of *any* OS, computer system or manufacturer. There is no such thing as a One True Way, or a "best" operating system, any more than there's such a thing as a "best length of string". I've been using—and programming—computers since the days of the ZX81 and have seen umpteen operating systems, marketing approaches and "paradigms" come and go. Your opinion matters not one jot to me and will be ignored: I don't make decisions on which tools I use based on the foam-flecked rantings of some random, insecure little tit posting on the internet.)
"It amounts to around 150 binaries, and requires 25MB disk space ..."
A little way to go yet, Microsoft. I remember a QNX demo FLOPPY a few years back that had a windowing system, TCP/IP, a web browser, etc. On a fricking floppy.
I always like playing around with minimal installations. I've wasted countless hours doing trial and error Solaris minimal installs. Just for fun. I wouldn't know where to start with Windows.
> Russinovich admitted: "We don't really understand those dependencies".
> MinWin is a first step in making Windows layered, maintainable and understandable.
Am I the only person worried about this sort of thing?
It's no wonder that they try and stick rigidly to the "Software's different" line and "so we can't possibly be expected to be liable for product quality".
> Microsoft observed that 15 per cent of all user-mode crashes
User mode crashes?
How the hell do you crash an OS from user mode?
Crashing is a kernel mode function. Anything that crashes in user mode should be able to just die and leave the kernel running - haven't they heard of privilege levels or some such?
Microsoft's business strategy has been a series of opportunistic scams since the day they sold QDOS to IBM as an operating system. That strategy was always doomed long term; even Bernie Madoff got caught out eventually. The problem MSFT has now is threefold: first, how to back out of their architectural dead-end; second, how to maintain their existing base of users while doing it; and third, how to lock competing OSes out of running Windows applications better than Windows does. That third concern is much of what drove Win95's design. Windows 3.11 applications ran better on OS/2 than on Windows. Win95's abandonment of any modularity or layering put paid to OS/2, but at great cost. If MSFT implements a rational OS with distinct and well defined interfaces, then they open themselves up to that level of competition again. If they don't fix their OS, then they continue to have a defective product that fewer and fewer people want to buy. The inertia of monopoly can only take you so far.
>The problem is that the operating system is full of internal dependencies, and as Russinovich admitted: "We don't really understand those dependencies".
Yeah, I have heard rumours that Windows is so full of circular dependencies that M$ has to try to build it 200 times in order to get one successful build (it takes a room full of computers several days to do so). It's obvious that the decision to go against 50 years of computer science and not build an OS in a modular fashion was made by the marketing suits and not by the software architects, and it has bitten M$ in the ass in a major way (from a legal as well as a security standpoint, i.e. the great malware portal right into the heart of the OS). The first step to getting out of a very deep hole is to stop digging, and by admitting some of these mistakes it sounds like the new generation at M$ might finally be getting it. Let's just see if they can get it done before M$ becomes the irrelevant company (à la Digital) of this generation. WPA and all the DRM that slows everything down point otherwise.
I don't understand why MS remains hell bent on providing legacy compatibility at the expense of building a truly great OS. We already have a superb solution for running old apps - it's called Windows XP, and it runs just dandy in a virtual machine. In their OS9 > OSX transition, Apple (hisssss!) demonstrated that people WILL abandon ship IF you give them something worth jumping for.
Microsoft need to stop dicking about taking the entire world on, and focus on their core competencies - namely OS and productivity apps. Innovation FTW!!
The author speaks as though only Windows kernel is full of workarounds ... give me a break! Look around you .... anyone who has worked on the Linux source code will know that it is full of workarounds and magic numbers and all that jazz.
Workarounds are the way software is written, so as not to sacrifice backward compatibility. It's a necessary evil, albeit one that could be avoided with a very well planned design. Get used to it!
1) "FTH will over-allocate memory, and keep a copy of freed memory so that attempts to re-read it will succeed." [I'm guessing this was done primarily for Microsoft and HP programs.]
2) So what is the point of UAC? "UAC is not an anti-malware solution. It is about one thing, which is about getting you guys to write your code so that it runs well as standard user."
Is there any way the above two "features" are not opposites? The first says "We encourage crap programming", while the second says "We demand quality programming from you, but, no, sorry, it's not actually going to help things because ... [I don't know why, I guess because programming is too hard compared to holding press conferences]"?
Free ProTip: When your employees go out of their way to "fix" security and stability issues by intentionally doing the exact opposite of what OpenBSD does, you should probably encourage them to recareer themselves.
Good to see some positive stuff out of Redmond.
I most liked the bit about getting the 3rd parties to write stuff that works on a restricted user account - full-access-requiring softjunk is the bane of my day job. In *NIX land (where I live when not being paid) such programming gets laughed at.
Microsoft can start whittling away the dependencies by stripping out the DRM, hidden Internet Explorer hooks and publishing "all" the APIs.
Mark is a very smart man, I attribute a good portion of the improvements of 7 over Vista to him. His SysInternals utilities are an invaluable asset to any IT professional that has to work on Windows machines.
Well, this MinWin thing has been batted around for years. I remember, a long time ago, being told that it would be included in the version of Windows after Vista; instead we get Vista SP2 masquerading as Win 7.
Don't get me wrong, I'm using Win 7 as the OS on top of which I've installed FF3.5 and it does work, but I have not noticed any speedups when clicking on the Start menu; in fact, I think it is quite sluggish to respond to clicks on the Start menu. Vista, on the same hardware, actually 'seems' faster.
Of course, I'm just about to reboot to perform an upgrade to the real operating system on this computer, if the servers have the capacity that is...
Andy (F12 here I come!)
What's up? This article has been online for at least 4 hours, and still no bile-fuelled spew from commentards who don't know shit from shinola, but they can tell you that Windows is crap, because only stupid people use it.
I'd love to get the chance to hear Russinovich's talk - there's a guy who knows his stuff.
"The problem is that the operating system is full of internal dependencies, and as Russinovich admitted: "We don't really understand those dependencies"."
The bigger problem is that the NT kernel source was poorly documented and had inadequate coding standards and management. All code is "dependent" on how it's programmed to work with other code, and back then programmers often had different approaches. It's tough indeed to look at older source and see the intent.
Kudos to MS for reaching this deep .. almost to the point of 'design' change, IMO
Re: "... Its solution was a feature called the Fault Tolerant Heap (FTH)."
So the "permanent fix" is not to create a solution but rather a built-in workaround that simply covers up the problem (after a few tries) and makes it look like all is well? I'd be ashamed to admit that.
Cover-ups are like that; yes they are.
QUOTE : The problem is that the operating system is full of internal dependencies, and as Russinovich admitted: "We don't really understand those dependencies".
At least they're honest about this. To be fair, they have kept the API reasonably stable. Simple win2k stuff still runs.
Thumbs up for MinWin. Reorganize, it's high time. Please do leave us detailed err details...
Whatever you do when you finally reorganize, just don't break Win32. I have kind of grown used to it. Well, OK, if you must, give us a better C API. Some of us just don't like that C#, managed C++ shite.
Well F@@# that...
Pointers are a bitch, no matter how good one is, one will inevitably stumble, but honestly, this is not a solution. In fact, this is possibly going to make things more of a bitch to debug.
And, who are they covering anyways, THEIR sloppy coding or ours?
I'm just saying... If they are actively encouraging people to use things like C# which have auto GC, then whose fault is it when invalid pointers are accessed?
I wonder if they are saying their foundation is that shaky :) LOL
Even though they seem finally to be grasping the benefits and competitive advantages of a modular architecture, I can't imagine MS will ever eliminate the underlying dependencies without ultimately compounding them and obfuscating the architecture, if only to prevent the possibility that a third party may develop a much more efficient and reliable modular component.
The "memory game" workaround is most concerning, and I can only deduce that memory creep will again be a major issue with Win7, not to mention the remote possibility of disastrous results as program A, which would normally crash on access to previously freed memory, now obtains whatever value may have been there and uses it as, say, a disk write address. One never knows whether that bit of code that didn't get executed before is OK to run now with "assistance". If the developer is unable to properly manage memory, it is quite reasonable to expect that there may be other issues within the code. It appears to me that attempting to make an app more reliable has the potential to make the OS less reliable.
Still, appreciate very much Mark's straightforwardness. Perhaps this is a new era for Microsoft.
>Windows 7 is version 6.1, not because it is a minor release,
>but for compatibility with applications that check the major
>number and would not run if it said 7.
Right - so it's a Major Release, but *sooooooo* similar to 6 that the only thing stopping older stuff from working on it is whether the version number says 7?
That's quite similar then. You know, like it was a point release or something. Except it isn't a point release. No. Not at all. Really. Glad we got that cleared up then.
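The kind of app-side check being pandered to looks something like this (a made-up fragment of my own, not real installer code):

```c
/* A lazy installer-style version check: accept major version 6 and
   nothing else. Report 7.0 honestly and this app refuses to run;
   report 6.1 and it's perfectly happy -- hence Windows 7
   identifying itself as version 6.1. */
int app_will_run(int major, int minor) {
    (void)minor;          /* the buggy check ignores the minor number */
    return major == 6;
}
```

So the "6.1" is less about how big the release is and more about how many shipped programs contain a line like that.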
Considering how many bad programmers are hard at work creating bad software for Windows, it's not surprising that MS have had to put a lot of workarounds into their code to avoid the egg-on-face scenario when new versions roll out. Yes, I'm looking at you. No names, but I see you in the audience and you know who you are. Now stop creating heap corruption, you worms! It's 90% of programmers that give the rest a bad reputation...
As for Mr. Russinovich, he is and always has been a Good Guy. Doesn't mean he always has to be nice though. I liked the way he reframed UAC and made the developers of badly behaved programs responsible for user pain.
I'm running Seven on same hardware as my WinXP system (2.66 dual core Xeon/2 GB RAM) and, while better than Vista, still not XP. And I'm not running high end apps; just a ticketing system, Office 07, and some SMS tools.
Am getting ready to butt heads with management over my going back to XP again. It just works better.
This is yet more evidence that NT4 really was the pinnacle. They watered it down in an attempt to create a Win2000 that could merge the NT and 9x families and allow games and such to disregard proper design and security. Supposedly NT4 was the clean rewrite that left legacy concerns behind. Then they corrupted it and are now attempting to do it AGAIN.
GO (around and around and around...)
Doug Glass writes:
So the "permanent fix" is not to create a solution but rather a built-in workaround that simply covers up the problem (after a few tries) and makes it look like all is well? I'd be ashamed to admit that.
- If you read the "Old New Thing" blog linked by another comment, then you'll get an idea of what MS is up against; they don't want to get blamed when crappy software breaks after MS changes how an interface is implemented. An example seen on more than one OS is that if you change memory management to trap buffer overruns, it'll break badly written code. [*cough* Mozilla on OpenBSD *cough*] Who gets blamed, and who is supposed to fix the problem? If MS is the one getting blamed because "it worked just fine under the previous version of Windows", and even better if the program is not being actively maintained by the ISV, then naturally MS will try to reduce the incidence of such failures.
They could of course take a principled stance; maybe throw up a dialog saying "sorry, this crappy program died because it was written by clueless programmers; you really ought to update it or give it the flick"... with a single button labelled "oh, OK then" to dismiss the dialog. How happy do you think the users, or indeed the ISVs would be with that approach? You have to keep all those "developers, developers, developers" writing stuff for your platform, even if some of them are not very good at coding.
I personally feel that virtualizing old environments for crappy software to run in is a more attractive option than keeping all the workarounds in the code base of the current OS, but maybe that's unrealistic...
especially @AC: Can't you troll elsewhere?
Only kids complain about their OS. If you don't like it, change it. There's no need to 'defend Microsoft' if someone doesn't like Windows.
Drama queens are upset no one is crying? Please give us a break! There should be moderation on these comments.
Err, wasn't the poorly written (and unintentional) "fault tolerant" heap in the Win9x codebase what got MS into such trouble when they decided we plebes might actually try using the Internet? Now they think they can do it right just because they're doing it intentionally? The hax0rs must be slobbering their keyboards over this...
I wish MS had a "device driver fault tolerant" feature in Windows 7. I have never seen so many Blue Screens as I have in the last few weeks trying to get Win7 up and running.
BTW, it is very common for Windows application programmers to ignore return codes from Windows API calls. This is commonly found even in sample code from Microsoft. Most API failures don't express themselves well when explained to users, which is why many programs simply don't work for no apparent reason, or issue a helpful "Critical Error" message, or just crash.
It's a reasonable point, but it's also a trap. Layering compatibility kludges on top of other compatibility kludges that were built on top of someone's really bad idea from a 1987 Friday afternoon special is how Microsoft wind up in the situation where they don't know what their own dependencies are and have to implement virtual DLLs (I mean... really?) in the first place. And it's also an excellent way to make your developer base so confused about how the hell everything's supposed to work (am I supposed to link this against the virtual DLL or the real DLL or the really-real DLL that lives in MinWin? Where the hell did I put my whisky?) that you can be assured of having lots more crappy code you have to hack stuff up to be 'compatible' with in the future. It's a never-ending, guaranteed-to-keep-getting-wackier cycle, in other words.
Of course, other models aren't perfect either. The Linux world prefers to keep its developers in a state of bafflement by making sure everything gets rewritten every two weeks just to keep everyone on their toes. But you really do have to question Microsoft's approach when they're getting to the points of frankly surreal behaviour documented here and on that excellent MSDN blog someone linked to. The longer they delay killing off the crazy, the more it's going to hurt when they have to.
into the same DLL for performance reasons"
As well as making it impossible to disentangle apps from core, like MS Explorer. And forcing any 3rd party apps to duplicate non-core functions if they want to replace functions they might be able to do better. Poor structure coupled with a regularly changed API makes a very effective way to limit competition.
"we don't really understand these dependencies"
Am I the only person here whose jaw dropped at that?
You have *all* the API calls, and their parameters, whether or not you share them with outside developers.
You have the source to the compilers which convert it, so if needed you can hack them.
You have a team of tool builders who can write tools to process this source or develop specific trace tools to write an API call map in real time.
So what is he saying?
They *can't* figure out what calls what when the system is running.
Or they *can't* figure out how to de-circularise this calling.
This would truly meet the definition for FUBAR (correct spelling)
The Windows kernel is *not* some multi-headed beast whose behaviour is unknown and unknowable and attacks without warning (except to users).
It's large and complex and badly structured by this description.
Saying "we don't understand it" reminds me of the decades-long protestations of tobacco companies that "We don't know if nicotine is addictive or carcinogenic. Honest." While they developed treatments to dial the level up or down in processing.
He scores points for honesty. No doubt this will be followed (in about 12 months) by the usual "Windows 8 is a major improvement on Windows 7" line.
...Tim Anderson for the best read on the Reg in months. As soon as I saw the name Mark Russinovich I knew I would be in for a good read.
...Microsoft for employing Mark, putting him in a position where he can do good work and giving him the authority to do so.
...Microsoft again for grasping the nettle.
...Mark Russinovich for telling it like it is. No bullshit, no spin.
...Sean Timarco Baggaley for the best commentard flame of the year.
It's just occurred to me that I'm actually interested to see an MS OS for the first time (in a VM naturally) . And it's not even at SP1 yet. I need a lie down.
I personally have no intention of migrating my huge IT estate to Windows 7. It's just pure bloatware, with an install footprint of 2.3GB!
It's good to see that HP still know what corporate customers want. All HP business desktops still come with an XP Pro image as the default install. I hope this continues.
Take a look at some companies with massive PC estates, such as Delta Airlines, or BA for example. What do they use on every unit worldwide?
NT4 SP6!
310MB install footprint, and 80MB RAM consumption at idle. Makes for a rapid and smooth OS on modern equipment. Hence why check-in kiosks run so nicely.
Fast becoming a victim of its own success. Windows is everywhere now; we are so dependent on it that there are hundreds working on "emulators", like Wine, Cedega, CrossOver, and VMs that will allow any OS to run Windows apps and Windows base installs. This is the perfect time for MS to stop fannying about pulling Windows into an ever deepening dead-end. MS can then side with one of these groups and develop something that will support old Windows apps. MS can then start doing an Apple (dump OS9, all hail OS X), turn around and say: in 3 years' time Windows as you know it will be dead, we will start again with something new and radical; we have a rock solid VM layer in there for your apps, but you need to start coding to this new OS standard. They won't; the potential upset to the businesses that feed the MS machine would be colossal, possibly fatal.
MS are starting to step out on to shaky ground if they continue to maintain this ancient codebase without some major, radical overhaul.
Sorry to say it, but Windows 7, with its 15 year-old codebase, is still a pig even when plastered with make-up, very expensive make-up!
Finally, someone with a sense of perspective; the worst software I've come across tends to be in-house applications that were thrown together by some office junior while on work placement five years ago, that inexplicably become vital to the operation of the company (although not critical enough to employ anyone to code it properly).
If this application breaks because of a new operating system, guess who gets the blame? The office junior? No, Microsoft.
I was worried (well, it wasn't exactly keeping me up at night, but you know what I mean) when MS bought Sysinternals. Russinovich was obviously a very talented guy and the products were great, but I thought he'd be buried somewhere deep in MS and forgotten about.
But it seems he is right up there, and doing lots of work on the core parts of Windows. Frankly, that's a GOOD thing and gives me hope for the future.
I saw two of Mark Russinovich's sessions at Tech-Ed. That man knows an insane amount about Windows! It's impressive not only how much he knows but that he still manages to get little digs at MS into his very well delivered talks.
Also, seeing him have problems with his VM and so run his 'tests' on the machine running the presentation, resulting in random blank error boxes throughout the rest of the session, was funny indeed.
Microsoft couldn't even if they wanted to. They're limited by their userbase, application and driver writers.
You can debate whether Vista was a good product (buggy on release, fixed later is my view), but the real whinges came from a failure of non-Microsoft people to do their job properly. Crap drivers, inability to run as non-superuser etc. The OS /was/ re-architected and people didn't like it.
Workarounds like FTH are a pragmatic solution for the userbase. Microsoft simply cannot take the OpenBSD attitude of enforcing a strict heap, especially since as soon as OpenBSD did so it broke a fair amount of software including some very old programs. OpenBSD is a great OS, but its target market is considerably different to Windows.
I would also note, before anyone gets too superior, that whilst Windows may be gnarly, even some of the more pared down Unixes are not entirely free from the malaise of unexpected dependencies. They may not have quite so many shims to support misbehaving applications, but there are odd or unexpected behaviours, the origins of which are pretty much lost in time.
I installed Windows 7 last night. It found almost all hardware (29160: use Vista drivers; Audigy: online drivers), including setting up my four monitors correctly. Finally, it also suspends correctly, which is something Vista never managed.
"It's as if MS are back where Apple were in the late 90's...They know the OS needs an overhaul, but are bodging the overhaul as well instead of re-architecturing the OS from the ground up."
What a pointless argument. Apple were a nobody with a nothing footprint - particularly in the business world. The effect of "re-architecturing" <shudder> was minimal because hardly anyone was (is) using it. The loss of backwards compatibility only affected a tiny number of people and they were mainly home users.
For MS to re-build Windows from the ground up and lose backwards compatibility would destroy the company. The reason people upgrade their Windows OS is because they perceive (rightly or wrongly) that they are gaining extra benefits/features whilst maintaining the set of applications that they already have. In other words, their past investment is not wiped out.
I know this is a hard concept to grasp for your average mac user but BUSINESSES invest millions in software for windows. They are not going to write that off overnight because MS want to re-build windows from the ground up for architectural reasons. The MS CEO who makes that call will get laughed at, then sacked.
I too have a lot of respect for Russinovich, the guy has proven time and time again to be awesome.
But I'm sorry to say that he works for "The Man" now: BEFORE he joined MS he could claim integrity for his technical claims, not any more.
As a MS employee he won't stand up for the public when some big corporation decides to cooperate with MS on the next commercial spyware/rootkit scheme like the Sony rootkit fiasco.
He will stand aside, or simply say and do what the man tells him to say or do.
Vista/7 has a mechanism, a type of restricted/reserved process, that in theory only MS can control; those processes oversee the DRM and other "undocumented" stuff.
And now that he's working for MS god knows what else he is cooperating on with MS, or what is he helping them to hide.
Am I being paranoid? Hell yes. I do not trust the "big", plain and simple. In this day and age, when anyone with a budget is making decisions in the name of my safety or well-being, all they try to do is provide themselves with the means to remain in power.
That leads to totalitarianism, and MS is like that, it's been proven again and again.
Just because I feel like bitching about something this morning I guess...
Like NT4 that much? Do you? Had to work on it much? Stable huh? When's the last time you booted an NT4 server that wasn't a fresh build and it didn't say "At least one service failed to run at startup"? Oh, I'll just put that missing .dll on my flash drive and... Oh, wait...
I do love when you boot a later version of Windows and it claims "Built on NT technology" Isn't that like entering your "PIN number"?
Oh well, if it weren't for computers and their many failings I wouldn't have a job that I can at least stand most of the time, so I guess I'll shut up now.
'So what is the point of UAC? "It is about one thing, which is about getting you guys to write your code so that it runs well as standard user".'
Ah. So if I, as an end-user, choose to disable UAC, it won't do me or my system any harm. Because, according to Microsoft itself, the purpose of UAC is not to confer any security benefits, but just to force developers to write better (and more secure) code.
Bill's fan club will probably hate me for this, but it has to be said:
1) Download a Linux Distro
2) Burn to CD/DVD (Mandriva Free is a DVD ISO)
3) Insert CD in CD drive
4) Reboot
5) When prompted, say "Yes, I do want to reformat my HDD"
Of course, it's not quite as simple as that. You'll probably want to backup your documents first. And in between steps 4 and 5:
4a) Realise your computer has ignored your CD and is starting to boot Windows.
4b) Reboot again
4c) Hit Delete like a mad thing to get into your BIOS
4d) Spend a merry 5 minutes working out where they've hidden the boot order screens.
4e) Amend your boot order.
4f) Press F10 (usually corresponds to save and exit)
4g) Wait for the computer to reboot and start loading the CD
4h) If it's a Live distro, double check that most of your hardware still works.
4i) Double click the button to install it to your computer.
But I'm not surprised at the Microsoft employee admitting they're not entirely sure of the dependencies any more. What almost certainly happens when they're building a new version of Windows is that they start with the existing version, then change / tweak bits as needed. And if the programmers don't document their code fully before seeking alternative employment, you're screwed. Heck, there's probably still some Win 95-era code hiding in Win 7... and maybe the occasional method or two from Win 3 or earlier...
"Memory footprint was reduced by up to 30 per cent"
Compared to what? Vista, XP, windows 3.1, DOS?
30% is relative; in itself it tells nothing if not stated compared to what.
Don't get me wrong; I think Mark ruleZ! Microsoft was lucky to snatch him from Winternals/Sysinternals. I am sure he mentioned what he was comparing and the statement is just taken out of context.
"the worst software I've come across tends to be in-house applications that were thrown together by some office junior while on work placement five years ago, that inexplicably become vital to the operation of the company (although not critical enough to employ anyone to code it properly)."
Very good point, sir. It's true that MS have probably cocked up by trying so hard to maintain backward compatibility over the years, but a chunk of the blame must be laid at the corporate IT world, who allow bag o'shite apps to slowly become mission critical, when originally they were coded to be stop-gap measures. With all of the Freeform Dynamics stuff on the Reg in the last couple of weeks about IT Governance fresh in my mind, you've highlighted a problem there that doesn't get talked about often enough: IT departments should have the balls to tell the business to take a running jump when the business comes knocking to demand a technical fix to a failure of management.
What's that? Your MS Access-based app runs poorly, and you absolutely cannot do without it? Stop using it, and go find a proper app that runs on a modern OS and migrate now. Don't postpone another six months, or to the start of the next financial year, 'cos the situation will just be worse then.
"If this application breaks because of a new operating system, guess who gets the blame? The office junior? No, Microsoft." It should be the director of the department responsible.
"We don't really understand those dependencies"
That's no surprise and refreshing honesty. I've seen far smaller projects than "Windows" get out of control, been there, done that, got the T-shirt, and lived to regret it ...
Someone thinks it would be good to add encryption into a simple module which reads config files when the line starts with a certain 'flag', and it's great: it works well, doesn't break any compatibility, adds security for any app which wants it, and everyone is happy — users, developers, customers, product reviewers, management.
Down the line, though, anyone who uses the config file module also has to include the encryption module and everything that it relies on (and there may be circular dependencies). Now try to work out exactly what has which dependencies and it's a nightmare, even more so trying to prune them down without breaking anything.
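To make the trap concrete, here's a minimal sketch of the scenario described above (all names hypothetical, nothing to do with the actual Windows code): the moment the "simple" config reader learns to handle an `encrypted:` flag, every component that reads config files now carries a dependency edge to the crypto code, whether it uses encryption or not.

```c
#include <string.h>

/* Stand-in for the crypto module that everyone now transitively
 * depends on. Toy "decryption" only: ROT13 on lowercase letters. */
static void crypto_decrypt(char *buf) {
    for (char *p = buf; *p; ++p)
        if (*p >= 'a' && *p <= 'z')
            *p = (char)('a' + (*p - 'a' + 13) % 26);
}

/* The once-simple config reader. Handling one optional flag has
 * coupled it (and all of its callers) to crypto_decrypt(). */
void read_config_line(char *line) {
    const char *flag = "encrypted:";
    size_t flen = strlen(flag);
    if (strncmp(line, flag, flen) == 0) {
        /* strip the flag prefix, then decrypt in place */
        memmove(line, line + flen, strlen(line + flen) + 1);
        crypto_decrypt(line);   /* <-- the new dependency edge */
    }
}
```

Multiply that by thousands of modules over twenty years and "we don't really understand those dependencies" stops sounding shocking.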
It's easy to say that Microsoft shouldn't have got themselves into this mess (and I'm sure they'd agree) but let's be more honest and realistic than that. Seemingly minor and inconsequential changes can have massive impact down the road, and such problems can afflict any software development. Microsoft are also at a disadvantage because users expect backwards compatibility, so it isn't just a job of re-doing it properly but often a necessity of 'bloat' in order to maintain compatibility. It's a long, hard slog and an almost impossible 'win-lose' situation.
Why is everyone so very shocked by this statement? I don't fully understand the entirety of the C# code base we have here (I know the majority of it) and people call me the systems architect.
Our code base is just an app that runs on WinCE, I would imagine that Windows itself is much, much, much more complex than this.
It would probably be a hell of a mindfuck to place that entire model into your head. Any automated diagram you produce would be insanely hard to read as well. Have you people developed large and complex software? It's hard you know, _really_ hard.
The problem grows exponentially and Windows is huge. So I'm really not surprised. I heard when building Vista each developer was about 6 branches away from the trunk, more or less, at a minimum. It would take about 6 months for your changes to hit the trunk.
Linux/Unix doesn't have economic pressures; Windows historically has. Therefore there are a number of WTFs in the Windows code base. This is what happens when you need to ship by a deadline and on a budget.
"User mode crashes? How the hell do you crash an OS from user mode?"
Yes, Dazed and Confused indeed. Who said the OS crashed? How often do you see a Windows box crash these days? I can't remember the last time and I run mine 24x7. He was talking about APPLICATION crashes. Poorly written apps that fall over and die because they're a stinking pile of crap.
What happens when such an App exists? User complains to software manufacturer, who says "It's Microsoft's fault". Enough bad coders out there make MS the common link. So MS have decided enough is enough, if they detect you're incapable of doing your own memory management, they'll do it for you.
Maybe if they get very good at detecting this stuff, they can just print a big error message saying "software manufacturer A is crap, we suggest you get a refund. Here is a list of their competitors" and remove FTH
Talk about missing the point, the lot of you. The fact that it automatically stops nannying your code when run with a debugger is the best part. Maybe if you had actually tested your code before deploying it then complaining that windows is making it crash less wouldn't be an issue.
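For anyone wondering what the Fault Tolerant Heap actually does to "nanny" crashing apps, one of its mitigations is simply over-allocating, so that small heap overruns land in slack space instead of trampling the neighbouring object. A rough illustrative sketch (this is not the real FTH implementation; `fth_malloc` and the pad size are made up for illustration):

```c
#include <stdlib.h>
#include <string.h>

#define FTH_PAD 16  /* hypothetical pad size */

/* Allocate the requested size plus a pad of slack bytes, so an
 * off-by-one write by a buggy caller stays inside owned memory
 * instead of corrupting the next allocation. */
void *fth_malloc(size_t requested) {
    void *p = malloc(requested + FTH_PAD);
    if (p)
        memset(p, 0, requested + FTH_PAD);
    return p;
}
```

So when an app asks for 4 bytes and then `strcpy`s a 4-character string into it (5 bytes with the terminator — the classic off-by-one), the stray byte lands in the pad and nothing else is harmed. The bug is still there; the user just never sees it.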
Right on. And Mr. "Dazed and Confused" isn't even the only one. It seems many of these rabid anti-Windows fanatics haven't actually used any Windows after 98 and still believe that Windows is prone to crashing.
While we're sticking to prejudices that were true 10 years ago, I guess it will be fine for me to keep saying that Linux is exceedingly hard to install and configure, and that Apple computers are severely lacking in apps?
...because it's been done before. The MS-DOS-derived Windows 9x/ME series was a very different beast to Windows NT and 2000 under the hood. So it IS possible to do a rewrite of the kernel and its supporting libraries. The trick is to retain backwards-compatible APIs. What those APIs actually do needn't be the same under hood, as long as the end results are.
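That point — same API contract, different machinery underneath — is worth sketching. A hypothetical example (the function and its legacy quirk are invented for illustration): as long as every observable behaviour of the old contract is preserved, including its warts, the implementation behind it can be replaced wholesale without recompiling callers.

```c
#include <stddef.h>

/* Old contract: return the length of s, or 0 for NULL.
 * (Treating NULL as length 0 is a legacy quirk callers rely on,
 * so the rewrite must keep it.) */
size_t legacy_strlen_compat(const char *s) {
    if (s == NULL)
        return 0;                 /* legacy behaviour preserved */
    /* Brand-new implementation underneath — could be vectorised,
     * could be anything, as long as the results match the old one. */
    const char *p = s;
    while (*p)
        ++p;
    return (size_t)(p - s);
}
```

Scale that discipline up to a few thousand Win32 entry points and you have the (enormous) job a ground-up rewrite would entail.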
(Apple isn't a good example; they sell to the consumer sector, not corporates, so they had far fewer issues with legacy apps.)
As others have pointed out, running XP in a VM is likely to be fine for most corporates. MS could easily include a suitable VM in some of their various Corporate Editions of New Windows. (It's probably not worth including in consumer editions.) This frees MS to make the bold choice of building a brand new OS from the ground up.
And yes, we really DO need an alternative to OS X, *BSD and Linux. Those OS families are fundamentally UNIX variants and have 30-odd years of legacy cruft and design in them too. This doesn't mean they're unstable or bad, but UNIX's design heritage inherently limits the evolution of software design and development, not to mention UIs. (UNIX was designed in the age of punched cards, paper tape and big, reel-to-reel magnetic tapes. User interaction in applications was minimal at best.)
Windows isn't much younger, and the less said about GNU's Hurd project, the better. We need fresh approaches better suited to the 21st Century's needs.
posted by Joe User - Russinovich admitted: "We don't really understand those dependencies".
If Russinovich is finding it impenetrable, then Microsoft have really screwed the pooch.
Windows 8 needs to be a properly engineered, ground-up redesign. Anything else is unacceptable.
Win for Russinovich, his honesty is admirable, fail for MS.
If you've ever seen a screenshot of Windows Server 2008 Core, you'll realize it's *not* GUI-free at all! - they've just stripped out the Explorer shell and *most* of the GUI apps, but kept things like Notepad and Task Manager. Pretty pointless really.
I love these comments! Great entertaining stuff: thank you all.
I've been doing 'professional' software development since 1997 and I don't think I fully understand the dependencies of a C/C++ 'Hello World' program. Most coders I have worked with do not even consider coupling never mind dependencies for each and every context of the problem / solution domain that they are dealing with.
I feel that MS has been afflicted with the same disease as almost every other company that develops software. There is a saying in industry, and especially in software, that goes like this: 'on time, within budget, good quality; choose any two'. Well, I think the disease is that sales and marketing are just choosing on time — their time. It is their time objectives that have priority. They must have their bonuses.
I like honesty, don't you? Russinovich has done good. I'll listen to him but I won't listen to Ballmer.
Two points really stick out here.
Firstly, there are apps out there that wouldn't have worked if the version had been 7. WTF?! For twenty years, each new OS from MS has been plagued by UTTER FUCKWITS who can't write >= instead of ==. Note the present tense here. These apps would have broken *before* version 6 if it was just a matter of linking in old code from somewhere. Someone is actively writing *new* code that commits this error. And if Microsoft do the right thing and let the app do the wrong thing, Microsoft get the blame.
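The >= vs == bug described above, sketched in a few lines (version numbers illustrative — Vista/2008 report major version 6, Windows 7 reports 6.1 partly because of exactly this class of app):

```c
/* An app that gates on an exact major version breaks the day the
 * next OS ships, even though nothing it needs has changed. */
int app_supported_broken(int os_major) {
    return os_major == 6;     /* fails on version 7 for no reason */
}

/* The correct check: require at least the version you need. */
int app_supported_fixed(int os_major) {
    return os_major >= 6;     /* keeps working on newer versions */
}
```

One character of difference, twenty years of compatibility shims.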
Secondly, that "developer concern" that the fault tolerant heap might hide bugs. Er, yes, that's the point. Microsoft *want* to hide bugs (added by stupid developers) from innocent end-users. I'm sure end-users will applaud MS for this. If "developers" are "concerned" about such a plan, perhaps they could pull their fingers out and start catching buffer overruns *before* they ship. The tools have been part of your IDE for a decade or more. Try using them, you clueless prat, and stop criticising MS for protecting the end-user against your moronic quality assurance.
Much as I hate Microsoft, I hate FAR, FAR more the idiot developers who pour bugs into the ecosystem and force MS to make each new version of Windows even slower than the last. (It's not like MS need the help guys.)
"Rewriting Windows is possible ...because it's been done before. The MS-DOS-derived Windows 9x/ME series was a very different beast to Windows NT and 2000 under the hood"
Indeed, they were very different. Which is why they ditched one and extended the other. They didn't drop everything and start again. They extended the NT/2K codebase. Hence the versions - Win 2K built on kernel 5. XP was 5.1.
To throw everything out and start from scratch would take YEARS, and they'd run out of money before it finished. What they've done over the releases to date is tightened up the core - aka the kernel. Get that stable and it shouldn't matter what fails on top of it. They're now extending that to tighten up "the experience". They could have done it the Apple way - the iPhone's version of FTH is "don't let shoddy stuff run at all" aka, the App Store.
Microsoft should make the next version of Windows free (as in beer, see icon). Then they should charge per shim.
Customers would then have a real incentive to bash ISVs over the head about quality. The worst offenders appear to be in-house applications, but corporate customers have the deepest pockets, so it serves them right. The average Joe who only wants to use a browser, email and media player wouldn't have to pay a penny.
MS get the advertising benefit of rock-bottom pricing, plus a guaranteed revenue stream from all the dickheads who have caused them so much grief for the past couple of decades. What's not to like?
WTF - I have 11 million lines of code, spread across 3 products, each with a differing architecture. I have no design documentation, but they were built with good architectural and software principles. I manage to support (and bug fix where required) these by myself because they were properly written. Mind you three years of doing this has screwed my brain a bit!
>>Like NT4 that much? Do you? Had to work on it much? Stable huh? When's the last time you booted an NT4 server that wasn't a fresh build and it didn't say "At least one service failed to run at startup"?
It's been about a month. And before that it was about 15 months. Why? Because my NT servers are booted less than once per year. They are stable; why would a DLL go missing? Why would I want to use a USB drive? Ever consider that allowing USB drives is WHY your machines are unstable? Convenience is NOT the number one priority.
Great, you now have an application that behaves differently when run by a user/tester to when you are running it in the debugger.
I see at least two fundamental problems with this:
1. testers/users may initially see the application crashing, but when they try and reproduce it, the issue magically disappears -- great for consistency and reproducibility!
2. when debugging the application to investigate a different issue, the application starts randomly breaking.
Welcome to the Microsoft uncertainty principle!
Here's another question -- what happens when drivers (e.g. NVidia) start crashing, where does FTH apply then (to the kernel? on kernel memory?).
"I have no design documentation, but they were built with good architectural and software principles."
Wouldn't software principles include design documentation? It does in every software development I've been involved in. Or did you design it as you wrote it?
Either way - one man supporting one man's code is not impressive.
If the OS detects that a program keeps failing it should simply delete it, thus serving 2 purposes.
1-Automatically Removing crap software
2-Educating the programmer, who will very soon get hacked off with having to rewrite and may start to get it correct the first time, or even give up.
Unless, of course, they are MS programs that keep failing!
Meanwhile with all this kernel and user stuff interacting incorrectly, they should have invented some sort of access control level (ACL) and combined that with an access control register (ACR) to see if the code currently has the correct privileges to execute.
Even simpler have a flag that states code or data and make all data unexecutable.
I'm pretty sure the patent is about to run out on good operating system design, so MS should be able to reinvent it soon.
Windows Embedded is already everything that MinWin is supposed to be (like running in 40 MB of RAM, _with_ GUI). However, Windows Embedded is still based on XP. MinWin is simply the attempt to undo the damage of the Vista development era (where software quality took a nosedive) and make feasible a Vista/7-based version of Embedded.
This is the angle the article missed.
Unix may be 40 now, but it remains a shining light of software. Just finished my Lions' Commentary on Unix v6 from 1976, which is 4 years before I was born. Amazing book! What is amazing is how much hasn't changed. So many of the ideas have survived because they are good. Some retrofitting of things that came after the original design wasn't done in a way that fitted, but when the design was truly modernised, i.e. Plan 9, it didn't take off. Though some ideas did make it into other Unixes.
Windows belongs in the bin. It's too complicated, too tangled, too limited, too grown by sales teams, too closed. As an OS, it's of no interest to me anymore.
Sorry mittfh, just wanted to explain why I voted you down.
(1) Installing any OS is never that simple, be it Windows, Mac or Linux
(2) You ignore the 'applications and data' side of computing. People use computers as a means to an end, not as something to show off their OS. By ignoring the realities of computing, not only is your entire post worthless but you come across as naive.
(Ubuntu + XP VM under VirtualBox user)
Biting the hand that feeds IT © 1998–2019