Memo to Microsoft: Windows 10 is broken, and the fixes can't wait

Windows isn't working – and Microsoft urgently needs to change how it develops the platform, and jettison three filthy practices it has acquired in recent years. In 2014 Microsoft decided it could do a better job if it discarded a lot of software testers. This bright new dawn was lauded at the time by Peter Bright at Ars …

Silver badge
Windows

Re: Windows is on a vicious downward spiral

Well, yes, they always had issues. But after XP & Server 2003 (or in parallel with them), when Vista development started, they totally lost it. Too many stupid changes, too much emphasis on cosmetics, and a FAR, FAR too big team. An auditorium just for the team leaders? When I read that (2004?) I thought, "No-one can manage a programming effort that size, certainly not MS."

21
0
Thumb Up

Hello:

"Did Microsoft ever really produce reliable software? If they did, I don't seem to remember it. And, I've worked in IT for 27+ years".

We all know the answer to this so no, it is not a revelation.

I cannot say that I've actually worked in IT, but I've worked with MS software (at home and at my various professional posts) from 1994 until three or four years ago, when I finally decided to drop it and fully enter the Linux world.

I really have to make an effort to try to recall any long span of time without having some sort of issue with MS software. I recall 3.11 being relatively easy to get around, but the most troublesome and hectic was W95. The least troublesome was possibly XP SP3, but that could also be chalked up to accumulated experience.

Just my $0.02.

14
3

1) Did Microsoft ever really produce reliable software?

Yes, all through the 1980s and 1990s, and into the early 2000s. MS-DOS was terrific, and so was Windows 3.0, for its time. Office was always far more reliable than its competitors. Windows NT, starting in the mid-1990s, was a miracle of stability when compared to most anything on the desktop. (OS/2 was damn' good too, even if it turned out to be a blind alley.) Windows 2000 was also brilliant, and XP was almost the same OS, with a consumer facelift. Lots more examples - ambitious, leading-edge software that no other company could have pulled off as well as Microsoft did.

2) The "vicious downward spiral" started about 20 years ago.

True. But MS really fell off a cliff about the time Bill Gates found that nobody was reading his memos any more, and left to go and cure diseases.

14
4
Silver badge

Re: "Office was always far more reliable"

Except when the newer version couldn't read files created with the previous version.

Or corrupted them when saving.

13
0

Re: "Office was always far more reliable"

@Pascal

True, but for sheer bloody-mindedness, the refusal of o365 apps to move a cursor, respond to a click, or accept typed input through a human interface device takes the biscuit. o365 has to be the worst version of an office application for actual working since, hmmm, DW370...

4
0
FAIL

MS-DOS was terrific

Come on !!!!!!!!!!!!!!!!!

BSOD

5
3

1) Did Microsoft ever really produce reliable software? If they did, I don't seem to remember it. And, I've worked in IT for 27+ years.

Been here since Windows 3.1 and I do not recall an update ever deleting my files from my computer before now. I would call that a catastrophic evolution in unreliability, IMHO. And yes, everyone should have backups, but they don't - one job I had was to rescue wedding photos off a dead laptop because the owner had no hard copies, nor the thought to save them elsewhere.

Even after a non-booting update you can just reinstall Windows - a pain, but everything is still there.

5
0
Silver badge

I think nostalgia's getting the better of you. Don't remember the joys of MS-DOS configuration, Windows 3.1 and Word blowing up and taking your document with it, Windows 95 crashing when you plugged a USB device in, or XP using much more memory than 2000 for some unfathomable reason (themable GUI?) and driver BSODs? I do.

In fact, out of the early lot, I'd even dare to say that Windows ME was probably the most reliable for me. I might have had the one computer configuration it worked well on.

However at least you could control what happened. Windows 10 just updates and takes your work with it and spaffs adverts in your face.

9
1
Silver badge

DOS was mostly reliable, although DOS 4.01 used too much memory, and the shell was useless. That's 'reliable provided you only want to run one program without TSRs', obviously. The problem was that most people did want to add networking and other devices, and then there was insufficient memory left over to run your app. I don't miss those days, although installing DOS on a modern system is far less painful due to the availability of packet drivers and CD/mouse/peripheral drivers that use minimal amounts of memory.

Otherwise, NT 3.51 was really solid. I'm trying to remember which version of NT had a disk corruption bug on release that was fixed *very* quickly with a service pack, and think it was 4.0 - but I might be wrong.

2000 was pretty decent too, solid, added USB, and modern DirectX was useful (if not for servers).

To be fair to Microsoft, they've mostly fixed Windows over time. NT4 was great after SP3. XP was great after SP2. Vista wasn't perfect, but SP2 fixed a lot. W7 was fine after SP1. Can't remember about 8. 10 has been a bit annoying at work but mostly alright, hibernation issues were fixed in Fall Creators Update. At home the sodding thing claims my graphics cards are broken - they work fine under 8, provided I don't update NVidia drivers beyond a certain release.

The problem here is that 10 isn't given the chance to bed down. For operating systems with a huge amount of backwards compatibility such as this, I'm not in favour of regular updates.

2
2
Anonymous Coward

10 has been a bit annoying at work but mostly alright,

The problem with 10 is more that each different build has had its own quirks and things that either don't work properly or work only with workarounds specific to that build. The complexity of looking after more than a couple of builds at once adds up.

Plus, each build is a mixture of things fixed over the previous one, and new problems introduced. You can't get a version that doesn't have some broken or poorly tested parts. Even LTSB/LTSC, which MS are going to great lengths to discourage people from using.

2
0
Silver badge

Re: MS-DOS was terrific

Come on !!!!!!!!!!!!!!!!!

BSOD

WTF? MS-DOS never did BSOD, it was a Windows thing only.

8
0
Silver badge

"Did Microsoft ever really produce reliable software?"

No, they didn't, but after release they had the time to clean up most of their mess so if you just waited long enough, you could get reasonably reliable software from them. Now that they're doing rapid release, there is no time for them to clean up their mess, and you can no longer get reasonably reliable software from them.

So, in short, they went from "not great" to "terrible".

2
1
Silver badge

Windows 95 crashing when you plugged a USB device in,

Never had that experience! Back then we called it "useless serial bus" because there weren't any devices to plug into it that were readily available. The PCs sold by the outfit I was with didn't have USB ports even though the motherboards did support it, but we only had one person ask for USB in all my time there, and he just wanted the board along with the connector (which we did not have), not a full PC.

Windows 95, as originally released, didn't even have USB support (OSR 2.1 added it). At the time I'm referring to here, or the start of the time period at least, OSR2 (95 "B" as it was sometimes called) was brand new for OEM distribution.

1
0
Bronze badge

Re: MS-DOS was terrific

That's true. It would just lock silently.

"I created control-alt-deleted, but I gotta hand it to you--you made it famous!"

0
0
Silver badge

Re: "Office was always far more reliable"

"Except when the newer version couldn't read files created with the previous version."

And completely hung the machine to the point of needing a H/W reset or reboot whilst trying to do so.

0
0
Silver badge

> Yes, all through the 1980s and 1990s, and into the early 2000s. MS-DOS was terrific,

I always found that DRI operating systems were far better. Early MS-DOS systems would corrupt diskettes if they were swapped. CP/M and DR-DOS would actually check. I used DRI's MP/M and Concurrent-xxx which supported hard drives and pre-emptive multi-tasking and multi-user when MS-DOS still only did floppy disks.

MS-DOS was always a poor performer, which is why most successful software bypassed it for everything except the file system, making direct BIOS calls or even bit-banging the video cards.

While MS-DOS 5 was almost up to what DR-DOS 5 did, it arrived 20 months later. DR-DOS 6 then brought better memory management and even task switching, while MS-DOS took almost another year to catch up.

> Windows NT, starting in the mid-1990s, was a miracle of stability when compared to most anything on the desktop.

Windows NT was certainly more stable than 3.x or 95. 95 had a bug for two years where, if the internal millisecond clock overflowed after 49.7 days (2^32 milliseconds), the system locked up. This was not reported for two years because no one had reached that point.
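For scale, that wrap-around is simple arithmetic - a quick sketch in Python of the failure mode (a hypothetical illustration, not Windows source code):

    MS_PER_DAY = 1000 * 60 * 60 * 24

    # A 32-bit millisecond uptime counter wraps after 2**32 ms.
    print(2**32 / MS_PER_DAY)            # ~49.71 days

    # Naive comparisons of raw tick values break at the wrap:
    now = 2**32 - 5_000                  # 5 seconds before wrap-around
    deadline = (now + 10_000) % 2**32    # "10 seconds from now", post-wrap
    print(deadline > now)                # False - the deadline looks ancient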

> ambitious, leading-edge software that no other company could have pulled off as well as Microsoft did.

Most of which was bought in from other companies, or the company itself was bought. Even NT was a bought-in project that MS eventually had to settle with DEC over, paying them a reported $100 million.

1
0
Silver badge

> Office was always far more reliable than its competitors.

But when it completely screwed up a file, it could be recovered by using LibreOffice.

3
0

No, Windows 3.0 was not terrific. On a computer with Windows 3.0, Microsoft Word and Microsoft Excel (I forget which versions) would get severe unrecoverable application errors daily, causing a loss of data and a reboot. Win 3.1 was much better. NT was the first Microsoft OS that wasn't a toy operating system and had any level of security built into it.

2
0
Silver badge

Re: MS-DOS was terrific

MSDOS crashed and spontaneously rebooted, usually due to weird combinations of TSRs, other drivers, and hardware issues such as IRQ conflicts.

Let's not romanticise the past. A basic DOS system only reading and writing to a disk and using standard VGA is pretty easy to get working. The problem is, everyone wanted printer drivers, networking, accelerated graphics - in short, a protected-memory multitasking operating system.

0
0
Silver badge

"Quality" is a structural attribute, not a bolt-on

Therefore prodding a "black box" after creation, at each iteration, is not going to be effective in converging on a satisfactory product.

23
0
Silver badge

Re: "Quality" is a structural attribute, not a bolt-on

I think we all know the cant of the Agile Brigade.

You still need QA at the far end.

And the more OS-y your product is, the less relevant the Agile part of your product development strategy.

11
1
Silver badge

Re: "Quality" is a structural attribute, not a bolt-on

And the more OS-y your product is, the less relevant the Agile part of your product development strategy.

Absolutely. Also if a piece of software is your business system, you certainly don't want to be mucking about with that. Change has to be done very carefully.

I'm sure that's why you see, in airlines, retailers, etc., a lot of text-mode software that originally ran on 3270 terminals. It's there, it does its job, it never goes wrong. If it did break, the business would be dead in days at most.

Not a place for agile development.

Agile, if done properly, is just another way of discovering what someone's requirements are. However it is often abused as a way of taking short cuts in development. Shortcuts lead to failure.

14
0
Silver badge

Re: "Quality" is a structural attribute, not a bolt-on

I'm going to disagree with you on that. There is nothing fundamentally wrong with agile development. It does not permit untested work. TDD (which, if you're doing agile correctly, is kinda mandatory) means that you should have your tests written before you start copy-pasting from stackoverflow... er, writing code. If you are doing things properly, you are getting input from QA before you design the tests in the first place. It was never about removing QA from the process. Everything about it is to remove the distance between the subject matter expert and the code monkey so that misunderstandings can be discovered and rectified much sooner.
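To illustrate that red/green loop, a minimal sketch using Python's stdlib unittest (the function and its rules are hypothetical, not from any real codebase - the point is only that the tests exist before the body does):

    import unittest

    def normalise_postcode(raw: str) -> str:
        """Collapse whitespace and upper-case a UK-style postcode."""
        return " ".join(raw.split()).upper()

    class TestNormalisePostcode(unittest.TestCase):
        # These were written first and failed (red) until the one-line
        # implementation above made them pass (green).
        def test_strips_extra_whitespace(self):
            self.assertEqual(normalise_postcode("  sw1a  1aa "), "SW1A 1AA")

        def test_already_clean_input_unchanged(self):
            self.assertEqual(normalise_postcode("SW1A 1AA"), "SW1A 1AA")

    if __name__ == "__main__":
        unittest.main()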

Of course, you are probably thinking about that other definition of agile favoured by PHBs the world over, where any form of analysis is disregarded because agile, any form of planning ahead can be forgotten because agile, and any form of QA can be ignored because we once showed the devs how to install nunit. That is of course bollocks.

Quality is derived from culture. You need a culture that is ashamed of breaking things, ashamed when its test case design fails to detect a breaking change, proud of coverage (real coverage, not numbers achieved by fooling the tools), that hates when something slips through to QA and really hates when a customer suffers a bug.

Companies that only value story point velocity, that don't invest in reducing technical debt, inevitably find themselves producing code which is a quick hack around some workaround on a half-designed proof of concept, which resists even basic enhancements, which then hurts the velocity, so there's no time for paying back that technical debt and the QA cycle needs to be cut to make the deadline. Sigh, I guess some people cannot learn.

11
3
Silver badge

Re: "Quality" is a structural attribute, not a bolt-on

You're not wrong, but the problem is definition vs usage.

It's like 'loose' and 'lose' or 'less' and 'fewer' in English. There are differences, they are defined, but if people continually use the words incorrectly they become accepted usage.

Continual abuse of the Agile philosophy is making it toxic. As the unit tests are generally written by developers, there's far too much scope for taking shortcuts. At least a more classical development model formalises the specification and testing. Yes, it still frequently goes awry, but if the specification is incomplete it's obvious, and if QA is reduced, there's no 'agile' excuse to hide behind.

3
1
Silver badge
Meh

Re: "Quality" is a structural attribute, not a bolt-on

"As the unit tests are generally written by developers"

about that. 'unit tests' aren't necessary in the majority of cases, especially when they're TRIVIAL. If the system is properly designed, i.e. broken into functional units correctly, you can test the overall functionality of the whole on a known dataset. This also means properly writing the thing in the FIRST place, to avoid all of the usual problems (memory leaks, buffer overruns, mishandled erroneous data, use-after-free pointers, etc.).

In other words, "unit test" for every trivial freaking thing is what JUNIOR coders do, because they can't see past their own noses and look at the BIG picture.

(as for making up some kind of test data to verify an algorithm, that's not the same... you use that to WRITE the algorithm, and once done, you NEVER! HAVE! TO! TEST! IT! AGAIN!!!)
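For what it's worth, the "known dataset" style being advocated looks something like this as a Python sketch (the file names and the pipeline function are hypothetical stand-ins):

    import json

    def run_pipeline(records):
        # Stand-in for "the whole": several functional units composed
        # end to end.
        cleaned = [r.strip().lower() for r in records]
        return sorted(set(cleaned))

    def test_pipeline_against_golden_dataset():
        with open("testdata/input.json") as f:
            records = json.load(f)
        with open("testdata/golden_output.json") as f:
            expected = json.load(f)
        # One coarse-grained check over the composed system, instead
        # of a unit test per trivial function.
        assert run_pipeline(records) == expected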

1
5

Re: "Quality" is a structural attribute, not a bolt-on

I can see your point with agile, but I'd disagree about companies using old code. History shows us that this code wasn't necessarily written and tested as thoroughly as possible using wonderful coding practices that we've sadly lost - in practice, for most cases, it wasn't. The code that's still running is probably quite solid given the decades of testing in the field that it has received, but otherwise it's code that can fail as much as any other code. Companies still use it because they have a fear of doing something differently, and because why spend money on making the code fast, modern, and perhaps more full-featured when you can instead spend it on the people keeping old hardware (and virtualised old hardware) functioning?

If they had to change their business practice and modify their software, the changes are much more likely to be written, tested, and put into operation quickly if the codebase is modern. With an old coding system, you need developers familiar with it (fewer people) and ideally people not only familiar in the sense of "I worked on this in the 80's and 90's" but also in the sense of "I can still remember off the top of my head what that hex error code means". Meanwhile, a modern codebase can run on a lot more stuff and can be repaired should it break without needing specialist knowledge.

0
0
Bronze badge

Re: "Quality" is a structural attribute, not a bolt-on

Devs properly trained in TDD produce working code faster than those without the training. You don't unit test constants, nor Java setters & getters. You do write a test before you implement a branch in code execution.

For simple algorithms, yes, it is possible to create a golden dataset, and when the code passes it, it passes. Oh, wait. That's a different form of TDD!

But for complicated algorithms (and we are all guilty), state explosion makes this impossible. Worse, unless you have an advanced degree in mathematics, or a first-class undergraduate degree, you're going to miss things when you write tests before or after. (If you do, you are still going to miss things, but your training will keep you going back enough that your chance of committing bad code goes way, way down.)
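One way to fight that state explosion is to let a machine generate the cases you didn't think of. A sketch using the Python hypothesis library (assumed installed via pip; the clamp() function is a made-up example, not from any real project):

    from hypothesis import given
    from hypothesis import strategies as st

    def clamp(value, lo, hi):
        """Constrain value to the closed interval [lo, hi]."""
        return max(lo, min(value, hi))

    @given(st.integers(), st.integers(), st.integers())
    def test_clamp_stays_in_range(value, lo, hi):
        if lo > hi:      # the precondition a hand-written test forgets to probe
            lo, hi = hi, lo
        assert lo <= clamp(value, lo, hi) <= hi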

0
0
Silver badge

Re: "Quality" is a structural attribute, not a bolt-on

"changes are much more likely to be written, tested, and put into operation quickly if the codebase is modern."

Define modern. This sounds very much like the usual rant about having to periodically update legacy S/W, i.e. the stuff that's earning the business's income.

And you shouldn't need to remember what that hex code means, because there should be documentation, preferably as an old-fashioned comment in the source, to tell you.

1
0
Silver badge

Re: "Quality" is a structural attribute, not a bolt-on

@Adam 1,

Oh, Agile, if done properly, is fine, but it rarely is. The money men don't see the value of rigour, regardless of where in a programme it crops up. Agile too readily gives them an excuse for dispensing with rigour altogether.

Also one of the tenets of Agile seems to be to embrace failure, let it happen, deal with it when it occurs. That might be fine for a Web IM service, where a day or so offline won't matter, but not elsewhere.

0
0
Silver badge
Boffin

Re: "Quality" is a structural attribute, not a bolt-on

"you use that to WRITE the algorithm, and once done, you NEVER! HAVE! TO! TEST! IT! AGAIN!!!

... until your business requirements change and you have to modify said algorithm so that it returns a different result set under circumstances x, y and z, but otherwise must return exactly the same data as it has been doing until now.

That is the value of full and proper unit test coverage - it's not about making sure your code works right now, but ensuring that it continues to work as expected after modification. Otherwise, the chances are you're just playing whack-a-mole with bugs.

The idea that "just writing it right in the first place" is an archaic throw-back to the pre-internet era when systems were monolithic and updated once in a blue moon by a single big-bang operation. Businesses now expect new functionality to be delivered rapidly and seamlessly, and as developers, we have to adapt or die.

The irony is that until a few years ago, I used to think along the same lines - "what's the point of unit tests? I've manually tested my code and it works!" But now, when faced with modifying a chunk of code that was written a year ago by someone who is no longer with the company, finding a good suite of unit tests that document how it's supposed to work and catch what I might accidentally break is not only reassuring, but also vastly increases the speed I can work at.
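A concrete sketch of that situation in Python (the discount rules are hypothetical): the old tests pin existing behaviour while the new rule for "circumstances x, y and z" is added.

    def discount(order_total: float, is_member: bool) -> float:
        """Return the discount fraction for an order."""
        if is_member and order_total >= 100:
            return 0.10      # the new requirement
        if order_total >= 100:
            return 0.05      # pre-existing rule - must not change
        return 0.0

    # Written before the member rule existed; still passing proves the
    # modification didn't regress the old behaviour.
    def test_big_orders_keep_their_old_discount():
        assert discount(150.0, is_member=False) == 0.05

    def test_small_orders_still_get_nothing():
        assert discount(20.0, is_member=False) == 0.0

    # New test for the new behaviour.
    def test_new_member_rule():
        assert discount(150.0, is_member=True) == 0.10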

1
0

Re: "Quality" is a structural attribute, not a bolt-on

Me: "changes are much more likely to be written, tested, and put into operation quickly if the codebase is modern."

Response: "Define modern. This sounds very much like the usual rant about having to periodically update legacy S/W i.e. the stuff that's earning the business's income."

"Modern" refers to the original comment about businesses continuing to run code written in the 1980s because, as the comment claimed, that software was just written really well then, so they have no problems. Not only do I not believe the software was wonderful then - it probably had plenty of bugs that had to be removed back in the day - but it is also difficult to change. Code written in the 1980s will still run, but only on legacy hardware or operating systems, which imposes another cost on the business. If you need to update how the software works, your options are:

1. The codebase was written in the 1980s in a 1980s language. In order to modify it, you need people familiar with that language. This is not a ton of people. Many of the people who are familiar with it, looking for a job, and willing to work for you have been writing in different things. I know, for example, a person who wrote assembly for various Cray supercomputers. I don't think he would know how to do that now, though he'd be faster to relearn it than would I.

2. The code is written in a modern language. This may be painful to port from the original code, but it is now easier for it to be modified. You still have to hire good programmers, and you shouldn't do it as cheaply as possible because you'll end up with terrible bugs. However, when you need something updated, it's much easier to find a competent person if the language is more modern. If I found a bug that needed to be fixed quickly, I'd much rather the code be written in C, Python, Java, or most other modern languages than Cobol, because I know I can find someone to write in those languages. I don't know how to find a competent Cobol person, let alone assembly for $random_processor_from_three_decades_ago.

Therefore, I would disagree with the assertion that companies should continue running old code because people just don't write code the way they used to. I think that's a dangerous course of action most of the time, and although many modern companies forget important testing practices and the like, there are others that still produce reliable code.

0
0
Anonymous Coward

I think it's worth remembering ...

that history is littered with companies that seemed so big they'd always be there.

TWA and Pan-Am to name two (because Arthur C Clarke thought Pan-Am would still be around in 2001).

In the UK, GEC might stir some memories ...

36
0
Silver badge

Re: I think it's worth remembering ...

"n the UK, GEC might stirs some memories"

As does Ferranti and a few more.

Is it possible that rather than businesses becoming too big to fail they actually become too big to not fail?

21
0
Silver badge

Re: I think it's worth remembering ...

And what was the other company that evaporated years ago? I can't remember its name but we all said that the initials stood for "It Can't Last" ...

14
0
Silver badge

Re: I think it's worth remembering ...

AC - Many companies have disappeared because of major mismanagement. The list is long and there is nothing to say Slurp and other mismanaged IT firms won't join it in the future. A few (non-IT) I remember are RCA (the trademark lives, the company is gone), McDonnell-Douglas (gulped down by Boeing), Radio Shack, Toys-R-Us, Pennsylvania Railroad, New York Central Railroad.

9
1

Re: I think it's worth remembering ...

IBM, which was once (believe it or not) synonymous with desktop computing. Borland, Ashton-Tate, Lotus. Netscape. Compaq. Novell. Digital Research.

In the 1990s, MS could never have been as stupid as it has been lately, without being instantly devoured by smarter competitors.

15
0
Silver badge

Re: "In the 1990s, MS could never have been as stupid . . "

That is probably a significant part of the issue. Everyone remembers an MS that was lean and executed rather well. Not everyone realizes that when MS was a lean, mean fighting machine, it's because there was competition.

Where does MS have competition in the OS space now? Nowhere. Ergo, no need to pay attention: got fat, got sloppy, got childish. Is now more interested in bling than functionality.

And what can we do about it ? Zilch. Nobody is going to migrate to Linux because that is a functional nightmare for a company of just about any size.

So we bend over and take it, and MS knows we will.

7
2
Anonymous Coward

Re: "In the 1990s, MS could never have been as stupid . . "

Everyone remembers an MS that was lean and executed rather well. Not everyone realizes that when MS was a lean, mean fighting machine, it's because there was competition.

It appears that as I age, my memory erodes too. In the 30+ years I have had the displeasure of having to deal with Microsoft products because corporate idiots keep buying them, I have never come across anything from MS that was lean and worked well. If there is anything that has been part of the MS culture from the days of W95 onwards, it is bloat. IBM and Borland both recompiled Windows code into something that was more compact and executed so significantly faster that Microsoft had to grab for the compatibility excuse to rescue itself.

No, lean I would not attribute to Microsoft, ever.

6
1
Anonymous Coward

Re: I think it's worth remembering ...

International Computers Limited

0
0
Silver badge

Re: "In the 1990s, MS could never have been as stupid . . "

" Everyone remembers an MS that was lean and executed rather well."

Yes, that was back in the early DOS era. After that, not so much.

0
0
Silver badge

Re: "In the 1990s, MS could never have been as stupid . . "

> Nobody is going to migrate to Linux because that is a functional nightmare for a company of just about any size.

And yet Linux is the basis of the most common OS found anywhere (Android), runs on 99+% of the top supercomputers, powers more than 50% of servers, and dominates embedded systems.

Most companies are already using it.

1
0
Silver badge

Re: I think it's worth remembering ...

> International Computers Limited

I worked for them here in New Zealand and on projects in Bracknell for a couple of decades. In fact I joined ICT shortly before they changed to being part of ICL.

I still have quite a number of ICL machines in my stack, from ICL 1501s (not to be confused with ICT 1500s), PC1, PC2 (8085 and 8086), Quattro, Quattro XM, DRS20 models 40 and 150, DRS300, 6402, 6404, 6404L, 303.

Available by collection only.

0
0
Silver badge

Re: "In the 1990s, MS could never have been as stupid . . "

"And yet Linux is the basis of the most common OS found anywhere..."

EXCEPT at the desktop due to all the legacy baggage and the need for performance.

0
3
Silver badge

Re: "In the 1990s, MS could never have been as stupid . . "

Thumbing me down doesn't make it less true; otherwise, why isn't the Linux Steam library nearly as big as the Windows one, despite advances in DXVK and the like?

0
0
Silver badge

But will they listen?

This article is right on, I think. MS really has screwed the pooch. I'd actually like to be able to use Win 10 eventually (for gaming if nothing else), but don't dare as it's too much of a dog's breakfast. Maybe someday they'll get leadership who has a clue. As another poster said, just releasing a service pack for Win 7 would be a big step forward from where they are now.

Meanwhile, SatNad is doing his best to make that ever-receding Year of the Linux Desktop come true.

35
1
Silver badge

Re: But will they listen?

"Meanwhile, SatNad is doing his best to make come true that ever-receding Year of the Linux Desktop."

I wonder if MS have looked at the way various Linux distros work and not quite understood what they've seen.

The Fast, Slow, Release stuff looks a bit like Debian's Unstable, Testing and Release, and the 6-month schedule looks a bit like Fedora. What they haven't grasped is that the cadence of Debian's system is a few years, not 6 months, and the 6-monthly Fedora releases feed stuff into the money system, Red Hat, when things are good and ready.

What's more these are distros. Their development is integration of components that have already gone through their own development in a myriad of other projects.

Everyone in software development should occasionally go back and reread chapter 1 of TMMM and reflect on the difference between a raw component and the finished article. I wrote "reread"; probably too many have never even read it once.

32
0
Silver badge

Re: But will they listen?

Meanwhile, SatNad is doing his best to make that ever-receding Year of the Linux Desktop come true.

So he's doing his best to ensure 'Year of the Linux Desktop' remains something yet to be achieved???

Isn't that a good thing (from Microsoft's perspective)???

2
4
Silver badge

Re: But will they listen?

@Doctor Syntax - "I wonder if MS have looked at the way various Linux distros work and not quite understood what they've seen." I have to agree. What I see with Linux distros is that there is a core, such as the kernel, where each component is developed on its own schedule. Combined with the relative independence of each component (X could be replaced by something else, desktops are replaceable, browsers are OS independent, etc.), this means the distro developer does not really develop a large amount of code themselves. This has some nice benefits for Linux in that the pieces are more modular, with far fewer cross-dependencies, and what each distro brings to the table is geared to a more specific audience. To replicate this, Slurp would effectively have to break Bloat into independent pieces, with each piece having its own development path.

You noted that Fedora is geared to help Red Hat develop features for the 'money system'. However, Fedora releases are quite polished and ready to use. Ubuntu might be a closer comparison to Bloat. The flagship release is the 'money system'. Ubuntu does something different from Slurp: they have LTS versions intended for the general user and supported for 5 years. The intermediate releases, again while polished, are more to develop features for the LTS versions. But both Red Hat and Ubuntu recognize that some features may need to incubate longer before being ready for the masses.

16
0
Silver badge

Re: But will they listen?

"However Fedora releases are quite polished and ready to use."

Maybe now. It wasn't how I viewed it back when I used it.

1
0

Re: But will they listen?

The software for Linux tries to minimise changes to the software interfaces so that if software "A v2.0" is said to work with software "B v3.x" then both try to keep backward compatibility until they bring out "A v3.0" and "B v4.0".

1) You can change the internal functions but you can't change the existing interface (the command names or outputs seen by other programs/users).

2) You can add new features but don't break or remove old features (first do no harm).

3) If you want to make major changes then you fork the code tree into a new major version release.

Because the software interface doesn't change then other software that relies on it doesn't break unintentionally.

The problem with Microsoft is that they break all 3 rules listed above. Microsoft removes and half-replaces features with no care. Microsoft products often have more features broken than fixed each release. They rip out the old GUI dialog and replace it with a new "user-friendly" process that requires more clicks, or has half the functions because it was left unfinished before release. There are too many dependencies where different MS products/components can break each other. Instead of making new 64-bit versions of the OS components with new names, they repurposed the old names, renamed 32-bit components so 32-bit compatibility was not certain, and made a headache for everyone (only new 64-bit software should have required a recompile; instead they made 32-bit software need new installers).
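Those three rules in miniature, as a Python sketch (the function and its deprecated alias are hypothetical, not from any real library):

    import warnings

    def load_config(path, *, strict=False):
        """v1 took only 'path'. The new keyword-only 'strict' parameter
        defaults to the old behaviour, so existing callers keep working
        (rules 1 and 2)."""
        with open(path) as f:
            pairs = (line.split("=", 1) for line in f if "=" in line)
            config = {k.strip(): v.strip() for k, v in pairs}
        if strict and not config:
            raise ValueError(f"no settings found in {path}")
        return config

    def load_cfg(path):
        """An old name is deprecated, not removed; deletion waits for
        the next major version (rule 3)."""
        warnings.warn("load_cfg is deprecated; use load_config",
                      DeprecationWarning, stacklevel=2)
        return load_config(path)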

0
0
