Microsoft released Windows 1.0 on 20 November 1985, a year later than first promised. Now, nearly 27 years on, Windows 8 is on the shelves. The operating system was chugging along at full steam as Windows XP established itself - then it jumped the tracks at Vista. Where is Microsoft's OS going now, and where did it come from …
I thought Mr Cutler came from Digital Equipment Corporation, where he'd been developing DEC's VMS operating system...
He did indeed. And to assert that security was some sort of afterthought in the NT design is patently ridiculous.
There had to be at least one thing that slipped through the net. Yes, he did come from DEC. I should have clocked that. It's fixed.
Security was given a low priority, though it was not Cutler's decision. That was a management decision. Baking in a secure kernel and separate address spaces was strongly recommended by the development team. Ultimately the management rejected it due to the increased cost and time to market.
Complete and Utter Rubbish - NT right from 3.1 had security as a design priority, and has a secure kernel and a separate address space for it. And full ACLs throughout from the ground up - not as an afterthought - like, for instance, in UNIX-type OSs.
The second goal was reliability: the system should no longer crash due to a faulty application or faulty hardware, which would make the operating system attractive for critical applications. To meet this goal, the architecture of Windows NT was designed so that the operating system core was isolated and applications could not access it directly. The kernel was designed as a microkernel, with components of the core running atop it in a modular fashion; Cutler knew this principle from his work at Digital. Reliability also includes security: the operating system should be able to resist external attacks. Mainframes already had a scheme in which every user had their own account, assigned specific rights by the administrator; this way, users could be prevented from accessing confidential documents. Virtual memory management was designed to thwart attacks by malware and prevent users from accessing other users' areas of memory.
ISTR the rumour was that NT 3.5 was every bit as secure as VMS. This meant that the graphics performance was seriously limited, and that it would be almost impossible to put a Windows-98 style interface on top of it with the hardware of the day. Microsoft wanted to blow holes in its security to do so. Cutler said over his dead body. Microsoft overruled him and he resigned. Thus NT4 was created. Microsoft then blew more holes in it to make Windows 2000 and yet more to make XP. It wasn't until XP SP1 that they realized Cutler had probably been right. By then they'd made such a mess of the code base that they had to start again with what became Vista (and then 7).
The system should no longer crash due to a faulty application or faulty hardware.
Yeah, and Windows NT 3.x+ NEVER crashed due to faulty software or hardware. Back in the real world, for performance reasons, many drivers, including the graphics, ran in kernel space with full privileges -- this is why you could bring a Windows NT box to its knees just by enabling the 3D screensaver.
Security and stability may have been two of the goals, but they were regularly compromised to meet performance and usability goals as well.
An interesting omission, perhaps coincidental, but add 1 to the ASCII code of each letter of VMS and you get WNT (which, one story says, is why NT is NT, and "New Technology" was just convenient). Although denied, a similar theory was turning IBM into HAL from 2001 (IBM - 1). Probably another coincidence; the explanation that HAL stood for Heuristic ALgorithm works well, but it doesn't explain SAL from 2010 (and of course SAL doesn't work with a +/- 1 shift either).
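The letter shift is easy to check for yourself - a quick sketch (the shift function and its name are mine, not part of any of the naming stories):

```python
def shift(name, delta=1):
    """Shift each letter's ASCII code by delta."""
    return "".join(chr(ord(c) + delta) for c in name)

print(shift("VMS"))       # WNT
print(shift("IBM", -1))   # HAL
```

SAL, as noted, is one letter off from either direction, so the trick breaks down there.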
Certainly I remember that Windows 2000 crashed frequently using large Excel sheets or complex Word documents
Actually NT 3.x was a relatively pure microkernel system with external device drivers. It was only with NT 4.0 that the video driver was moved back into the kernel to boost performance.
a most excellent synopsis of what really went down. well done.
I know from personal experience with NT4 that an (accidentally) corrupted Word document could consistently cause a BSOD, on multiple machines (but possibly all the same basic Dell Optiplex system)
Well the main selling point of Windows NT was that it ran all Windows 3.x and 9x Software (as well as OS/2 and Posix, at least initially). However, this was the 1990s, and both programmers and APIs were pretty bad back then. It was not uncommon for Windows 3.x and 9x software to access the hardware directly, since that was actually simpler than using some barely documented API. I remember a Telnet server for Windows 3.x which even brought its own scheduler along.
So in the 1990s it was normal for your software to need administrator rights. For example, if you had a POS system, it somehow needed to talk to the serial ports. Few programmers back then managed to do that without running as Administrator, so everybody using a Windows 2000 system productively was essentially an administrator.
Then there are problems which weren't foreseen back then, like the application distribution problem. There simply were no software repositories with trusted software, and no package managers. You were supposed to get an executable, run it, and trust it to put all the necessary files onto your system.
Windows NT is not insecure because of Kernel design. It's insecure because of the Ecosystem around it which it inherited from Windows 3.x 9x.
> Well the main selling point of Windows NT was that it ran all Windows 3.x and 9x Software (as well as OS/2 and Posix, at least initially).
It certainly did not run _all_ 3.x and 9x software. Most of it maybe. In particular many Windows 9x games did not run until XP.
The OS/2 personality was text mode only. No presentation manager. What they called 'POSIX' was not.
And full ACLs throughout from the ground up - not as an afterthought - like, for instance, in UNIX-type OSs.
I don't know what you have in mind when you say that Unix only has security as an afterthought. It was built from the ground up to be multi-user, with strict separation among those users (both for in-memory applications and on the file system). It also had the novel setuid mechanism and associated su and chgrp functionality pretty much from the outset. I think that the creators actually got a patent on the setuid mechanism, possibly combined with its use with the passwd program which effectively allowed each user to change their own password in a single system file while not allowing it to change anything else there.
Almost anything that can be implemented using ACLs can also be implemented using the user/group and setuid/setgid mechanisms. About the only area that I can think of where Unix is perhaps more permissive than it should be (for a paranoid sysadmin) is in allowing network access for all users (*). But then again, Unix wouldn't have been such a resounding success without networking, I think. If the designers had wanted to include some sort of "access rights" for the network, then they'd basically end up with something like VMS's security model instead. But then, it obviously wouldn't be the Unix that we know and love :)
* Actually, I realise that this can be done in modern Linux using an iptables command to drop traffic based on userid. I don't actually know how early Unix implementations implemented network access. For all I know, all the network access functions might have actually used a device file at the lowest level. If so, then it actually would have been possible to restrict net access on a per-user basis using the standard user/group security mechanisms...
I know from personal experience with NT4...
NT4 saw a huge change in the system architecture, as others have noted: the HAL (Hardware Abstraction Layer) was gutted and graphics, among other things, pushed into Ring 0, where there's no protection. It's difficult to prove this, but it seems likely that most crashes in NT4 were the result of Microsoft's violation of the separation of concerns in the NT 3.x architecture.
In other words, they demonstrate that most of the kernel-level reliability and security issues are the result of Microsoft undoing Cutler's security model, rather than it being an "afterthought" on his part.
 As opposed to those at the UI level, particularly the utter lack of attention to users' need to switch between privileged and non-privileged access. It took Microsoft years to deliver an equivalent to UNIX's su utility, and when they did, for a long time it was an add-on (part of the Windows Resource Kit; Windows Services for UNIX also had an su implementation). That, and a failure to encourage application developers to get their privilege models right, led to the "run as administrator" culture mentioned in the article, and ultimately that has been one of the two or three greatest Windows security weaknesses.
Windows was only a GUI, dependent on DOS for the boring OS stuff, until 95 & NT!
Not that old chestnut again. Consider the tasks performed by an operating system: managing processes and memory, controlling devices, managing the filesystem... Windows 3.1 did all of these. DOS was little more than a boot loader. You may as well say that Linux isn't an OS - it's dependent on GRUB after all.
"DOS was little more than a boot loader. You may as well say that Linux isn't an OS - it's dependent on GRUB after all."
I hope you don't believe your own words. Linux the kernel can be loaded in a number of ways; grub is not needed. First: once the kernel is loaded, grub is not running any longer. Second: W[1-3] was an application running on DOS. Remove the underlying DOS, and Windows goes to its knees. Third: an application that controls resources and has been loaded by a program loader is not necessarily an operating system.
Windows 95 was a disaster. It consisted of a somewhat hidden MS-DOS 7.00 on top of which ran Win16 on top of which ran a Win32 adapter.
Windows NT did not drop DOS, but added a DOS subsystem, a 16-bit Windows subsystem, a 32-bit Windows subsystem, a 16-bit OS/2 subsystem and a POSIX subsystem. It ran without crashing for days or even weeks on end.
First: once the kernel is loaded, grub is not running any longer. Second: W[1-3] was an application running on DOS. Remove the underlying DOS, and Windows goes to its knees. Third: an application that controls resources and has been loaded by a program loader is not necessarily an operating system.
It isn't as simple as that. For a start, none of your criteria are in any formal definition of an operating system I have ever seen. All of these highly dubious criteria also apply equally well to the relationship between e.g. Solaris and Sun's OpenPROM monitor (on Sparc, obviously). In some respects even more so: one of the first things Windows did was to unload command.com: with OpenPROM the whole shebang remains in memory and indeed it is still running in the background, which is why you can get back to the monitor at any point from the system console. Is Solaris not an operating system?
Also consider that, as pointed out elsewhere, Windows 9x still used DOS; it just came packaged with Windows rather than in a separate box. If we accept that this inclusion suddenly turned Windows into an operating system, the definition becomes essentially a marketing one rather than a technical one. You may find it comforting to put DOS and Windows into neat little boxes where one is an operating system and the other is not, but in reality any technical assessment shows that this abstraction is not a clean one; indeed it is so messy as to be essentially meaningless.
> managing the filesystem... Windows 3.1 did all of these
No, you are wrong. From Windows 1 through 3.11, the filesystem was controlled via the underlying MS-DOS (or DR-DOS). Windows 95 changed this by using VFAT, which was part of Win95. Though the underlying MS-DOS still resided under Win95, it was only used for very minor functions dictated by compatibility issues.
Actually, you are both wrong. 32-bit file access was introduced in 3.11. That completely bypassed DOS for filesystem access.
"It turned out Microsoft, having hired kernel guru Dave Cutler from Digital Research in October 1988, was already hard at work on Windows NT."
Wrong - Cutler used to work for Digital Equipment Corporation, where he was one of the main OS designers (RSX, OpenVMS, ...).
I think it was this very organ that pointed out that XP in Greek is Chi Rho...
Sooner or later the UI will change and evolve; I just hope it's for the better. Looking at NT4/Win95 up until Win7, there hasn't been much change in the way the GUI works except for extra bells and whistles. OS X had the biggest change, from Mac OS 1 through to 9, and even then they still maintained the menu bar at the top. I think for new PC users it might actually work quite well, but for people who have been using PCs for a long time it's going to be a very odd transition. (I still use Windows XP x64 on my new PC even though Win7 is the newer OS, and I still use it with the classic Win2k interface.)
Wrote :- "Sooner or later the UI will change and evolve; I just hope it's for the better. Looking at NT4/Win95 up until Win7, there hasn't been much change in the way the GUI works"
Over the history of the PC, the UI has changed AWAY from a plain desktop with app icons. Take another look at that Windows v1 GUI and you see that Windows 8 has returned to the same primitive look. That is not progress; it is going in circles.
Most people were dumb about computers in 1985, and a dumb GUI was needed to appeal to the mass market. Now MS are assuming we are dumb again if they are advocating Win 8 for desktops - as Gates in fact is: http://www.theregister.co.uk/2012/10/22/gates_windows8_phone8_merger/.
Try this link for a good summary of the situation, from which I quote "Windows 8 totally pisses in the face of over 30 years of user interface research" :- http://toastytech.com/guis/win8.html
"Most people were dumb about computers in 1985, and a dumb GUI was needed to appeal to the mass market. "
I disagree. If you were using a computer in 1985, you had to know a few things. There wasn't a universal pointy-clicky Crayola interface; you had to use DOS. So you had to know how to do things like copy files, delete files, change your current directory, etc. You had to worry about the commands in your config.sys and autoexec.bat files. And you had to learn things like how to get a printer connected and working in each of your programs, because each program (Lotus, Aldus, WordPerfect, etc) had its own "drivers" and funky way of setting up and communicating with your printer. So you had to learn which port the printer was on, what IO address it used, etc. And if you were really fancy, you had a modem and could dial into a BBS, or even the Internet, so that was more port and IO information that you had to figure out, not to mention figuring out the rat's nest of jumpers on your internal modem so you could make sure the modem knew to use the same port and IO that you had Telix set to use. And if you wanted to hear music on your PC (later in the '80s, iirc) then you had to do all that again with a Sound Blaster card.
So, no, you couldn't use a computer in the 80's and be "dumb". You had to learn stuff. There was no Plug-And-Play, no unified driver model, none of that. Actually, "dumb" users couldn't come on the scene until at least Windows 95, and even then Plug-n-Pray wasn't too good, so it was more like Win98SE before the true dumb-asses could start using their CD tray as a coffee-cup holder.
But I do agree with your larger point about Win8. It's a total mess and another step in the absolute dumbing-down of the WIndows interface that really kicked off with WinXP.
yeah, well said. They weren't called the good old days for nothing - people would pay me to set the jumpers and i/o addresses for their cards and peripherals. Gimme back my livelihood, missa gates!
"So, no, you couldn't use a computer in the 80's and be "dumb"."
Early Acorn user here. I was pretty dumb. Most things sort of worked. It got much better after RISC-OS instead of Arthur. Just about the tail end of the 80s I think...
I take your point: no way could you do much with an IBM PC without help or a lot of knowledge.
"I disagree. If you were using a computer in 1985, you had to know a few things."
That's the point. Few people used computers and those that did, had to know things.
The vast majority of people were scared off by the command line, etc. (GEM had a lot of fans and was easy to use, but it got litigated out of existence in the GUI wars)
Hi Pirate Dave, AC 13:46 and Alan Brown:
I said "Most people were dumb about computers in 1985, and a dumb GUI was needed to appeal to the mass market"
Pirate Dave replied (and the other two concurred) "I disagree. If you were using a computer in 1985, you had to know a few things"
You are right, but I think you missed my point. When I said "most people were dumb about computers in 1985" I did not mean that most people using a computer (2% ?) were dumb about them, I meant "most people in the general population" were dumb about them. Indeed, most people had never even seen a computer.
My point was that a relatively dumb GUI was a pre-condition to getting the other 98% of the population on board, before MS, Intel and others could massively expand their market.
You are now reading this in the voice of Mr. Plinkett ...
It was just that, because there were more things you needed to know to set one up, there were more garage mechanics around back then; only we called ourselves IT techs. And we could all get together and share information without violating NDAs, patents and/or trade agreements. Even more pointedly, when we shared information we did so in plain ordinary English (pond-side aside), because the more people who knew how to do it, the less harried we were. The secretary who could barely remember to type wp.bat at the C:\ prompt was as common as the person who doesn't know what the Start button is in Windows 7.
If you were using a computer in 1985....
...there's a good chance it wasn't an IBM PC or compatible. The Commodore 64 had 30-40% of the personal computer market, and there were other popular non-IBM PCs. There were many brands of UNIX workstations, widely used in academia and industry, and UNIX servers used through LAN or dial-up connections. Minicomputer and mainframe systems from IBM, DEC, and other manufacturers had a large presence.
Some of those were much friendlier than MS-DOS, and while many of them weren't, users often knew just how to use a handful of applications - they never dealt with things like directly manipulating files.
So, in fact, you could "use a computer in the 80's and be 'dumb'", if by "dumb" we mean "not particularly informed about the technical details of the system".
Wow, there are so many issues with this:
"Many users did not go onto the NT range until Windows 2000, the first popular version for business and consumers."
-No, that was XP; 2000 could never really be described as popular with consumers
" Windows 7 has failed to win users back from Macs, whose market share has continued to increase, though even in the US Gartner estimates Apple's share as only 13.6 per cent.
A bigger Apple factor is the touch user interface which evolved from iPod to iPhone to iPad"
-13.6 per cent is now, not then; it was much lower then. And as far as touch interfaces go, the iPod didn't have one until 2007, when the iPhone came out, whereas MS had Pocket PC, Windows Mobile and Windows Tablet PC (not finger touch, because of the hardware, but stylus touch) by the time the iPhone came out. So your point that "A bigger Apple factor is the touch user interface which evolved from iPod to iPhone to iPad" is nonsense. It makes more sense to say that MS blazed the touch interface trail, which Apple then extended when the hardware to remove the stylus became available (capacitive touch screens). Please don't expect people who used touch interfaces for years before the iPhone to buy into the "Apple invented touch interfaces" junk.
>when the hardware to remove the stylus became available (capacitive touch screens).
That, and the use of multi-touch gestures to allow fingers to express more, thus making up for the drop in accuracy that ditching the stylus entailed. Obviously, the UIs of the OS and software had to be designed to take advantage of it... No one says Apple invented multitouch, just that they bought the company that made a good go of it for RSI reasons. Microsoft must have been aware of them, since they too were in the Human Input Device game:
A FingerWorks device built into a Microsoft keyboard: http://www.dustyneuron.com/fingerworks/images/small_photos/retro_split_sm.jpg
"It makes more sense to say that MS blazed the touch interface trail, which Apple then extended..."
Only if you completely disregard the work done by Apple and Palm and others in the late 80s and early 90s. Windows PDAs blatantly rode on those coat-tails.
> 2000 could never really be described as popular with consumers
But for many, Windows 2000 was the best OS Microsoft ever introduced. Solid as granite and stingy with CPU usage. It needs less RAM than XP, too. Many viruses refuse to run on Windows 2000.
I have an old Windows 2000 laptop with 512M RAM and 800MHz processor that controls a security camera 24/7. It's been eight months since boot and it still runs like a champ.
"I have an old Windows 2000 laptop with 512M RAM and 800MHz processor that controls a security camera 24/7. It's been eight months since boot and it still runs like a champ."
By the time W2K came out, I had Linux boxes with uptimes passing 3 years. Don't forget that in its early iterations W2K would crash at 42 days because a clock overflowed.
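If the clock in question was a 32-bit millisecond tick counter, as is commonly claimed for this class of bug, the wrap point actually works out to roughly 49.7 days rather than 42; a back-of-the-envelope check:

```python
# A 32-bit counter of milliseconds wraps after 2**32 ms.
wrap_ms = 2 ** 32
wrap_days = wrap_ms / 1000 / 60 / 60 / 24
print(round(wrap_days, 1))  # 49.7
```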
W2K was NT with a decent GUI and virtually all of 95's security holes (trying to run as a non-administrative user was difficult) but it was STABLE. I could see which way things were going even then and moved my remaining windows equipment off 95/ME onto w2k (It's virtually impossible to go entirely windows free. I have to boot the laptop into windows every few months in order to run xyz software which isn't available for any other environment.)
Windows security and stability is better than it ever was, but I still wouldn't trust it with mission-critical services.
I'll accept Palm (and also Psion before them), but not the Newton; that was a dead end that led nowhere and sold few. But I wasn't saying that MS started the touch interface; Windows Mobile definitely provided many of the links in the chain between the Palm III and the iPhone.
Not the issue really. Windows 2000 was much better than Windows NT4, but it was not a consumer hit.
I have had absolutely no security issues with Windows 2000. The secret is also one reason why it wasn't a consumer hit: Windows 2000 shipped with no firewall, so security depended upon installing a good one, which most consumers didn't know how to do. Also, on very first boot Windows 2000 doesn't have a nice configuration wizard to set up user accounts and options like XP has.
Another important security issue is that on both Windows 2000 and Windows XP, the daily logon needs to be either an LUA (limited user account) or a Power User. Without that you are just asking for trouble.
Longest I've gone with Windows 2000 running is 2 years because I had to shut it down while moving to a new house.
Not at all. I see what you did there, you pretended that the problem is with the USERS, not Windows 8 UI itself.
I am not a cretin, I know what I want, and it's not a mobile phone OS of tiles on my 24in desktop monitor.
I am also not a "linux nutter", as Microsoft like to pretend that everyone who hates Windows8 must be. I also hate Unity with a passion too...
I will be sticking with Windows 7 for the foreseeable future, as Win32 and Start Menus are here to stay for me. I won't be buying Surface, nor Windows Phone, nor Zune, nor Xbox. There are better alternatives to ALL of those product lines that don't try to railroad me into being dependent upon Microsoft.
You're just looking for a fight. They don't blame the users, they simply state users don't want so much change.
Get over yourself.
I totally agree, the Win 8 security gains may be useful, but the MS style mobile display is horrid on anything else and the obsession with forcing it on users is NOT a user problem. It is a problem FOR users.
The real shame is that Windows 8 should be a welcome visitor; it is faster and should be more agile, but it is being hamstrung by the silly presentation fixation and MS's Taliban-style approach to people's reactions. Why prevent desktop users from having a workable system by their standards?
As for the mobile interface. On my mobile I have had a list of numbers function for years, a green button to make a call and a red one to end the call. Stick that on a PC? I don't think so!
It's been mentioned on el Reg before that Microsoft always touted the familiar Windows interface as a major selling point when looking at upgrading.
So now they're admitting that was a lie, and I hope people look at Linux distros and Mac when they do upgrade, "because I'm worth it".
"I am also not a "linux nutter", as Microsoft like to pretend that everyone who hates Windows8 must be. I also hate Unity with a passion too.."
Unity != linux, so what is your point?
"With Windows 8, Microsoft is trying to exorcise its ghosts"
Yes, but are they throwing the baby out with the bath water?
The thing with Windows is that it is familiar, comfortable even. Change it too much and people may decide that learning Windows 8 is no less onerous than learning to use certain fruit- or penguin-based operating systems.
Perhaps the ghost they should really be worried about is the ghost of Steve Jobs rubbing his wraith-like hands together in glee.
Biting the hand that feeds IT © 1998–2018