Happy 20th birthday, Windows NT 3.1: Microsoft's server outrider

It started on the server, became the desktop, and it's still there in Windows 8 today; it just turned 20 years old: Happy birthday, Windows NT. Windows NT 3.1 was released to computer manufacturers on 26 July 1993, and initial sales of Microsoft's debut server operating system were modest – fewer than 500,000 units sold in the …

COMMENTS

This topic is closed for new posts.

Silver badge
Pint

Next Technology? Damn..

I always thought it was "New Technology", which was a fantastically silly name for anything.

1
0
Windows

Re: Next Technology? Damn..

I thought it was "New Technology" too. However, rumour has it that "Windows New Technology" was a backronym created for WNT - WNT being to VMS as HAL is to IBM. Not impossible, I guess, given the number of ex-VMSers on the project.

3
0
Silver badge
Devil

Re: Next Technology? Damn..

Wasn't it called "Not Tested" when it first limped out?

13
1

Re: Next Technology? Damn..

I remember that in the contemporary computer press it was often referred to as "Not There" on account of the delays.

Guy Kewney penned an article on the then-new OS/2 version 2 ("better DOS than DOS, better Windows than Windows", remember?) in which he begged IBM to call it anything other than OS/2. Tracy, even. So for me, NT always stood for "Not Tracy".

The Windows 2000 startup screen proudly declared "Built on NT Technology"; so that's "N[ew]|[ext] Technology Technology". Right.

-A.

5
0
mkc

Re: Next Technology? Damn..

I thought it stood for Northern Telecom, a reference to the networky part of Windows?

0
0
Silver badge
Headmaster

Turned me into a Newt..echnology

From the Hallowed Halls of the Troll Towers, er, El Reg Editing Room:

http://www.theregister.co.uk/2013/06/10/openvms_death_notice/

The architect of RSX-11M and VMS was Dave Cutler, who planned a portable, object-oriented successor, Mica, running on hardware codenamed Prism. When DEC wasn't interested, he and some of his team decamped to Microsoft, where they were given the task of reviving the moribund OS/2 3 project after the IBM-Microsoft split. While OS/2 2 was the Intel 386 version, OS/2 3 was to be portable to non-x86 processors. Cutler drew upon his previous Prism and Mica work to bring Microsoft's OS/2 3 to the Intel i860 CPU, a RISC/VLIW chip Intel had hoped might be a successor to the x86 line.

There were two versions of the chip – the basic i860XR, codenamed the N10, and the enhanced i860XP, codenamed N11. Microsoft built its own i860 workstations for the development effort, based around the i860XR, and consequently nicknamed the "N-Ten". Those initials – NT – are where the eventual name for Cutler's finished OS came from: Windows NT.

Don't know whether true....

2
0
Bronze badge
Mushroom

Re: Next Technology? Damn..

Add one to each letter of VMS....

2
0
Headmaster

Re: Next Technology? Damn..

NT was called 'Not There' because it was released well past schedule, just like Windows NT 6.0 (called Longhorn or Vista or even 8).

Microsoft's first operating system was Xenix (not counting DOS), which later became SCO Unix. They tried to build a server operating system, called LAN Manager, to beat Novell Netware. Some of the functionality later went into Windows for Workgroups, Windows 95 and NT, but it was just peer-to-peer networking. LANMAN mostly failed.

WinNT 3.1 was purely a workstation OS. By WinNT 3.50 Redmond came up with a server edition, which was inferior to Netware, OS/2 or Unix. It took until NT 5.0 (Windows 2000) for the server edition to become worthy of its name.

3
3
Anonymous Coward

Re: Next Technology? Damn..

"I thought it stood for Northern Telecom, a reference to the networky part of Window"

That would be Nortel you are thinking of... aka Bay Networks

0
0
Pint

Re: Next Technology? Damn..

It was my understanding that the origin of "NT" was N-Ten, aka N10, a codename for the Intel i860 processor that the 32-bit version of Windows was originally meant to run on.

0
0
Bronze badge

Re: Next Technology? Damn..

I remember that in the contemporary computer press it was often referred to as "Not There" on account of the delays.

When OS/2 2.0 won the race against NT to get the first boxes on shelves - the race that famously had Ballmer saying he'd eat his hat if Microsoft lost - IBM ran magazine ads with a picture of the NT logo and some sarcastic expansion of "NT" - might have been "Not Today" or "Nice Try" or something like that. I think I may still have one around somewhere, but I'd have to dig through piles of stuff to find it.

1
0

Virtual NT

Tim. In a fit of curiosity a few months back I not only managed to get a copy of NT Server (3.51 rather than 3.1, admittedly) running using VMware Player, I have since upgraded to NT4, 2000 and 2003. Not made it to 2008 (R1) yet, which of course would be the end of the line due to R2 and later being 64-bit only. But it works! :-)

2
0
Pint

Not quite old enough.

I wasn't quite old enough to be exposed to NT 3, but I do remember seeing NT4 after Windows 3.11 for Workgroups and recognising it was the future. I could never understand why MS persisted in trying to develop two separate Windows lines, and was glad when they decided to unify them in Windows 2000.

The NT line was pretty robust compared to the 9x kernel line, which would crash if a butterfly in Taiwan flapped its wings at an inopportune time.

I feel like I've grown up with the NT kernel; we're both older, wiser and better for it, although still prone to occasionally doing something unexpected for no real reason.

1
1

Re: Not quite old enough.

IIRC the issue was games. They had great difficulty creating an environment under NT that would support DOS/Win9x games adequately, or at all. This was not really sorted till WinXP, so while many business machines moved to Win2K, MS introduced WinME as a (horrible) stopgap for home computing until XP was ready.

2
0

Re: Not quite old enough.

I should have corrected you in my reply: the existence of WinME means, of course, that they maintained two lines until XP, not until Win2K.

1
0

Re: Not quite old enough.

I always thought that WinME was a quick drop-in replacement for Windows 2000 Home (Neptune) when it ran late: a rewarmed hack job of 98SE that shouldn't have been put out there.

I did really like the watercolour theme from Neptune, and it's a shame that they didn't use that for XP rather than that kiddy Luna theme....

0
0
Silver badge

Misleading

Windows NT 3.1 was the biggest remake of the Windows family until Windows 8 came along

True, if you are looking "under the hood" i.e. at the kernel. (But note, the Win 8 kernel is still derived from NT). However, the kernel is not the first place most Windows users look. The other revolution was replacement of the Windows 3.1 GUI by the Windows 95 / 98 / 2000 / XP GUI, which pretty much defined a (small-w) windows desktop until Windows 8 was dumped on us.

NT 3.5 (at the time it shipped) was unbelievably stable, but still ran the 3.1 desktop. (It was basically NT 3.1 with most of the bugs fixed). NT 4.0 ran the newer desktop, which had required driving a coach and horses through the carefully designed VMS-like security model of NT 3.x. The system's architect, David Cutler, formerly architect of VMS at Digital, left Microsoft around the NT 4.0 release, possibly because of Microsoft putting image above security considerations. Microsoft has probably been paying the price ever since!

7
3
FAIL

Re: Misleading

Amen. They started with a good idea but screwed it up.

NT 3.1 was basically the alpha test; 3.51 the usable beta. It had HAL, the hardware abstraction layer, which helped make it compatible with the DEC Alpha and later chips. But by 4.0 they threw that away and put the GDI into the kernel, making the whole thing unstable, in exchange for (I am told) about 15% more speed. So a few months' worth of Moore's Law was the payoff for ruining a much better system.

11
1
Anonymous Coward

Re: Misleading

NT 3.5 (at the time it shipped) was unbelievably stable, but still ran the 3.1 desktop.

We ran NT 3.51 at the first company I worked for. It was rock solid, albeit a bit slow even on the dual Pentium Pro 200MHz Compaqs that we used. Scripting was definitely a problem, but ActiveState produced a Perl 5 module that allowed a certain degree of control. Then NT 4.0 came along and things became very unstable. I was happier using the SunOS stuff that the majority of the company was running on, so when it looked like they were going to switch wholesale to Windows I jumped ship to a Solaris based outfit.

5
0

Re: Misleading

What do you mean, David Cutler left Microsoft? The man is still around...

2
0
Windows

Re: Misleading

"... put the GDI into the kernel, making the whole thing unstable, in exchange for (I am told) about 15% more speed" - and this was the first time that I started to wonder exactly what it was that the Emperor was wearing.

I was a huge fan of Windows NT. Having moved from a mainframe environment to writing stuff for Windows via a brief experiment with Macs, Windows NT and especially v3.51 felt like a return to a world run by adults after being sent back to the playpen. I'm sure that MS would still have been able to make dumb decisions even if they hadn't moved from the original design philosophy, but for that brief period they seemed to be getting things right and moving in the right direction. Every time MS do something brain dead I think, "GDI moved from Ring 3 ...", sigh and shake my head.

I feel old again.

8
0

Re: Misleading

But as of today, Windows does not run the graphics in the kernel anymore. The graphics runs outside the kernel. Windows 7 is the most stable Windows I've tried. It works well. Sure, it is not as stable as Unixes, but it is stable enough.

Strangely enough, Linux has lately moved the graphics into the kernel to gain more speed. This move has made Linux more unstable.

0
5
Anonymous Coward

Re: Misleading

"It works well. Sure, it is not as stable as Unixes, but it is stable enough."

Excluding Linux. It is rarer to get a stability issue with a recent Windows server than a Linux system in my experience across thousands of boxen of various flavours...

1
4
Anonymous Coward

Re: Misleading

I disagree; I think that Windows and Linux are pretty much on a par in terms of stability. I think that the majority of stability issues come about for similar reasons: a lack of education on each system.

Too many Windows administrators are unfamiliar with the command line, and just jab away at the GUI until whatever they want to do is done.

Too many Linux administrators think that because they know commands by rote (ie: if X happens, issue command Y) they somehow understand what they're doing.

In both cases there is a lack of understanding of how the system actually works. In the case of Windows the mindset tends to be: if the GUI lets me do it, it's safe. In the case of Linux the mindset seems to be: I know the command to run, therefore I understand what I'm doing.

4
0
LDS
Silver badge

Re: Misleading

The real difference is that the GUI lets unskilled administrators work, somehow, with Windows systems. It's not that a Linux admin knows the commands and therefore understands what he is doing; it's vice versa - to know which command to run, he (or she) needs to have an idea of what is to be done and how, and that cuts out many unskilled admins (although I've seen some just cutting and pasting from Google, performing tasks without a clear idea of what they were doing, copying someone else's settings without properly adapting them to their needs).

With a GUI, by contrast, an unskilled admin can look for something that appears to do roughly what he (or she) needs, and get something done somehow, though not always in the proper way.

Both GUI and command line are good for the tasks they were designed for, and a good admin uses both, depending on which allows a given task to be performed in the best and fastest way. And a good administrator, before using fingers, checks that brain is connected.

1
0
LDS
Silver badge

Re: Misleading

Yes, today if you use good hardware and certified drivers it is very, very rare to get a BSOD; I've several Windows machines that get rebooted only for monthly patches.

Machines that become ridden with malware competing to obtain full control, cracked software, or cheap hardware with bad drivers are usually unstable, but any OS would be in such a situation.

0
0
Silver badge
WTF?

Re: Misleading

That's not the only misleading comment.

"was so good it also changed the direction of computing"

Sure, as long as you'd never used anything other than DOS and your idea of "computing" is a desktop PC. Solaris blew it out of the water GUI-wise, and TBH still does on the server side even today. Along with most Unixes, frankly.

1
2
h3
Bronze badge

They did something for performance with NT 4.0 that broke things.

Don't count Windows 8 as a big change, due to that hack that makes Metro apps run on the desktop.

If that is possible, the change is no more than something like WPF.

0
0
Anonymous Coward

"They did something for performance with NT 4.0 that broke things."

Understatement of the last century. MS let every man and his dog run processes in ring 0, initially so they could make graphics faster, which started the bad spiral of bad drivers crashing the whole system and ended, now, with anyone pwning your system.

8
1
LDS
Silver badge

You can't run code in ring 0 without writing a driver. Now unsigned drivers raise a big red dialog box asking you to accept them. And again, to install a driver you need to be an administrator. So if your systems are configured to let every man and his dog run as "root", sure, you have a security issue...

2
0
Silver badge

Security

The security was actually good, and is still good on later NTs...

But there were three HUGE problems.

By default there was no Ordinary User account created, only the Admin account.

People didn't write applications properly, so they could be installed by Admin and used by User. This was especially an issue from NT3.51, when people started to use the Workstation product and applications written by WFWG / Win95 developers.

And it only held with PROPERLY configured permissions on NTFS. Out of the box, the permissions on directories were not set to the ideal.

The Token based scheme and ACLs were very powerful for people who bothered to use them properly. The problem was that folks treated it like WFWG / Win9x (and increasingly MS themselves, from Win98). Other often ignored features of serious value (see the sketch after this list):

Named pipes (can't be created on Win9x, but even DOS clients can connect to them)

Using files as Arrays (sort of persistent virtual memory)

Streams in Files (a little like Apple Resource Forks).
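
A minimal sketch (Win32 C; file names are hypothetical, and the named stream needs an NTFS volume) of the last two of these – a second data stream inside a file, and a file mapped into memory and used as a persistent array:

    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        /* Streams in files: "demo.txt:notes" names a second data stream
           inside demo.txt, invisible to dir and Explorer - a little like
           an Apple resource fork. */
        HANDLE s = CreateFileA("demo.txt:notes", GENERIC_WRITE, 0, NULL,
                               CREATE_ALWAYS, FILE_ATTRIBUTE_NORMAL, NULL);
        if (s != INVALID_HANDLE_VALUE) {
            DWORD n;
            WriteFile(s, "hidden note", 11, &n, NULL);
            CloseHandle(s);
        }

        /* Files as arrays: map array.bin and index it like memory.
           The increment below persists across runs - a sort of
           persistent virtual memory. */
        HANDLE f = CreateFileA("array.bin", GENERIC_READ | GENERIC_WRITE,
                               0, NULL, OPEN_ALWAYS, FILE_ATTRIBUTE_NORMAL,
                               NULL);
        if (f != INVALID_HANDLE_VALUE) {
            HANDLE m = CreateFileMappingA(f, NULL, PAGE_READWRITE,
                                          0, 1024 * sizeof(long), NULL);
            if (m) {
                long *arr = (long *)MapViewOfFile(m, FILE_MAP_WRITE, 0, 0, 0);
                if (arr) {
                    arr[42]++;
                    printf("arr[42] = %ld\n", arr[42]);
                    UnmapViewOfFile(arr);
                }
                CloseHandle(m);
            }
            CloseHandle(f);
        }
        return 0;
    }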

The problem was that most people never bothered to learn how to configure it, or how it worked, even a tenth as much as a Linux/UNIX admin/user would. Eventually this applied to MS too, which is why they did REALLY STUPID stuff (GDI to kernel in NT4.0), gratuitously moved stuff around (W2K, XP, Vista/W7, Win8) for no good reason, shipped a buggy Explorer, and chose stupid defaults on share and device names and security.

So the BIGGEST problem is the install defaults. The 2nd biggest was similarity to WFWG & Win9x. Win95 should NEVER have been released. It and Win98 helped degrade NT4.0, Win2K, XP, Vista/Win7 and Win8 into becoming ever more bloated, unreliable, less secure and more broken.

NT4.0's major security & reliability flaw was GDI moved to the kernel to make video 10% faster. Stupidity, given how fast PC performance was improving from 1995 to 1996.

I did have NT3.5 on a 386DX-16 MHz with 6M of RAM. Worked fine as a file server. NT4.0 was fine with an Internet proxy (WinGate), MDaemon for mail, MS-SQL Server, file & printer serving etc. in 20M RAM on a 486.

So NT3.1 wasn't "bloated" or "slow" for a 32-bit server, nor even was NT4.0.

NT 4.0 ran on Alpha, PPC, MIPS and 64-bit Alpha as well as x86. It had clustering (developed by DEC) from 1998/1999 that could be implemented really cheaply with two ordinary servers, SCSI controllers with two channels, and two external storage shelves.

Where did MS go wrong? Concentrating on eye candy instead of real suitability, and REALLY badly done installer wizards with BAD silent defaults. STILL. Why is EVERY service on by default?

12
0
Silver badge

Re: Security

"This was the moment when Microsoft could have enforced isolation between system files, application files and data, but perhaps for the sake of compatibility with legacy Windows applications, it is too lax: an enormous amount of effort was needed later to patch up its vulnerability to DLL version issues, malware and user security."

And that's why we are still menaced by botnets of badly maintained legacy Windows boxes.

4
0
LDS
Silver badge

Re: Security

People maybe don't understand that in protected mode only code running at or below the IOPL can access the hardware through I/O ports - and usually physical memory is accessible only by highly privileged code as well.

The problem is that, due to the privilege checks and other operations needed when a ring transition occurs (switching stacks, copying parameters, moving data from user space to kernel space, switching CPU state, etc.), the more calls that need a transition (back and forth...), the slower the code is. That's also one of the reasons most OSes running only on Intel hardware don't use the full four rings, but only two. Using all the rings would lead to more secure and stable code, but also slower code.

To minimize these transitions, MS moved most of the GDI code and the video drivers into the kernel, so that when GDI code must call the video driver code it does not have to go through a transition (drivers in user mode would need a kernel mode counterpart to access the hardware). The real problem was (and is) that Windows drivers can be complex to write, and with many small companies writing drivers without employing skilled developers, the risk of a bad driver was high (that's why it was always better to buy from reliable companies).

But ask yourself - where are Linux graphics drivers? In user space or kernel space?
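
To put very rough numbers on that transition cost, here is a minimal micro-benchmark sketch (Win32 C; GetCurrentProcessId and SetEvent are merely stand-ins for "answered in user mode" versus "must enter ring 0", and the figures vary by machine):

    #include <windows.h>
    #include <stdio.h>

    #define N 1000000

    int main(void)
    {
        LARGE_INTEGER freq, t0, t1, t2;
        volatile DWORD sink = 0;
        int i;

        QueryPerformanceFrequency(&freq);

        QueryPerformanceCounter(&t0);
        for (i = 0; i < N; i++)
            sink += GetCurrentProcessId(); /* answered from user-mode data, no transition */
        QueryPerformanceCounter(&t1);

        HANDLE ev = CreateEventA(NULL, FALSE, FALSE, NULL);
        for (i = 0; i < N; i++)
            SetEvent(ev);                  /* every call crosses into the kernel */
        QueryPerformanceCounter(&t2);
        CloseHandle(ev);

        printf("user-mode call: %.1f ns each\n",
               1e9 * (double)(t1.QuadPart - t0.QuadPart) / freq.QuadPart / N);
        printf("kernel call:    %.1f ns each\n",
               1e9 * (double)(t2.QuadPart - t1.QuadPart) / freq.QuadPart / N);
        return 0;
    }

The kernel call typically costs an order of magnitude or more per call, which is exactly the per-call overhead that moving GDI into the kernel was meant to avoid.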

1
2
LDS
Silver badge

Re: Security

Yes, the problem is the amount of badly written code around. Too many Windows programmers learned to code on Windows 3.1 and stubbornly refused to learn to code properly for later versions.

IMHO with Windows 7 MS should have started to block such code entirely, and show a large dialog box telling the user: "This application was coded by a moron who refuses to write modern code. Please replace it with a better one".

1
2
Silver badge

Re: Security

Trouble with NT was that you could do bugger-all as an ordinary user.

You had to be admin to open the network settings dialog to find your own IP address.

And with no "sudo", the only way was to log off and back on as admin.
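
For the record, an equivalent did arrive later: from Windows 2000 the runas command, and the CreateProcessWithLogonW API underneath it, let you run one process under another account without logging out. A minimal sketch (Win32 C; the account name and password are hypothetical placeholders):

    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        STARTUPINFOW si = { sizeof(si) };
        PROCESS_INFORMATION pi;
        wchar_t cmd[] = L"cmd.exe /k ipconfig";  /* mutable buffer required */

        /* Run one command as Administrator without logging off - the
           "sudo" that NT lacked. Password hard-coded only for the sketch;
           "." means the local machine. */
        if (CreateProcessWithLogonW(L"Administrator", L".", L"password",
                                    LOGON_WITH_PROFILE, NULL, cmd,
                                    0, NULL, NULL, &si, &pi)) {
            WaitForSingleObject(pi.hProcess, INFINITE);
            CloseHandle(pi.hThread);
            CloseHandle(pi.hProcess);
        } else {
            printf("CreateProcessWithLogonW failed: %lu\n", GetLastError());
        }
        return 0;
    }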

3
0
FAIL

Multi-platform misunderstanding

"NT 4.0 ran on Alpha, PPC, MIPS and 64bit Alpha as well as x86. [...]."

The problem that I experienced with one company was that they actually were trying to unify their x86, PPC and Alpha machines under NT. Which did not work, because the various compiles of Windows would simply not run much software compiled for one of the other hardware platforms – I remember the simple un-zipping of a ZIP file created with FreeZIP on an x86 becoming an insurmountable challenge on an Alpha – so we ultimately decided to split between Solaris for the servers and MacOS on the workstations at the time... with a few, rare x86/NT machines for the bookkeeping crew.

I guess the main problem was that most software distributors simply did not go along with the idea of supporting multiple hardware platforms and so, for the most part, only offered compiles for NT on x86 and/or MacOS on PPC, and if lucky, HP/UX on Alpha.

2
0
Anonymous Coward

Re: Security

"And that's why we are still menaced by botnets of badly maintained legacy Windows boxes."

That phone home to armies of remotely hacked Linux based command and control servers...

2
2
Silver badge

Re: Security

You're saying that the system call interface was slow and put an overhead on software. Improvements such as threading and memory mapping have come about partly to improve that situation. It's not the drivers that are at fault it seems to me, but rather the system call / privileged access interface which is inefficient. Ok maybe bad drivers too.

Basically, Microsoft appeared to rush Dave Cutler into doing a bad job, releasing NT without the proper multi-user safeguards a grown up OS ought to have. Result: 20 years of virus anarchy.

2
0

Re: Security

So you never tried adding yourself to the Power Users group?

0
0

Stability for a time

I well remember running NT3.1 and NT3.5 as my DESKTOP OS (remember NT Workstation?) because I just couldn't deal with the garbage that was WFW/Win32s at the time - constant crashing and running out of "resources" caused by poor GDI heap management. You needed what was, at the time, considered just GOBS of RAM to get the job done. However, these OSes finally let me run my development environment for more than 24 hours straight :-). I stayed in the server OS world for my desktop through NT4, WIN2K, and WIN2K3 because WIN95/WIN98/WINXP were much less stable. Windows 7 finally delivered what I would consider to be equal stability to the server counterpart. However, as many have already mentioned, the NT4 architecture put the graphics handling into ring 0, which was a bad move. NT4/WIN2K definitely suffered more crashes because of this move. Anyway, here's one guy that's happy they loaded the desktop GUI into their "server" product. It made my work day much more productive.

3
0
Silver badge

Stability

NT was not as stable as VMS by quite a long way but it was incredibly stable when compared to anything running on a PC in those days. I've had up-times of well over a year with NT server boxes - generally you just have to turn them off to clean out the fans and power supplies.

I turned the last NT box (running a mail server and FTP) off about two years ago - never hacked even once.

3
4

Re: Stability

"I turned the last NT box (running a mail server and FTP) off about two years ago - never hacked even once."

FTP on NT4 - how we laughed (not) when users with bad passwords got hacked, and the anons created directory names like com0 and filled them up with warez that you couldn't see or access with File Manager.
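
For the curious, a minimal sketch of the trick being described (Win32 C; "com1" stands in as the canonical reserved device name, and C:\temp is assumed to exist). Normal Win32 path parsing refuses reserved DOS device names, but the \\?\ prefix bypasses that parsing, much as the NT4 FTP service effectively did:

    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        /* The \\?\ prefix skips the Win32 name parsing that would
           otherwise reject the reserved device name "com1". */
        const wchar_t *path = L"\\\\?\\C:\\temp\\com1";

        if (CreateDirectoryW(path, NULL))
            wprintf(L"created %ls - now try deleting it in File Manager...\n", path);
        else
            wprintf(L"CreateDirectoryW failed: %lu\n", GetLastError());

        /* Clean-up also needs the \\?\ form; ordinary tools choke on the name. */
        RemoveDirectoryW(path);
        return 0;
    }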

4
0
Anonymous Coward

Re: Stability

Windows NT, while much better than what came before, was appallingly unstable. We considered using it for a product but stopped when we were seeing 2 or 3 crashes a day running light workloads of standard MS apps. The idea that this was the best on PCs at the time is a joke. As an example, we were running an RTOS with a Unix process model on PCs at the time and never saw any crashes, except during driver development, over a three-year period and thousands of installs. NT used as part of the IT environment, with carefully managed restricted workloads and frequent preventative reboots, managed uptimes of 2 to 3 days.

MS have now achieved the reliability that used to be the norm 30 years ago.

5
2
Anonymous Coward

Re: Stability

"We considered using it for a product but stopped when we were seeing 2 or 3 crashes a day running light workloads of standard MS apps."

Your experience and mine differ. Most of my colleagues were using the IT-supplied Win98 on their desktops and laptops. I was using my own NT setup (unsupported by corporate IT). The W98 users were frequently "out of memory" or unproductively blocked in some other W98-specific way that my proper 32bit-OS environment just treated as routine. So if their spreadsheet was too big to print, they came to me to get it printed. Etc.

1
0

Re: Stability

Carefully managed by whom?

Someone who couldn't fix their way out of a paper bag it seems.

0
0
Silver badge

Re: Stability

>NT was not as stable as VMS by quite a long way

To be fair, Stonehenge wasn't as stable as VMS.

The only way to stop a VMS machine was to put a stake through its CPU and bury it at a crossroads at midnight. And even that didn't work if it was part of a VAXcluster.

2
0

Re: Stability

I still pine for the days of my VAX/VMS and OpenVMS servers. The reliability of those environments was truly amazing. System uptimes could range into the years! You just couldn't kill them. They may not have been pretty, i.e. no graphical interface, but you got things done. The eventual add-on of the CDE interface was okay but seldom used.

2
0
Anonymous Coward

Re: Stability

"The only way to stop a VMS machine was to put a stake through its CPU and bury it at a crossroads at midnight. And even that didn't work if it was part of a Vaxcluster"

You (and others) might enjoy the video at www.hp.com/go/disasterproof - VMS and some other stuff

VMS is still around, despite HP's best efforts, but if you want to buy it new, you have to buy it on an IA64.

If you just want to play, there are lots of zero-cost emulators for VAXes and Alphas. At least one blog has details of how to set one up on a Raspberry Pi, and there's another one packaged for Android. The software is available at zero cost via a hobbyist program for OS and tools ("layered products" in DECspeak).

1
0
Anonymous Coward

"proved bad both for security and for the ability to script common tasks."

I only ever came across one thing that I couldn't script on WinNT 3.x/4, and that was the "user can log on using dial-up networking" checkbox in the User Manager applet, IIRC.

0
1
Bronze badge
Go

Expiry of patents in NT

Surely by now some of the major ones have expired, or did they apply late for many?

0
0
