Would there have been a PC revolution had Intel decided in the late 1960s to stick to making memory chips and turn its back on microprocessors? Almost certainly, but the company did get into CPUs and IBM chose its 8088 chip to build into its first Personal Computer, the 5150. The 8088 and its sibling, the 8086, evolved from the …
Shouldn't that be the office PC revolution?
The most important CPU for the home computer and video games market was the MOS 6502.
Not to mention that the 16-bit machines people owned at home were largely Amigas and STs.
Nobody I knew owned a PC until the early 1990s and only when Windows 95 came along was it actually nice to use. Even then the responsiveness was abysmal compared to the much lower specced Amigas and STs.
Not just MOS 6502... I would also include Zilog Z80 for the home computer for the ZX81, ZX Spectrum and Amstrad CPC.
And I agree that the Amiga was better at multitasking than Windows 95.
The 6502 was important thanks to Commodore, Apple and Atari but it was an Intel 8080 that powered the Altair 8800, the genre-defining home computer, and also the 8080 that CP/M was originally defined around. And in the UK it was the Z80 — an improved 8080 from the same team, albeit as a different company — that ran the ZX80 and the ZX81, which started home computing there. There are also a raft of other notable Z80 machines, not least the Spectrum, the Colecovision, the Master System and, approximately, the GameBoy.
I guess there are various alternative strands, like the 6502 inspiring (in at least a couple of senses) the creation of the ARM, or the 68000 and its progeny the PowerPC (which, though gone from the desktop, powers the major consoles), but I disagree that you can write Intel out of the computer and video game market.
Of course different people will have different experiences, but mine tells me that from the mid-80s people really interested in computing, rather than video games, started to move from Commodore/Sinclair and the like to IBM PCs and clones. People used to command-line interfaces had no issues with DOS, and didn't wait for Windows. I got my first Intel PC (an IBM clone) in 1987 when I started studying Physics at university, and it was clear my Commodore couldn't help me much - I even saved up for a maths coprocessor so I could run Matlab.
I made good use of Borland Quattro to quickly perform lab experiment calculations and graphs (raising some eyebrows from professors who thought I was "cheating"!) instead of using handheld calculators, and later wrote custom applications in Turbo Pascal. So no, it wasn't the "office" PC revolution: these were really the first "affordable" personal computers capable of real "professional" tasks, not just simple "home/game" ones.
> "Not just MOS 6502... I would also include Zilog Z80 for the home computer for the ZX81, ZX Spectrum and Amstrad CPC."
I'd also like to throw in my tuppence as a former 6809 owner! Er ... quietly slinks away ...
I remember learning assembly language on the 6502 in college in 1987/8, then using AutoCAD on a 286/386 + maths co-pro. Got my first PC in 1990 at uni, a 486DX 33MHz that trumped my mates who all had 386SX 16MHz machines - all apart from one poor sod who had a 286 (a bit out of date by 1990).
Motorola 6800 inspired the MOS 6502
The Moto 68K was the evolution of the most excellent Moto 6809E (16-bit internal / 8-bit external). The 6502 was an enhanced clone of the 6800 (one that MOS got sued over).
Re: Motorola 6800 inspired the MOS 6502 (@AC)
My understanding is that the 6501 was pin compatible with the 6800, since MOS Technology hoped to be able to walk up to Motorola's customers and sell it as not requiring any wider system changes. Motorola obviously had something to say about that, especially as Chuck Peddle - chief designer of the chip - had previously been a Motorola employee on the 6800 team, suggesting a trade secrets angle (spurious, but beginning the action was enough in itself to do the desired damage). MOS backed down and pushed the 6502, identical to the 6501 except for the pinout.
I don't think the two processors shared any internal design features; they're not semantically equivalent (different numbers and sizes of registers, different addressing modes, different ways of handling decimal arithmetic, just a different instruction set overall) and certainly aren't byte-code compatible.
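To make the decimal-arithmetic difference concrete: the 6800 does a plain binary ADD and then an explicit DAA instruction to correct the result, while the 6502 has a decimal flag (SED) that makes ADC apply an equivalent correction automatically. A rough, illustrative Python model of the adjust step (not cycle-accurate, flags simplified):

```python
def bcd_add(a, b):
    """Add two packed-BCD bytes: binary add, then a DAA-style
    decimal adjust. On the 6800 the adjust is a separate DAA
    instruction; on the 6502, setting the D flag makes ADC do it."""
    r = (a & 0xFF) + (b & 0xFF)
    half_carry = ((a & 0x0F) + (b & 0x0F)) > 0x0F
    if half_carry or (r & 0x0F) > 0x09:   # fix the low digit
        r += 0x06
    if r > 0xFF or (r & 0xF0) > 0x90:     # fix the high digit
        r += 0x60
    return r & 0xFF, r > 0xFF             # (result, carry out)

assert bcd_add(0x19, 0x28) == (0x47, False)   # 19 + 28 = 47
assert bcd_add(0x99, 0x01) == (0x00, True)    # 99 + 01 = 100, carry
```

Same arithmetic either way; the two chips just expose it through entirely different instructions, which is the point - nothing about the binaries is interchangeable.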
What's up with the Celeron and Xeon?
Can anyone explain to me why the Celeron and Xeon (the Celeron especially) look so different to the others? Instead of the standard 'functional blocks' look, they appear to have a fairly regular grid pattern. If I didn't know better I might have thought that someone had taken photographs of the wrong side and those were connectors emerging or something.
Enlightenment gratefully received.
Re: What's up with the Celeron and Xeon?
It looks like those pics have the package pin layout superimposed, or something like that.
I used to "game" on 8088/8086 machines that a mate's dad had at home.
From Top Gun wireframe b/w to Joust and Mean18 Golf on CGA screens. I also used Windows 1.0 before 3.0/3.1/3.11, and DOS 3.0-6.22.
I never owned an ST or Amiga, and owned my first PC back in 386 days.
Great to see all this old tech, and what could be achieved on so few resources!
Having had a very early clone of the PC/XT (an Olivetti something-or-other, running an 8088 CPU) in the late 80s, after having had fun with a ZX81 and then an MSX, I also have fond memories of 4-colour graphics and silent games (no sound card included).
I actually had an EGA graphics card (640*350 in 16 colors!), but no software I had could use its potential.
I also had Windows 1.0 disks but can't really say I managed to use it. The first really usable version of Windows was 2.1, and even then there was only a very limited amount of software that made use of it, Aldus PageMaker being one of the few.
My next PC was a 486 33MHz with 8MB of RAM! It was actually more powerful than the diskless Sun workstation I was using at University. With a DOS Extender, I could already use up to 4GB of virtual memory for my image processing courses... Those were the days...
Olivetti M16 - it had an 8086 with a 16-bit data bus, as opposed to the 8088's 8-bit bus. A bit more expensive to implement, but memory access took one fetch vs two for the 8088.
The "holy grail" in those days was IBM video compatibility, usually measured by the ability to run MS Flight Simulator. The M16 didn't, but it did run dBase II, which I used to write simple accounting programs.
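The one-fetch-vs-two point can be sketched with a toy bus-cycle counter (illustrative Python; it ignores the prefetch queue and treats every access as a plain memory read):

```python
def bus_cycles(start, nbytes, bus_width):
    """Bus cycles needed to read nbytes from address `start`.
    An 8-bit bus (8088) moves one byte per cycle; a 16-bit bus
    (8086) moves an aligned word per cycle, so an odd start
    address costs one extra byte-sized cycle."""
    if bus_width == 8:
        return nbytes
    cycles, addr, left = 0, start, nbytes
    while left:
        step = 1 if addr % 2 else min(2, left)  # odd address: lone byte
        cycles += 1
        addr += step
        left -= step
    return cycles

print(bus_cycles(0, 6, 8))    # 8088: 6 cycles
print(bus_cycles(0, 6, 16))   # 8086, word-aligned: 3 cycles
print(bus_cycles(1, 6, 16))   # 8086, unaligned start: 4 cycles
```

Hence the M16's extra cost in board wiring bought you roughly half the fetch cycles for aligned data.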
Had an 8088-powered, Ajwad-manufactured XT clone. Despite sitting in the loft for many, many years it was still working perfectly in 1999 when I finally got rid of it - fantastic machine. Had a Viglen 80286 (great machine) after that, and then AMD ever since. Recently experimented with an Intel i3 on socket 1156 - disaster! It proved hugely unstable (run times of minutes before randomly resetting itself), so I went back to AMD, where I've stayed ever since.
Despite their unarguably amazing progress over the past decades (with AMD too, hanging on to their coat tails), Intel are remarkably conservative. Perhaps that's why they've done so well, but I can't help thinking that it's now killing them. x86 as an instruction set is woefully long in the tooth, and it's preventing them from getting power consumption down as low as ARM has managed.
ARM have shown that there's a load of money in a simple RISC ISA implemented in a core of fewer than 50,000 transistors, coupled with moderate caches and specialised co-processors for video codecs, etc. It makes Intel's "The CPU Shall Do Everything" approach look more antiquated than ever. For example, remember when they were trying to turn x86 into a GPU? That didn't exactly work, did it.
Another lesson that ARM is teaching the world is that the software industry *can* change ISA. Binary compatibility with x86 is clearly very unimportant to the mobile world, and even the server world is beginning to contemplate a change.
It's something that I think Intel have been overly afraid of. In the eyes of some, they botched Itanium by not making it fast enough when emulating x86. I think they shouldn't have made it x86 compatible in the first place, and should have done a deal with AMD to co-develop the architecture. That might have averted the rise of x64 - no doubt hugely successful, but somehow one can't help thinking of it as an opportunity missed.
Anyone remember the NEC 8088 lookalike that was 1.3 times as powerful? Made a massive improvement in Lotus 1-2-3 speeds. Also, does anyone remember cracking Lotus 1-2-3 so it didn't need the floppy disk in the drive before it would start up (for those with 10MB hard drives - a massive amount in those days)?
NEC 8088 clone
The V20. Yes, it's sitting in my loft in a "Falcon" clone machine.
Re: NEC 8088 clone
Seem to recall that the ICL M30 PC clones used the V20 as a go-faster option.
Re: NEC 8088 clone
The V20 was also superior to the 8086 in that it had an 8080 compatibility mode, though I'm aware of exactly one application that used it — a CP/M-80 emulator for MS-DOS.
"Or did you long abandon them for x86 rivals like AMD, VIA Centaur, Cyrix or other makers of compatible processors?"
Since about 18 months ago, I've been steadily replacing the x86 boxen with ARM ones as and when upgrade times arrive and memory capacity requirements of individual nodes allow.
From 23000 to 1.4Bn transistors in 35 years!
I worked at an Intel disti and we had the 80386 core poster. It was quite large and always seemed to catch the eye of our customers - they were astounded by the fact it contained 275,000 transistors.
I also recall the first generation Pentium and the Pentium Pro, the first ever x86 CPU with integrated L2 cache, in either 256K or 512K if one's budget could afford the latter.
Those were the days: Pentium II and Pentium III in those massive Intel Slot 1 enclosures.
Not to be excluded were the Pentium II Xeons that were also in Slot 2 packaging.
It's amazing how Intel (and others) have managed to shrink the die and packaging as they've shoehorned more transistors into the CPUs.
Marvelous stuff El Reg, a trip down memory lane, even reminded me of the ISA, EISA, MCA, VLB, PCI and AGP eras.. when jumpers were king and now I'm feeling nostalgic and may just dig my old Pentium MMX system out and boot her up...
Yeah I just love the numbers that topics like this can throw around. It's part of a course I sometimes give to new entrants to our company, and when you can pick up a (not very new) iPod for example and tell them that there are more transistors in the little box you hold in your hand than there are people on the planet it really gets their attention.
Especially given there are still some of those people who were born before the transistor was invented...
But why did it take until the 386
Before they started doing the images in colour?
Re: But why did it take until the 386
An educated guess here, but it might be that the 386 was the first one manufactured at structure widths on the order of visible-light wavelengths (i.e. not much more than a micron). That would give colour effects, because the structures then act like diffraction gratings and whole areas of the die take on a colour. Larger structure sizes don't cause this effect, at least not at near-perpendicular angles of incidence, so those dies look largely grey - apart from the intrinsic colouring of the materials used.
If you look closely enough, you'll notice some parts look reddish on the 4004/8008 (probably copper contacts), and the 286 has that little red coil-like structure on the right edge. I'd contend all these pictures are in colour.
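The grating explanation holds up numerically. A quick illustrative Python check of the first-order grating equation, d·sin(θ) = λ: features of about 1 µm (386-era) throw green light out at around 30°, while 10 µm features barely separate the colours at all.

```python
import math

def first_order_angle_deg(pitch_nm, wavelength_nm):
    """Angle of the first diffraction order for a regular structure
    of the given pitch: d * sin(theta) = m * lambda, with m = 1.
    Returns None when the pitch is finer than the wavelength
    (no propagating first order)."""
    s = wavelength_nm / pitch_nm
    if s > 1:
        return None
    return math.degrees(math.asin(s))

print(round(first_order_angle_deg(1000, 500), 1))    # 1 um pitch: 30.0 deg
print(round(first_order_angle_deg(10000, 500), 1))   # 10 um pitch: 2.9 deg
```

At 10 µm all the visible wavelengths emerge within a few degrees of each other, which is why coarser dies photograph grey rather than iridescent.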
Re: But why did it take until the 386
> Before they started doing the images in colour?
The world was in black and white back then; it only turned colour in the mid-1990s
(with apologies to Calvin & Hobbes)
I started with the 8080A and 8085 and CP/M (having worked with various minicomputers before that, including the DG Nova and PDP-11, and IBM systems even before then). In addition to the stuff I used at work, I built my own 8085-based CP/M system, and an 8086 CP/M-86 system, both built on 4x6-format vectorcards. The IBM PC with its slow 8088 was a step backwards from my homebrew 8MHz zero-wait-state 8086.
heh, I noticed Intel left out some other chips like the i432. I wouldn't have included the 4004 or 8008, as neither was ever suitable for a general purpose computer (the 8008's hardware call stack was just too shallow), nor the Atom or Celeron as both were downsized forks of the mainstream architecture... and the original Xeon was just a Pentium-III with multiple socket support.
Elonex PCs with 6MHz chips, and the "Turbo" button that overclocked them to 8MHz. We had 12 in the office, and by the end of the first year all had been sent back to be repaired at least once. The 680x0-based single board systems were a delight in comparison. Solid, far easier to program with no segmented memory and a decent assembler. It's like VHS/Betamax, the better marketing won.
Still, give me a PDP 11/83 over any of them :)
It wasn't that it "overclocked" them per se. "Turbo" was their standard clock speed; disabling it slowed the processor down a bit. If I remember rightly, this was so that old games designed for slower processors remained playable!
I go back to the 4004, though I didn't build one myself until after I'd built an 8080A system. I did assemble and test 8008s for others, though I never owned one. I wouldn't really want to go back to any of those--multiple supply voltages, too many chips to implement the system core. I do, however, still use the 8085 for fun: saundby.com
The 4004 was obviously not an x86. I wonder how much Reg authors get from copying some images from some other site?
I think the friendly article, which I assume you read says that Intel supplied the pics.
Feel like a kid compared to all you old boys talking about your processors made of wattle and daub.
My first was a third-hand Compaq 486-66. Cost me every penny of my savings account in 1992. The joys of Windows 3.1 on about 8 floppies, and playing Mortal Kombat from DOS.
Commander Keen anyone?
I'm not sure even the first PC-clone I owned (an Epson 286) had an Intel processor; everything after that has been AMD for x86-architecture machines, except for one Cyrix MII that I can't recall buying but did emanate from my parts hoard somewhat recently, and a dual PIII found in a skip.
Re: Intel competitors
It was a NEC V series processor.
Re: Intel competitors
OK, NEC V40 then.
Coming to micros via mainframes the 6800, 6502, and 68000 families seemed logical implementations of instruction sets. The Intel cpus' non-orthogonal instructions felt clumsy and illogical.
The 286 was particularly galling in the way it handled extended memory. The 386 was the first of their processors to handle large memories "properly". After that it seemed possible to keep expanding existing applications for new generations with extra memory just by recompiling. That ease seemed to stop with 64-bit Wintel PCs, where whole rewrites were needed to handle more than 2/4GB of application memory.
It also seemed that Intel never took advantage of larger gate counts to implement a more secure hardware architecture - like that developed on the ICL VME mainframes. The use of descriptors to define the limits of a piece of data would have gone a long way to preventing buffer overflow exploits. Do Intel support such features now?
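To answer the question in part: the 286/386 protected-mode segment machinery did carry base-and-limit checks of roughly this kind, though mainstream flat-memory software never used them for fine-grained data. The descriptor idea itself is easy to sketch (illustrative Python only, not any real machine's semantics):

```python
class Descriptor:
    """A (base, limit) pair through which every access is
    bounds-checked - the VME-style idea: an out-of-range access
    faults instead of silently overflowing into a neighbour."""
    def __init__(self, memory, base, limit):
        self.memory, self.base, self.limit = memory, base, limit

    def __getitem__(self, offset):
        self._check(offset)
        return self.memory[self.base + offset]

    def __setitem__(self, offset, value):
        self._check(offset)
        self.memory[self.base + offset] = value

    def _check(self, offset):
        if not 0 <= offset < self.limit:
            raise IndexError("descriptor bounds violation")

memory = [0] * 64
buf = Descriptor(memory, base=16, limit=8)
buf[7] = 0xFF            # last valid byte: fine
try:
    buf[8] = 0xAA        # one past the end: trapped
except IndexError as e:
    print(e)             # descriptor bounds violation
```

The hardware version makes the check free on every access; done in software, as here, it's exactly the overhead that C programmers historically refused to pay.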
A few microprocessors...
Started out playing with the Motorola 6800 and MOS 6502 (in an Apple ][) but found more fun working with CP/M on 8080s and Z80s. I can't remember how many times I produced CP/M BIOSes for various systems and configurations on various S100 boxes - CP/M 2.0, 2.2 and 3. The last one was for Concurrent DOS, not so long ago.
During the 1980s I programmed AMD 2901 processors on a bit-sliced custom integrated maintenance computer for a big mainframe. That wasn't bad, but the next generation of maintenance processor used Motorola 68010/68020s programmed in assembler, and that was such fun! I was really gutted when Intel won the microprocessor architecture wars; the 680x0 processors were so nice to use.
After that I worked on mainframe OSes, so no more microprocessor fun.
Six years ago I threw away my original Apple ][ and a number of Altairs and Imsais and a few other odds & ends, and just kept one Altair 680, which decorates the top of my bookshelf. I'm purely an appliance computer user now, by preference Apple Macs.
However, there is somebody producing Altair kits again, and I'm sorely tempted!
Whilst compatibility is one of the strengths of Intel-centric computing
I really miss the diversity of different manufacturers making radically different machines.
I remember in the mid '90s when all of the articles in the PC magazines were essentially describing the same machine (IBM PC compatible) with the only differences being the clock speed, memory or disk capacity, processor generation or case-colour. It was the point that I stopped reading the magazines on a regular basis as computing was no longer exciting.
I am not looking forward to the point where Intel have driven everyone else out of the server-processor market, and just hope that ARM can continue to make inroads into the desktop and mobile market. If Intel can achieve total dominance in all segments, then expect innovation to slow-down as the accountants try to extract more revenue out of each processor generation to maximise the R&D costs.
Re: Whilst compatibility is one of the strengths of Intel-centric computing
Oops. "Maximise the return on the R&D costs"
I just don't see how anyone could count the 80286 as one of Intel's "great" CPUs.
Started with a Z80 myself in an Exidy Sorcerer, but always liked the 68k's design and programming model.
In the 1980s I worked as a Pascal/C programmer in a company that was using the Burroughs/Unisys B20 series of computers (designed by Convergent Technologies, a company that Unisys eventually bought). The B25 was powered by that rare beast, the 80186. Most PC makers skipped that one completely.
The 80186 was quite often used in embedded system electronics.
Also used by Fujitsu for their MSDOS 2.11 compatible PC, which came to our company because we used their IBM compatible mainframe, and by a number of White-Box (beige actually) suppliers, including the luggable my dad used.
Those chips were screamers. Large chunks of 8086 microcode had been replaced with dedicated silicon, so they ran like the later 286 (an underclocked 286, really, since they didn't clock as fast). I used to take my final-year thesis project to work to run compile/modify/compile cycles, since the 80186 ran compiles minutes faster than my V20 at home.
Plenty of companies made 80186 motherboards, but not IBM. That's the only reason I can think of that the 80186 didn't get much publicity, and always gets left out of lists like this one.
Like Pokemon - I've pretty much caught them all. And I'm only 33!
The early ones were hand me downs from my Dad. I still had my 286 at uni though in 1998, and it was a perfectly adequate word processor.
8086 8MHz (Amstrad 1512)
286 12MHz (Tandon)
386SX 25MHz (Tandy own brand)
486DX2-66 (Viglen)
Pentium 2 133MHz (beige box inherited from work)
Pentium 2 333MHz (beige box inherited from work)
Pentium 3 900MHz (home built, overclocked to a bit more if I remember rightly)
Core 2 Duo 2160MHz (HP Pavilion notebook)
Core i5 650, no idea of clock speed - irrelevant these days, it's all about the cores (Acer Predator)
There's probably a few I missed as well.
> processors made of wattle and daub
Though the main issue wasn't the processors being made of wattle and daub - it was the programming and I/O mechanisms being hand-cranked. My first encounter was with an 8080A development box. This had a "Program" mode, in which we entered individual CPU machine-code instructions and data using 8 little switches and a "Next" button, and a "Run" mode, in which the status of the data and address buses was shown by banks of LEDs - 8 for data and 16 for address. Debugging? Well, the Run mode had a single-step button...
Of course the proper computing types got to play with really advanced stuff, like punch-cards. But for us in the electronics world, "stored program" meant "write down a long list of hex numbers on a piece of paper with a pencil".
You try and tell that to the young people of today...
Re: > processors made of wattle and daub
" But for us in the electronics world, "stored program" meant "write down a long list of hex numbers on a piece of paper with a pencil".
Once spent Easter week writing a very large patch to alter the way a CTL MOD 1 Coral 66 "application" worked. It could not be recompiled, as the customer site did not have a development system. The whole thing was hand-coded in hex on paper, then typed up on paper tape for the hex patch loader to read. Those were the days when we still lived and breathed machine hex code 24/7.
In 1970, on the System 4 J O/S, there was an intermittent bug where a software I/O queue would freeze. A reboot was out of the question as jobs were halfway through, so it was a neat trick to halt the machine, then use the engineers' panel to find the queue header in memory and set the missing flag. It's hard to believe now that it was one of the biggest IBM 360-compatible machines in the UK - with a whole 1MB of memory. The 600MB hard disk had water-cooled bearings, weighed one and a half tons, and took eight hours to archive to magnetic tape.
Re: > processors made of wattle and daub
I do. They sneer at me.
I too used to be able to write down 8-bit machine code, and I built my first EPROM programmers by hand, until I got a machine with big floppy drives and never looked back.
One EPROM programmer ran on an 1802 and the other used an Intel embedded processor with built-in EPROM, reading from punched tape or a keypad. The handmade 1802 machine was needed to program the first Intel machine, after which it wasn't needed any more, because I could simply use the Intel processor to clone itself.
Re: > processors made of wattle and daub
Been there, done that! We were quoted £72,000 to run the compiler and output the object code. But we did not hand-code it - we wrote an assembler on an HP 85! It took a week and was an enormous cost and time saver.
The F100-L also had an engineer's panel, which was needed at every boot to fix an error in the bootstrap ROM. Happy days indeed.
Our office bought an IBM XT in late 1985, to help with premium calculations (I worked for an insurance broker at the time).
At that stage all calculations were done by hand (using a tape calculator), then handwritten on A3 sheets that were taped together before being sent to the typing pool to be typed on an IBM DisplayWriter, using massive 8" floppy disks that could store 256 KB.
It would take two of us about 14 days to do three years' motor claims statistics for one client (just double-checking the figures was a major undertaking: one person would read the numbers whilst the other would add it up on the calculator - then we would switch places and repeat. More often than not the totals were different, forcing us to repeat the exercise).
Once the typing pool had finished typing it all up, we had to verify everything again, plus correct typing mistakes. Once we were happy that everything tallied up, it would be presented to the account handler for approval.
Once he was happy, it would go to the branch manager (who would need to present it to the client), who almost invariably asked for alternative calculations using different excess amounts, et cetera, kicking off another two weeks of calculations.
Since I had done Computer Science at university, I was asked to spec a system to automate the process as far as possible, saving time and improving accuracy and, most important of all, to enable quick recalculations.
The system we eventually bought comprised an IBM XT (running at 4.77 kHz), with an EGA graphics card capable of displaying 16 colours simultaneously, a massive 10 MB hard disk drive (it came standard with a 5 MB drive, but I reckoned we would fill that within the next three years or so, whilst a 10 MB drive would last forever), plus a 256 KB 5.25" floppy drive. We also upgraded RAM from 256 KB to 512 KB, soon afterwards going to 640 KB.
Software was DOS 1.0, Harvard Graphics, Lotus 1-2-3 and MultiMate, whilst output was handled by a dot-matrix printer and a four-colour plotter.
The whole lot was about 10% more expensive than a new BMW 518i (so you can imagine management's reaction when one of my colleagues suggested that everyone in our department should have one of those on our desks!). To put it into perspective, my gross annual salary was about one third of the cost.
After spending a couple of weeks setting up the spreadsheets, hiring a temp to capture all the data and then creating the necessary formulas to do the calculations, the big day finally arrived when I had to demonstrate to management how the system worked (and justify the expense - whilst the project had been approved, they still needed to see it for themselves).
It was amazing: half the office was jammed into our office to watch the show. I gave a short spiel of how it all worked, then changed a couple of key values (like rates and excesses) and pressed F9 to calculate (you could not leave autocalc on, otherwise it would take an age between entries, just to recalculate the whole thing).
In less than 5 minutes we had an answer! Two hours later I had three different scenarios printed out, plus some graphs - a whole month's worth of work for four people! (We used to use a dedicated plotter, printing on 2.5"-wide paper, to print graphs. This was also painfully slow, as the machine did not have any RAM, so you had to enter the co-ordinates and colours for each graph every time. The graphs for 10 booklets, containing only 12 graphs per booklet, were a week's work, as each graph had to be cut out and glued into place as well.)
Needless to say, when the first 286 came out, we got one and shortly afterwards a couple of 386 screamers arrived.
Paris, as she looks as if she is also wiping a tear from the corner of her eye.
...running at 4.77 kHz
MHz! MHz, not kHz.
To Intel - give us the chip art references!
Like the Smithsonian collection does. You guys must be sitting on a secret list of your own chip design easter eggs, how about 'fessing up a little there ?