Doesn't that mean you only need 0 bits per pixels? So 1024x1024 only needs 0 bits. That's not very impressive, they could have easily made much higher resolutions given the amount of memory available.
The Acorn Archimedes is 25 years old this month. The first machines based on the company's ARM (Acorn Risc Machine) processor were announced in June 1987, the year after the 32-bit chip itself was launched. Four versions of the Archimedes were released in 1987: the A305, A310, A410 and A440. The first two had 512KB and 1MB of …
Grey-scale images these days normally range from 0-255, which is usually represented by 8 bits per pixel. I say normally because you can really do whatever you want - like upping the bits per pixel to 16 and having a finer set of intensities, or having 1 bit per pixel for a binary image.
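The relationship described here is just powers of two. A quick sketch (plain Python, nothing Archimedes-specific) that also explains the joke in the first comment:

```python
# Distinct intensity levels representable at a given bit depth: 2**bits.
def grey_levels(bits_per_pixel: int) -> int:
    return 2 ** bits_per_pixel

# 8 bpp -> 256 levels (0-255), 1 bpp -> a binary image,
# and 0 bpp -> exactly one "level": every pixel identical, i.e. a blank screen.
```

So grey_levels(0) really is 1, not 0 — which is why zero bits per pixel buys you nothing but a blank display.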
If you had 0 bits per pixel you'd have a blank screen since you couldn't even represent white/black or whatever a 0 and 1 would equate to in your display (green/black?).
I've no idea how it was done in these machines though, I was only a wee lad when I was playing with Lander at primary school and being blown away by it.
Wasn't there a weird Doctor Who game for it too? A side-scrolling platformer with a mad-scientist-looking guy in it?
"If you had 0 bits per pixel you'd have a blank screen since you couldn't even represent white/black or whatever a 0 and 1 would equate to in your display (green/black?)."
I think the original comment was a joke.
Anyway it was a colour display ... after all, to quote the Blues Brothers, it "does both colours, black and white"
Actually the high-res mode (which needed a special monitor) was 1152x896, using one bit per pixel. Sometimes it's worth using Wikipedia for fact checking. All the Archimedes-based systems could do 1, 2, 4 or 8 bits per pixel. There was also a bash at resolution independence - coordinates were downsampled according to the mode, so in 1152x896 one coordinate step mapped to one pixel, but in mode 12 (640x256, 16 colours) pixels were two coordinates apart horizontally and four coordinates apart vertically (the next pixel above and to the right of 0,0 was 2,4). Mode 13 (320x256, 256 colours) had pixels four coordinates apart in each direction.
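The downsampling scheme described above can be sketched like this. The units-per-pixel values follow the comment and the standard 1280x1024 OS-unit space, and the mono mode's number (23 here) is my assumption — treat the table as illustrative rather than gospel:

```python
# RISC OS-style resolution independence: each screen mode declares how many
# OS coordinate units one pixel spans, and graphics calls divide down.
MODES = {
    # mode: (x units per pixel, y units per pixel)
    23: (1, 1),   # 1152x896 mono: one coordinate step per pixel
    12: (2, 4),   # 640x256, 16 colours
    13: (4, 4),   # 320x256, 256 colours
}

def os_to_pixel(mode: int, x: int, y: int) -> tuple:
    """Convert resolution-independent OS coordinates to a pixel address."""
    sx, sy = MODES[mode]
    return x // sx, y // sy
```

With this mapping, the next pixel up and to the right of (0,0) in mode 12 really does sit at OS coordinates (2,4), as the comment says.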
You *could* do 640x512 in 256 colours (mode 21) with a MultiSync monitor, but before the ARM3 turned up with a cache, it didn't leave much bandwidth for the CPU to do anything. Later systems like the A5000 with faster RAM added mode 31 (800x600, 256 colours) etc. And the VIDC20 in the RiscPC added 16- and 32-bit modes and a more programmable video clock.
There are even more rabid fanboys than I in the RISC OS world, so I'll preempt them (or maybe just warm them up) with a few points:
"Risc", as you point out, is an acronym. Thus it should be RISC. Ditto "RiscOS", which should also be two words: RISC OS. "Rom" should be ROM. Confusingly, the last Acorn machine was the RiscPC (in lower case), but the official logotype has a sort of half-space, so most people write is all joined up as I do. Finally, the mouse buttons on an Acorn machine were referred to as Select, Menu and... Adjust, not "Alter".
There seems to be some confusion about the difference between Lander and Zarch: the former was a demo of the latter. The demo (Lander) shipped with most machines in this era, while the game (Zarch) was a commercial full product. It was exactly the same game on the PC, but known as Virus. The player's ship in that was not triangular (it was non-symmetrically pentagonal), and is the same shape as the Copperhead ship from Bell and Braben's game Elite.
Two final points. First: "Archimedes morphed into the Risc PC line, a series of ARM-based boxes designed to run Windows – on a co-processor, and presented in a Risc OS window." No. Just no. The RISC PC was never *designed* to run Windows - it was designed to run RISC OS natively, and to use hardware-assisted emulation to run Windows within RISC OS. The "co-processor" wasn't a true one in the sense you seem to be implying; the primary CPU was an ARM610, 710 or some variant of the StrongARM processor, while the secondary CPU was a specially designed 486 or 5x86 card which could *only* be used by the emulated copy of Windows. It was also not a default item included with the RiscPC, but usually a separate purchase.
Finally, there's the fact that you say the line stopped with the death of Acorn/Element 14, without mentioning anything about the two spin-off companies RISC OS Ltd or Castle Ltd. Acorn's demise left RISC OS at 4.02, and ROL/CTL developed this further in two confusingly-numbered parallel branches, known as RISC OS 5 (Castle's) and RISC OS 6 (ROL's, whose interim releases were branded RISC OS Select and RISC OS Adjust). There were also the Iyonix, the Omega, and some other hardware designed and sold in the post-Acorn era. And last but not least, there are ongoing efforts to port RISC OS to small boards like the BeagleBoard and the Raspberry Pi, these mostly happening through RISC OS Open Ltd (ROOL), a spin-off created when Castle decided to open parts of the RISC OS source.
Reg Hardware style (generally) is to present acronyms as they are pronounced. Thus 'Risc' because it is pronounced 'risk' not 'r-i-s-c'. Likewise 'Rom' but 'CPU'.
'ARM' is not 'Arm' because it is a trademark.
And yes, I'm sure you can find inconsistencies if you're nerdy enough to look for them.
Jason: Thank you for picking up most of my ranting! Although I'd point out that Virus was not *exactly* the same as Zarch (and not just because it ran slower). I've clocked Zarch (I still have the disk), but I struggled with Virus on an Amiga.
This has reminded me that my wife made me get rid of my Archimedes (A310, upgraded to 4MB) a couple of years ago. I cried, even though I still have my Spectrum and my RiscPC is still in the family. I may get one from eBay and hide it somewhere, although it obviously won't be the same.
I had a 3010, and still have a RiscPC StrongARM - which still runs!
Christian, Acorns before the RiscPC had to share video memory with program memory, all 0.5 to 4MB of it.
The RiscPC was available with an optional 1 or 2 MB VRAM stick.
Note that those memory amounts are MEGABYTES not GIGABYTES. ISTR paying over 100ukp to upgrade my A3010 to 2MB.
(Mine's the one with a Risc OS Programmer's Reference Manual set in the pocket)
Fantastic machines, massively ahead of their time. The 8MHz ARM2 (not 4MHz as in the article) was about 25x faster than the BBC B, 15x faster than an original PC and 6x faster than a PC AT on Byte benchmarks. It came with built-in better-than-VGA colour graphics and four-channel sound - no need for graphics and sound cards before it could do even the most basic things. They could also take four (not two) podule expansion cards, including one providing all the digital and analogue I/O we enjoyed on the Beeb.
I remember frequenting a small computer shop which had Atari and Amiga sections. One day they put the new Archimedes on display and my jaw hit the floor when I saw the lander demo. While Zarch eventually appeared on the Amiga & ST (and ZX Spectrum!), at the time it was just incredible to see the raw power on display.
Sadly for Acorn, the power was matched by absurd prices which instantly meant it stood no chance supplanting either the Amiga or ST. It cost over 2x the price of the integrated versions of each of these which meant nobody could afford it. Being Acorn they found sales in the educational sector, but I can't help thinking they would have had a lot more success outside of that if the A3000 had been there from the beginning with its 520ST/A500-style form factor and lower price.
Still I guess someone from Acorn had the last laugh given how ubiquitous ARM processors are these days.
"Sadly for Acorn, the power was matched by absurd prices which instantly meant it stood no chance supplanting either the Amiga or ST. It cost over 2x the price of the integrated versions of each of these which meant nobody could afford it. "
IIRC the schools had a "must provide kit at half RRP" requirement for suppliers, so RRP for kit targeted primarily at the edu sector was doubled so suppliers did not lose out on the majority of their sales.
Of course, this just meant that parents ended up paying double for "edu"-targeted kit, or edu kit suppliers went bust.
Neat machine, but ultimately beyond my means. A year later I was coding image processing software for a living on an 8 MHz 80286 with 640 MB RAM, and a Matrox PIP1024 image capture and processing board, which had a whole 1MB of RAM. I yearned for the vast 4MB RAM of the A440.
You'd only need a 32-bit CPU to natively address 640MB (actually what you'd need is a CPU with an address bus at least 30 bits wide.)
The 80286 did come with an on-chip MMU, which allowed for logical addressing up to 1GB in protected mode, but sadly it was limited to 16MB of physical RAM (24-bit address bus). You could use additional hardware to provide bank switching (a la Expanded Memory), but I don't personally know of this being used to access up to 640MB of RAM.
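The arithmetic behind both of these comments is a one-liner each way — how much memory a given address-bus width can reach, and how wide a bus you need for a given amount of memory:

```python
import math

def addressable_bytes(bus_bits: int) -> int:
    """Memory reachable with a given address-bus width: 2**bits."""
    return 2 ** bus_bits

def min_address_bits(num_bytes: int) -> int:
    """Smallest address-bus width that can reach `num_bytes` of memory."""
    return math.ceil(math.log2(num_bytes))

# A 24-bit bus reaches 16MB (the 286's physical limit), and
# 640MB needs ceil(log2(640 * 2**20)) = 30 bits, as the comment above says.
```

Hence the quip that 640MB of RAM would have needed at least a 30-bit physical address bus — well beyond the 286's 24 bits.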
Sounds pretty much the same as mine. There was one that only the teacher was allowed to use, but on the one occasion I got to use it (all the others were in use for some reason) the speed was amazing in comparison :)
I don't remember much from the lessons, we were only introduced to them when I was about 6 and the school replaced them with some Windows 98 beasts when I was about 8. All I do remember is that game where you have to transport animals across a river in a boat without them eating each other (I finished first out of the group and got a sticker. Proud.) and being very annoyed at someone in my class who kept adding incorrect spellings to the dictionary rather than correct her mistakes.
El Reg: you're making me feel old!
All misty-eyed now. Zarch was great - there was a great feeling of motion about the landscape, especially using thrust and letting gravity drop you back down again.
I had an A310, extra RAM, backplane, extra floppy drive - must have cost me close on £1000. Well worth it!
I was writing games in assembler back then for Spectrum, C64, Atari ST. I could render 3D shaded graphics faster in Basic on the Archimedes than I could in assembler on the 8/16bit platforms!
I still have my working A5000 20 years later. I keep changing the hard-disc in it every so often (they seem to last about five years or so), but it runs as well as it ever did.
I ended up with nearly 1GB of storage in it (a couple of near-500MB hard discs). The thing is built like a tank, with very solid metal casing, and just seems to keep going. I still have my black and white "hand scanner" (you know the sort, works on rollers) and a pile of software. Probably the best thing was that I ran Windows 3.1 via !PCEm and it capably got me through my college days whilst everyone else seemed to be buying PCs.
ARM machines were way ahead of anything else at the time; I just think that Acorn sucked at marketing them in the right way.
Have you tried replacing the hard drive with an emulator? There are projects out there that replace the floppy and hard drives of old kit with something that can interface to USB. Probably the best one is the HxC emulator, which definitely supports the Acorn A3000 - http://hxc2001.free.fr/floppy_drive_emulator/support.htm
"Have you tried replacing the hard drive with an emulator? There are projects out there that replace the floppy and hard drives of old kit with something that can interface to USB."
Sounds overly complex to me. An old 1GB CF card and a cheapo 99p CF <-> IDE adapter from eBay, and my RiscPC has both an upgrade and an extension to its life. It's also miraculously silent - I didn't realise that for years, the loud whirring noise wasn't fans or other components, it was my clunky old 800M HDD!
Yes, I recall that the very early machines (of which I had one) contained the OS on EPROMs (which have a transparent window in the top, to enable UV erasing) and they were both expensive and re-usable, which is why they wanted them back.
I'm pretty sure that I paid something like £1200 for my A310 (including the Acorn branded monitor) and was somewhat taken aback to arrive home one day and find that the courier had left the boxes piled up on the doorstep (and I wasn't living in a very salubrious area!). Luckily no harm done and I was blown away by the speed and the graphics - there really was nothing comparable at the time for the home user.
I went on to upgrade to the RiscPC before finally turning to the "Dark Side" of a Windows PC late in the 486 era, primarily because it had better games. Still miss the fast boot and the simple "rightness" of the engineering (software and hardware) of those old RISC OS machines.
I remember a friend's brother had an Archie. Aside from being completely blown away by Lander (its reincarnations on the ST & Amiga were good, but nothing like as smooth as the Archie version), most impressively I got to play around with a realtime fractal tool which allowed me to create a fly-by landscape almost the same as the Genesis sequence in The Wrath of Khan. On a home PC.
Fantastic machines, but the bit about Lander/Zarch/Virus or whatever else it was called not being possible to implement on Motorola 68k based machines is rubbish. I played Virus on an Atari ST - it was pretty much indistinguishable from the Archie version (and just as hard).
At the sixth form college I went to, the Archie was the only computer they were allowed to buy. This caused annoyance in the music department, since the studio standard at the time was the Atari ST. The department head finally gave up trying to source a MIDI interface for the Archie, and got around the computer purchasing rules by buying a hardware sequencer.
We had an A5000 in our music department at school with a MIDI interface.
We were aware that MIDI interfaces were available, but the problem for my college was getting one as their supplier didn't sell it. As for Sibelius, wasn't that more geared up for producing sheet music rather than sequencing? What the head of the music department wanted was an ST running Steinberg Cubase (or it may have been Pro 24 back in 1989).
Rubbish, nothing to do with number of "bits" (both 32-bit ISAs), everything to do with a huge difference in cycles per instruction. (32-bit instructions were only slightly slower than 16-bit ones on the 68000, most of the difference between the ARM and the 68000 was due to the fact that one was implemented as a pipeline because it was easy to do so, the other was not - because it was next to impossible.)
I remember these from my school days, and how in awe we all were about the GUI. Having sneaky games of Lander when the teacher wasn't looking. Using the art program to draw the usual cock-and-balls, etc.
I found a great bug in the operating system that you could use to bork your classmates' floppy disks. You could drag and drop a parent directory into its child, and then it would recursively copy until the disk filled up. For some reason, it wasn't possible to simply delete the directories, so your only recourse was to reformat the disk.
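The bug is a classic missing guard: a naive recursive copy never checks whether the destination lies inside the source, so it keeps re-discovering the files it just wrote until the disk fills. A modern routine refuses up front — a sketch (function names are mine, not anything from RISC OS):

```python
import os.path

def dest_inside_src(src: str, dest: str) -> bool:
    """True if `dest` is `src` itself or a descendant of it -
    exactly the drag-and-drop that triggered the Archimedes disk-filler."""
    src = os.path.abspath(src)
    dest = os.path.abspath(dest)
    return os.path.commonpath([src, dest]) == src

def copy_tree(src: str, dest: str) -> None:
    if dest_inside_src(src, dest):
        raise ValueError("refusing to copy a directory into its own subtree")
    # ... the normal recursive copy would go here ...
```

Note that a plain string-prefix check isn't enough ("/a/bc" starts with "/a/b" but isn't inside it), which is why the sketch compares whole path components via `commonpath`.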
My dad bought an A310, then we got an 80MB HDD for it, then we got an A5000, then dad got a RiscPC and upgraded to a StrongARM with the PC card as well. Finally when I was a student what else was I supposed to spend my money on ;)
BASIC V was amazing, I believe the whole module would fit into the cache on a StrongARM so it ran interpreted BASIC ridiculously fast.
You're right about the BASIC on the ARM. As well as being one of the better BASICs around, with proper control structures, it was incredibly fast.
I wrote various programs in both C (compiled with the Acorn ANSI C compiler) and BASIC and in almost every case the BASIC version ran faster. This included number crunching programs.
I loved my BBC micro. The first affordable ARM machine was the Acorn A3000. So I got that when it came out. By then I'd already been programming ARM code on a Uni Archimedes 305. I've known ARM assembler for almost 25 years. I feel old!
I still have my A3000 (plus a spare for parts) and the RiscPC I replaced it with (plus a couple of spares, including one that was an ex-Acorn development machine). My RiscPC had the 486 card, 4MB RAM, 2MB VRAM and a 130MB SCSI HD. Alas, the HD has died.
Basically those machines founded my career. On the A3000 I developed my first piece of commercial software. I produced an ARM Linux distribution for them in the 90s. I am now a software consultant and my work still requires me to do ARM dev. Finally to go full circle my Raspberry Pi turned up this morning. I will be putting RISC OS on it to play with.
Don't worry - if you want to feel old, try still remembering 6502 instruction codes in hex!
I rather missed out on playing with ARM in the early days. I suspect I have a few years on you. I do remember playing lander on one once, and having a true jaw on the floor moment. An insanely powerful machine. Unfortunately I couldn't afford one :'(
Apart from that, I'm not going to talk to you any more... I'm still waiting for my RaspPi :-(
IIRC the last three byte pairs of the 6502 memory map ($FFFA-$FFFF) were the vectors for NMI, reset and IRQ... Although I could be wrong, it's been a while!
I shall consult Rodnay Zaks when I get home tonight!
Other scary things stuck in my mind:
No wonder there's no room left for anything useful!
> try still remembering 6502 instruction codes in hex!
Haha! Those were the days.
Who didn't enjoy hooking the reset key on an apple ][ to move an image onto the graphics page and display it?
At least for my A-level exams they had the decency to base questions on assembly. I know - they were soft on us!
I credit the Amiga as the real computer that I cut my programming teeth on, but without the Archimedes at school I think I would have lost interest in computers as at the time I only had a C64. Can't for the life of me remember the name of the database application we used but at the time I was very impressed!
I'm not sure why people are saying that Lander was written in BASIC. To the best of my knowledge, it was pure ARM code, and I'd be astonished if it went anywhere near Acorn's triangle drawing routines. BBC BASIC on an Archimedes is fast (and the Arthur, although not RISC OS, desktop was written in it), but it's not *that* fast. Minotaur, one of the first commercial games launched for the Archimedes (alongside Zarch), *was* written in BASIC, however.
(Speaking of BASIC, I'm not sure about this "press reset twice and you get the program back" thing. Not in my memory you didn't. I also remember the reset switch being on the back, right next to the headphone socket, where it was easy to reset the device when plugging headphones in. It was still on the back on the RiscPC, even though the power switch was on the front. Never understood that...)
As for comparisons to Virus, I believe Virus had some more enemies - it got harder faster than Zarch - but it was also noticeably less pretty; for example, there was no depth cueing of the background (in Zarch, everything got darker towards the back of the screen). I'd assume that the Amiga version used the blitter, although since there was an ST port I can't guarantee that. I've never played the Spectrum version, but it's high on my "most preposterous port" list. Lander, of course, didn't have all the enemies, let you blow up on the launch pad, and didn't clip the front edge of the landscape properly - but it was very good at training people to play Zarch! (I still maintain that I ought to be able to fly a helicopter, should I ever need to, because of this game.)
Part of the exciting colour scheme of Zarch was that it could use the 256 colour mode, back when the best PCs had original VGA graphics. The 256 colour mode had 16 palette entries (that most people didn't touch) and the rest of the values derived from them; the default mapping was an effective perceptual HSV scheme, accessed from BASIC by 64 base colours (setting the top two bits of each channel) and four levels of "tint" (setting the bottom two bits of all three channels at once), giving you fine-grained luma control. It might not have had all the colours of HAM6, but it was pretty effective. Despite a brief foray into VU-3D on the Spectrum, it was probably Render Bender that taught me to think in 3D graphics (and now I work in graphics professionally).
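The base-colour/tint split described above decodes roughly like this. The bit layout here is illustrative, not the VIDC's actual (more irregular) wiring, but it shows how 6 bits of base colour plus a 2-bit tint shared across all channels give 256 values with fine luma steps:

```python
def decode_256(pixel: int) -> tuple:
    """Map an 8-bit pixel to 4-bit-per-channel RGB, Archimedes-style:
    6 bits pick one of 64 base colours (2 bits per channel, forming the
    channel's top bits) and 2 bits of 'tint' fill the low bits of all
    three channels at once.  Bit positions are assumed, not the real VIDC."""
    tint = pixel & 0b11
    r = (((pixel >> 2) & 0b11) << 2) | tint
    g = (((pixel >> 4) & 0b11) << 2) | tint
    b = (((pixel >> 6) & 0b11) << 2) | tint
    return r, g, b   # each 0-15, i.e. 4 bits per DAC channel
```

Because the tint moves all three channels together, stepping it nudges brightness without shifting hue — the "fine-grained luma control" the comment describes.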
I'll believe you, it's just that this thread is the first I've heard of it. There can't have been much logic on the BASIC side, and even a CALL statement to thunk between the two had quite an overhead, so it just seemed unlikely to me. If there's a reference to this, I'll be interested. (Or I may be able to find my old disks and have a look.)
I'm prepared to believe that it might have used BASIC to set itself up, but that seems less likely than doing any BASIC when the program itself was running. I'll go and google this now, but I would have thought that I would have remembered...
[And, while I'm eating my words, the reset button on the Archimedes was, of course, on the back of the keyboard, where it was useful if slightly prone to getting poked by the coiled keyboard cable - although it meant the keyboard was nonstandard. The RiscPC's "normal" PS/2 keyboard meant the reset button was, as I said, at the back of the machine, where you'd hit it plugging in headphones. Clearly I'm going senile.]
I absolutely agree about the mouse. Even after Microsoft eventually worked out that their mice had a second button, mousing on Windows still seems stupid compared with the Acorn approach. (Actually, menus at the edges of windows are the worst of all possible worlds - not near the mouse, not where you can get them quickly. I take the Amiga's scheme - like the Mac but invisible until you hold down a mouse button - as second best.)
Add in MouseAxess (we don't need no stinkin' window borders to move things...) and you got a system which was far more usable without a keyboard than most modern PCs.
Which brings me to the things you can do with the three button mouse. None of this "shift-click" to multiple select, that's what Adjust was for. Drag a window without bringing it to the front? Use Adjust. In the file manager, decide whether you want a new window for the directory you're entering or to re-use the current one? Select or Adjust again. I want to say the same thing about the difference between a copy and a move, but it's been too long for me to remember. And, of course, Acorn had the most sensible file save mechanism I've seen (why on earth does every application in Windows need its own way of viewing the file system?)
Ah, rose-tinted goggles. Shame about the lack of pre-emptive multitasking...
"Having to track your mouse pointer up to the top of the screen in order to access a menu on Apple Macs"
You might want to read up on the reasoning behind Apple's approach: it makes a lot of sense to put menus and icons right at the edge of the screen. The physical edge means the pointer can't overshoot, so the "height" of the elements is effectively infinite, making them practically impossible to miss.
Compare with the MS Office 2007 "Ribbon", or the Windows "menu-in-each-window" approach, both of which require the user to stop the pointer much more precisely over the icons and menus. Similarly, pop-up menus still require greater aim.
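The "effectively infinite height" argument is Fitts's law: predicted movement time grows with log2(distance/width + 1), so a hard screen edge, which lets the target's effective size grow without bound, drives the difficulty toward zero. A sketch with placeholder constants:

```python
import math

def fitts_time(distance: float, width: float,
               a: float = 0.0, b: float = 1.0) -> float:
    """Fitts's law movement-time model.  `a` and `b` are device/user
    constants (placeholders here; normally fitted from measurements)."""
    return a + b * math.log2(distance / width + 1)

# A 20px-tall in-window menu bar vs. an edge target whose effective height
# is huge because the pointer physically cannot overshoot it:
# fitts_time(500, 20) is far larger than fitts_time(500, 2000).
```

The same model explains why pop-up menus under the pointer are fast too: the distance term, not just the width term, collapses toward zero.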
That's why Apple has generally preferred the single mouse button approach: the mouse is conceptually a finger that you point at things; the button is a tap of that finger. Simplicity. The mouse was never intended as a primary device for power users: the idea was that you'd learn the keyboard shortcuts to your most common operations. (OS X lets you map any key combination you like to any application menu. This is handled at the system level, not at the app level, and it has always been there, even in the pre-OS X days.)
There are three modifier keys on Apple keyboards as standard—"Command", "Option/Alt" and "Control". The PS/2 keyboard only has two official modifier keys; the Windows key can also be used as one, and Windows itself supports some commands on it, but it's rarely used.
Add in the shift key and you have a lot more key combinations you can use to access application shortcuts directly than on a normal PC. Literally thousands of combinations, so you have plenty of scope to avoid clashes with hard-coded shortcuts. All without using the mouse at all.
The mouse should never, ever, be seen as a primary input device for application functionality. It should only really be needed to manipulate graphical elements. Accessing menus and icons is something it can do, but experienced users should be using keyboard shortcuts for those, not accessing them with a mouse.
You guys do know there are textbooks and proper science behind all this UI design stuff, right?
"You might want to read up on the reasoning behind Apple's approach:"
OK, so I get that. But why not abandon it when technology made it annoying/stupid? Using an Apple 30" screen with a giant desktop: got a window down the bottom of the page, all the way up to the top again for the menu, then all the way back down to the bottom again. Stupid, stupid, stupid.
Dual monitors? HAHAHAHA, let's track all the way back over to the 'main' monitor, then all the way back to our app. Very, very, very, VERY stupid.
Just because there are text books does not mean that the way of working is correct or to everybody's liking. Take sociology for example......
I'm sure that there are many things that are completely insane that I can make rational sounding arguments to support. Try reading Douglas Adams' books for rational absurd reasoning (although, yes, I know he was a Mac User, but I'll forgive him that because of his genius)
The Mac way of working is fine if you use single or small numbers of applications. Not for many applications on a screen, like I use all the time.
And the argument about power users using key combinations is crazy. In my world, where I use Windows, CDE on UNIX, KDE, GNOME, and (god forbid) Unity, you just cannot learn every one of the myriad of key sequences. And in case you ask, I am an Emacs user, so am used to quite complex sets of key strokes.
Yup, studied the textbooks, own "The Design of Everyday Things", did HCI as part of my CompSci course, was a member of SIGHCI for a while. The Mac/Amiga solution is better than trying to aim for the top of a window - as you say, there's a hard stop. However, it doesn't scale well to large or multiple monitors. The Acorn solution of popping up a menu in the same position relative to the mouse meant that muscle memory for menu access worked very well - compared with flinging the mouse at the top of the screen between each interaction, at least; you're incorrect about claiming the need for "greater aim" because the menu was already under the mouse when you start. It's true that context-specific menus (changing the mouse pointer, there's an idea for Microsoft...) needed aim, but no more than pressing a button.
There were plenty of keyboard shortcuts available for common operations on RISC OS, but they were much less necessary than on other systems - claiming power users weren't what the mouse was designed for doesn't mean that making the mouse interface as powerful and usable as possible was a bad thing. Sure, Impression Publisher (which had its own hot keys should you want them) isn't as powerful as InDesign (although it can occasionally give Word a run for its money), but InDesign is unusable if you're using one hand to hold the reference document that's the source of your layout. I'm a little confused as to which keyboard you've been using that has only "Alt" and not "Alt Gr", let alone separate Windows and Menu keys, but - much though I love Emacs, ctrl-alt-meta-cokebottle-x is not a user-friendly short-cut. Acorns had a "copy" key that, who knew, copied things.
I like Macs, but my HCI lecturer was a bit prone to claiming that their interface was perfect - notably "don't make nested menus too deep because you have to click every time to expand them" (not on RISC OS you don't). Acorn's interface guidelines would still do a lot of modern developers good - particularly "never write a large amount of text in a dialogue box and put OK and CANCEL at the bottom - name the buttons for what they do". Acorn never asked you to eject a floppy disk by dragging it to the recycle bin or popped up a "disk not recognized - format? [ok]" box, nor did it expect you to shut down the system by clicking "start" or decide whether it was going to copy or move a file according to where you were putting it. There were some really nice touches - expanding a window off the bottom/right causing the top/left to grow springs to mind.
Not that everything was perfect. It's nice to be able to resize windows from more than one corner (I had a plug-in, although I still think twm had the nicest solution). There was still the odd UI clanger ("Please insert RISC OS 3 ROMs and press any key to continue"), as a co-operative system it could still get locked up by a misbehaving app (although app killers help), it wasn't as dynamic or secure as a modern OS. But some stuff really was done right, and still isn't by almost anything else.
I spent many hours playing 4th dimension games on this... E-Type, Saloon Cars, Nevron, Apocalypse, etc. Maybe you could do an Antique Code Show of these.
We had a word-processor, a paint program and a 9 pin dot-matrix printer. And an adapter that allowed you to use Amiga joysticks via the parallel port. And a handheld scanner!!
The best time came when my mum brought home a digital camera from school that used floppy disks!
Crikey, nostalgia o'clock.
Spent hours playing Lander on the A5000, as well as other games. I remember E-Type being compatible from the A3000 that the next-door neighbours had, but on the A5000 the average speed of the car would be about 300mph - the faster processor seemed to result in a faster car! It was also my introduction to SimCity (many lost hours), and I suspect if I asked, my parents would still have it in the loft somewhere and it would no doubt still work.
Oh and !Draw was an awesome free package, my brother designed an extension for our house using it (never got built) and the kitchen for our new build house (which did get built), just so much easier to use than Visio and massively cheaper than any fancy CAD package.
And the boot-up time of 2 seconds eclipses Windows 7 on an SSD no problem...
I remember these machines so fondly from school.
We had three rooms packed with A3000s that were hooked up via Econet to a shared SCSI hard disk that was turned on and off with a key, and sounded like a gravel chipper. Just before I left school all the machines were updated to A7000s - they were just beautiful, so fast.
My favourite package at school was called Wordsworth (I think?) - it was a word processing and DTP package, incredibly easy to use and very powerful. Reminds me of Apple's Pages today.
The number of fonts included for free with RISC OS was astonishing too. Anyone else write EVERYTHING in that tube/bubble font?
I'm not sure about "problems" as such, although it wasn't until RISC OS that there was a proper multi-tasking interface, the draw module is an epic piece of useful coding, vector font rendering (as of later versions) was way ahead of its time (although so was the bitmapped antialiasing of Arthur), and in the newest versions, the SpriteExtend module (dynamically expanding JPEGs to render them stretched and dynamically mapping the output to the screen palette...) makes me wonder why a lot of modern systems struggle so much.
Hands up if you remember the dark blue/light blue version of Arthur?
Without question, the original Archimedes computers were the heralds of a new era in computing primarily because of the ARM processor. At the BBC@30 event in Cambridge a couple of months ago Steve Furber made a few remarks on the panel that were pretty revealing.
Firstly, yes they had tried to liaise with Intel about the use of the 80286, except they couldn't understand why the pinout had a shared address and data bus as it crippled the performance in their view. They wanted Intel to license the core to them and they'd repackage it. Intel declined, the rest is history.
Secondly, they visited WDC (which produced the 16-bit enhanced 6502, the 65816) and discovered it was essentially a one-person operation. They figured that if one guy could develop a CPU, so could they.
Hence the Archimedes was released, using a UK-designed RISC processor, at pretty much the same time as the FIRST commercial RISC computers in the US, except that the Archimedes was aimed at consumer machines rather than high-end workstations - a decision that led to ARM being the dominant 32-bit CPU today, eclipsing Intel's unit sales by an order of magnitude, and the standard-bearer for RISC itself.
-Cheers from julz (the designer of the 8-bit DIY Computer FIGnition).
Being a child of the 80's, I grew up first with the BBCs, then the truly spectacular (for the day) Archies. Started with A3000s, then A310s and A320s, and completed my GCSE coursework by demonstrating a database application built using dBase, running under DOS within the PC emulator upgrade card the school had bunged into a Risc PC. I also got a Saturday job working for a computer shop specialising in selling and repairing Arcs, but sadly that didn't last long, as by the mid 90's the x86 architecture had more or less killed off the demand for the green acorns.
I loved the software that was available (often at 'less than RRP' ;) ) as it was passed around between friends. Lander was amazing, but there was also some kind of tank programme that looked a bit like Lander. 'Funky Demo' (video on YouTube) was mindblowing for the time, with a 3D-looking animated character dancing to music. Top stuff!
The demo I am trying to trace was a little animated sprite piece, with an animated railroad that went along the screen. A steam train would get hijacked by bandits and the gold bullion stolen. Then the cops would give chase. No idea what it was called - any ideas please?
I can't believe it's 25 years; feels like 25 minutes.
I remember talking to the owner of a little computer shop in Stamford, near where we lived at the time, who had the A310s in when they first came out. He said one of his customers bought one to replace an Acorn Electron he used to help run his business. The chap came back in a few days after he'd bought it and told him he'd transferred the application he was using (sorry, brain needs a defrag, no idea what it was) from the Elk to the Arch and ran it as normal. He went off to make a cuppa and came back to find the Arch claiming it had finished. He thought he'd done something wrong, so checked it all out, but his results were all there. The surprise was that the Elk used to take all night to do the same processing!
If I remember correctly, I went to one of the first big shows where the Archimedes was being shown... may have even been the first big show... The visual stuff being shown off at the time, bearing in mind I'd only really seen 8-bit systems (e.g. ZX Spectrum, Commodore 64) and early IBM PCs, was mindblowing... The various BBC Micros with Music 500 and Music 5000 systems were blowing me away with the sounds... I came away with the foldout Archimedes brochure (so nice to see the "paving slab" image again) and the separate single-page price list, and *SO* wanted one... Never got one...
The 25 year anniversary doesn't make me feel quite as old as the 30 year ZX Spectrum anniversary earlier this year, but look at the difference in technology that "just" 5 years made... In comparison it feels like we've been standing still for ages now and it's just the software moving on... IMHO... :)
I remember being around 11 when I was introduced to the Archies at school. It was like the world had been turned upside down after playing around with an Amstrad 286 for the previous couple of years.
I wrote a simple BASIC program to generate numbers to play on the National Lottery when it had just started up. I felt it was such a huge achievement at the time. Sadly I never carried on learning more languages, as by the time I hit year 7 (first year of senior school, to the uninitiated) Windows was everywhere, and playing with Excel and Access seemed to be waaaaaaay more important than learning to code.
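For anyone curious, the idea of that program can be sketched in a few lines - here in Python rather than the original BBC BASIC, and assuming the original 6-from-49 UK draw; the `lottery_pick` name is just illustrative:

```python
import random

def lottery_pick(pool=49, picks=6):
    """Return a sorted list of `picks` distinct numbers from 1..pool.

    random.sample draws without replacement, so no duplicates appear -
    the same property the original draw machine guarantees.
    """
    return sorted(random.sample(range(1, pool + 1), picks))

print(lottery_pick())
```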
I also remember our whole network at prep school being backed up onto a couple of Zip disks via the "server" RISC OS workstation for our teacher, and a curious box called "Econet" that I still couldn't tell you the purpose of, other than that it had to be switched on in a very specific order.
Loved Lander, but at my school it was "Chocks" or "Chucks", can't remember... that caused the *BEEP* Acorn reset sound when a teacher came in. A primitive dogfight flight simulator.
These were the days!
My first programs were written on an A3020 (later a Risc PC 700), using WIMP BASIC. Brilliant stuff. Did anyone else follow the article in Archimedes World on WIMP development? I remember the magazine got shut down before the article finished, so I ended up emailing the guy who wrote it to ask him for the final articles so I could finish my application!
None of this DLL rubbish, just stick a ! (ping!) at the start of your folder name, and shift click to put a !Run and !Boot file in there. Brilliant stuff.
I seem to remember reading about this funny new thing called 'Java' in Acorn User... funny how I ended up starting a tech career with it!
What amuses me is, you know how you could shift-click on an application and look inside it, as it was actually a directory with a ! at the start of the name? In OS X you right-click, select "Show Package Contents" and there you are. Apps in OS X are just directories with the .app extension!
With the A310 only having 1MB, people were soon crying out for more RAM. We (CJE Micro's) designed a memory upgrade that could take them to 2MB or 4MB. To fit it, eight memory chips and one decoder needed to be unsoldered, sockets put in their place, and the removed chips fitted to our daughter board. Because of the work involved we used to arrange collection by courier, fitting and return. IIRC 1 to 2 MB was about £300, and going to a full 4MB a mere £450. I've still got a few in stock, I'm prepared to discount them!
We're now doing Raspberry Pi addons and Accessories. We won't be doing a memory upgrade though!
Acorn PC - I remember seeing the production line in about 1993, and hearing about all the home-built PCs made from pilfered bits off the production line. Same story for Canon when they made the mistake of setting up in Glenrothes.
Just add Acorn to the long list of Fail to come out of Glenrothes.
There were some really great features of RISC-OS as well.
There were no file extensions; each file had a type ID, so you didn't have the problem of a dozen different programs all wanting to be the default for ".bak", or have AutoCAD decide that it needs to register 100 different extensions for itself.
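The scheme is worth a sketch: RISC OS kept a 12-bit type number in the file's metadata, and software looked the type up rather than parsing the filename. This Python fragment is only an illustration of the idea - the table lists a few commonly cited type codes, and the dictionary and function names are made up for this example:

```python
# A few well-known RISC OS file types (12-bit IDs held in filesystem
# metadata, not encoded in the file name).
RISC_OS_FILETYPES = {
    0xFFF: "Text",
    0xFFB: "BASIC",
    0xFF9: "Sprite",
    0xAFF: "DrawFile",
}

def describe(filetype):
    """Look up a type ID; unknown types fall back to the hex form."""
    return RISC_OS_FILETYPES.get(filetype, "Unknown (&%03X)" % filetype)

print(describe(0xFFF))   # a text file, whatever it happens to be called
```

Because the type travels with the file rather than its name, two programs can never fight over who "owns" an extension - there is nothing to register.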
I sadly missed out on the Archimedes, although there were a couple at school. It was a good machine, and clearly ahead of its time in many ways.
It's a shame they were so expensive (then again the BBC Micros were too), so they didn't really catch on at home, especially with cheaper Amigas and Ataris. Then again, if they had been successful, maybe Acorn would never have divested ARM and then we wouldn't have the whole ARM ecosystem these days!
Amstrad released the PC1512 a year before the Archie came out. Base model was £499. So the Acorn did look expensive. For the same price as an Archie you could have had a colour PC1512 with a fancy dan hard disk and probably a memory upgrade.
The Amiga was expensive at launch as well. £599 for an A500 in 1987. By 1992 the updated base model (the A600) cost £299 because Commodore were able to cost reduce the design.
Acorn on the other hand launched at £899, released the A3000 in 1989 for £799 (by which time the Amiga 500 was £399). At that kind of price they were never going to make inroads.
"Acorn on the other hand launched at £899, released the A3000 in 1989 for £799 (by which time the Amiga 500 was £399)."
The A3000 had an RRP of £649 on introduction and was widely available for £599 (yes, Wikipedia is wrong), probably less after a few months. It also had 1MB RAM compared to the Amiga 500's 512K and somewhat slower CPU, but whatever.
Hi. I thought I would provide a bit of feedback, having been in the middle of it all at the time at Acorn. First, a great article and factually pretty near the mark. Yes, Arthur 0.2 was in EPROM and we thought we would have them back, as they were expensive and possibly re-usable. We set a very firm date for the launch, June 1987 if I remember, and hence the OS was in whatever state it had reached; not an ideal approach, but it would have been sad to have left the superb hardware waiting for too long.

Were Acorn products over-priced? No, the profit margin was reasonable bearing in mind the high level of R&D and the high technology. PC clones at the time were sold at very tight margins and made in the Far East, or under railway arches in the UK, with virtually no R&D spend, as a result of the vast volumes. When I look back at Acorn marketing, it was some of the best I have ever come across in my career, both before and after. Acorn made the best of being a niche player and inevitably struggled, just as Apple did for a long time.

Someone comments that the Archimedes had 'Windows emulation'. No it didn't; it had genuine MS-DOS licences provided by Microsoft. It was the hardware that needed some emulation, as many PC applications at the time wrote directly to the PC hardware, bypassing MS-DOS much of the time (it was just a Disk Operating System, after all, not a machine operating system as in Acorn products).

Finally, as someone else has commented, acronyms should always be in upper case, so it should be ROM not Rom etc. Why UK journalism insists on treating acronyms this way when the USA gets it right is beyond me; it makes speed-scanning of technical articles more difficult, as I always scan for the techno words. So when RISC OS was developed and launched it was all upper case and never anything else; rant over :-)
Wondering if anyone is still secretly satisfying their: Zarch, Elite, Chocs Away.... addictions through a virtual hit on one of the many emulators e.g.
(Many years back I was a Red Squirrel addict)
Yep, both real hardware and emulation - there are some decent emulators out there, from the free Red Squirrel, Arculator and RPCEmu, to the paid-for VirtualA5000, which later became VirtualRPC.
Ironically, RISC OS even to this day appears to have a warez scene: https://thepiratebay.org/user/antiques/ but I've no idea how active it actually is. Some interesting looking stuff on there, though, really gets the old nostalgia going :-)
Just like the BBC Micro was the world's best 8-bit micro, the Archimedes - for about a year - was actually the best PC full stop. It wasn't until Intel brought out later generations of its chips in the late 80's that they finally overtook the Archimedes. I never bought a second Archimedes model because I felt that the performance difference wasn't good enough and the gap just widened over time.
Again, like the BBC Micro, Acorn's developers did a fantastic job with the OS - a great (and extensible) OS, a much-enhanced BBC BASIC, and keeping a built-in assembler meant that serious development was possible on the machine you bought. Although paying 50-odd quid to upgrade your ROM set was a stinger back then, it meant that boot time remained 2 seconds, which trounced every other IBM PC clone out there.
Arm (!) yourself with the multi-volume Programmer's Reference Manuals and you'd have years of quite fun programming ahead. My greatest efforts were:
* A VT100 terminal emulator written in 100% ARM code that identically matched the behaviour of a DEC VT100 I had access to (including smooth scrolling [in software!], double width/height, flashing etc.). It was extremely rare, even back then, for anyone to write a terminal emulator entirely in assembly, but I did it.
* Hacks for various games, including completely reverse engineering the codes generated for the Zarch competition that I think was announced in Acorn User. I'd already generated the codes for a massively high scoring game but didn't submit it for fear of being found out. I wonder if the winner did the same thing as me!
* Writing a Scrabble game in a mixture of BASIC and ARM so I could enter the 100 quid Daily Telegraph game (it always found the correct answer). Couldn't release the game because of the terrible copyright hounds of Spears/Hasbro.
* Removing copy protection from games, though I got bored when Magnetic Scrolls The Pawn decided to use a "page 7, line 14, word 2" type protection and also released their game compiled in C (possibly one of the first ever in C?!), which produced such awful ARM code, it became its own form of copy protection :-)
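The VT100 emulator mentioned in the first bullet boils down to parsing a byte stream of printable text interleaved with escape sequences. As a rough illustration - in Python rather than ARM assembly, and handling only CSI sequences of the form ESC [ params letter, a small subset of what a real emulator must cope with:

```python
import re

# Match ESC [ <numeric params separated by ;> <final letter>, e.g.
# "\x1b[2J" (clear screen) or "\x1b[10;5H" (move cursor to row 10, col 5).
CSI = re.compile(r'\x1b\[([0-9;]*)([A-Za-z])')

def parse(stream):
    """Yield ('text', s) and ('csi', params, final) tokens from a string."""
    pos = 0
    for m in CSI.finditer(stream):
        if m.start() > pos:
            yield ('text', stream[pos:m.start()])
        params = [int(p) for p in m.group(1).split(';') if p]
        yield ('csi', params, m.group(2))
        pos = m.end()
    if pos < len(stream):
        yield ('text', stream[pos:])

for token in parse("hello\x1b[2Jworld"):
    print(token)
```

A full emulator then dispatches on the final letter (J, H, m, ...) to update the screen model - the hard part on the Archimedes being doing that, plus smooth scrolling, fast enough in hand-written ARM code.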
I did eventually tire of the Archimedes though, but I clung onto it for so long that my next machine was an HP PA-RISC Unix workstation followed by an Intel PC with Linux a few years later. I'm still proud today - running CentOS 6.2 on my desktop at work and home - that I can say that my primary desktop OS has *never* been Windows on any machine I've owned.
Went from a Beeb to an A3000, to an A5000, to a RiscPC (and picked up an Arthur A310 somewhere along the way). I use Windows now, but still think RISC OS is the friendliest system I've programmed for, and the ARM is a joy to code. That it is a quarter century old has made all my hair turn grey in, like, five minutes...
Notwithstanding the occasional misunderstandings in this piece, I think it's great to see the anniversary being reported on. This article has generated a fair amount of nostalgic comments here, which are informative in themselves.
Yes, it's a shame that the incredible developments of the last 3 years (shared source RO5 on new hardware) aren't mentioned, but perhaps they're being considered as the subject of a separate article.
And, regarding the acronym style, does that even extend to renaming "Chris's Acorns" to "Chris' Acorns"?
Yes - I loved finding the easter eggs in Acorn's code. That particular one is a reference to an old novel that starts off with the protagonist squeezing some toothpaste onto his toothbrush and seeing a message in it that read:
"Help, I'm being held prisoner in a toothpaste factory!"
Nicely written article, highlighting an anniversary I had completely forgotten. I remember, however, using these machines at secondary school; we had about 15 of them, all running Clares' Artisan art software. Some rather natty artwork was put together by the kids; sometimes there were tears when that nasty 'FileCore in use' or 'Address Exception' error plastered itself over your artwork, hanging the machine.
Fortunately, I was able to look beyond such trifles, and learned to appreciate the Arc. I later got an A3000, then a few years later a RISC PC, spending many lost hours playing E-Type, Chocks Away, Zarch, and many more. Even did some programming on it, before using it for desktop publishing and first steps onto the internet.
Happy Birthday Archie!
The Archimedes and the RISC PC were, and are, superb machines to use and to write software for. RISC OS in particular has a user interface way ahead of Windows. I worked on software running under RISC OS (and Arthur) until 2002 and now have to use Windows in my job. It's amazing how often I find myself frustrated when using Windows, because it lacks the features and quality of user interface which RISC OS has.
The Archimedes had no paging of RAM to hard disk, and the MEMC certainly wasn't capable of doing it on its own! :)
What MEMC did was offer a virtualized address space so that each application could have its own memory paged into the same address space while it was running, with other applications' memory protected by being hidden. This is a basic feature of most 32-bit processors but it was a first for a home computer.
MEMC had all the hooks for proper virtual memory with disk paging, but it wasn't a feature of either Arthur or RISC OS (although presumably RISC iX had it). Similarly, neither Arthur nor RISC OS offered pre-emptive multithreading - the application had to specifically yield so other apps could get a timeslice.
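The logical-to-physical mapping idea described above can be shown with a toy sketch. This is emphatically not a model of the real MEMC (whose page sizes, page counts and CAM-based mechanism differed); the class, method names and 4KB page size are assumptions purely for illustration:

```python
PAGE = 4096  # assumed page size for this sketch only

class ToyMmu:
    """Toy illustration: each task sees the same logical address space,
    but a context switch swaps which logical->physical mappings are live,
    hiding every other task's memory."""

    def __init__(self):
        self.mapping = {}  # logical page number -> physical page number

    def switch_to(self, task_pages):
        # Drop the outgoing task's mappings and install the new task's.
        self.mapping = dict(task_pages)

    def translate(self, logical_addr):
        page, offset = divmod(logical_addr, PAGE)
        if page not in self.mapping:
            # An unmapped page: on the real machine you'd get the
            # infamous address exception.
            raise MemoryError("address exception")
        return self.mapping[page] * PAGE + offset

mmu = ToyMmu()
mmu.switch_to({0: 7, 1: 3})          # task A: logical pages 0,1 live
print(hex(mmu.translate(0x100)))     # lands in physical page 7
mmu.switch_to({0: 2})                # task B: A's pages now invisible
```

With disk paging, a miss in `translate` would trigger loading the page from disk instead of raising an error - that's the piece MEMC had hooks for but Arthur and RISC OS never used.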
Oh, well, quite possibly ... except RISC OS 3.5 was pretty much the end of the line for the machines, and Dynamic Areas required special magical coding to use. Still a fair way from virtual memory as familiar to most of us today. Dynamic Areas were like the DOS Extenders of RISC OS. I'd mercifully forgotten about their existence until now :) It's possible I used one for DOOM, although it's equally possible I didn't.
We had BBC Micros, a lab of Amstrad PCW-8256s (with Hitchhiker's Guide to the Galaxy) and a single lab of these Archimedes machines. They were fab. I was into a lot of art and used the paint package to make the most of all 256 colours (from a huge palette, compared to other computers at the time). Also lost a lot of hours to that Lander game.