I had ARM shares which plummeted in the .com crash. I wonder if they're a good buy, or if Intel is about to mount its first serious step into mobile devices.
You bought at the peak last time and you're wondering if you should buy at the peak now?
And with Bubble 2.0 getting into full swing too ...
I'd say look at their 10-year valuation unless you're day trading.
I'm glad I'm not the only one wondering when Bubble 2.0 will burst... Instagram for $1 billion!! WTF!!
They are expecting Facebook to be valued at around $100 billion... for a company with increasing costs and decreasing profits ($205m last quarter), I don't get it :S
I bought ARM during the .com bubble, have held on to them, and I'm still waiting for them to get back to where they were then! Just about made it during the last couple of months... think I'll hold on to them a bit longer though.
I'd say it depends on whether Intel or ARM manages to implement a stable "PC-like" mobile platform, leading the mobile market out of its "home computer" era and into its "PC era", when suddenly platforms _really_ span multiple vendors and you can separate the software from the hardware.
Currently the mobile world is like home computers used to be. You couldn't be sure that software bugs were ever fixed. Back then that only meant some minor inconvenience; today it can mean serious security holes. On a "PC" (or whatever stable hardware platform), by contrast, you can simply update, upgrade or exchange your operating system without your hardware vendor's consent. It is, in fact, quite easy: you boot from a USB stick or CD-ROM and there you go, running another operating system without touching any stored data.
This would be the next revolution.
for not mentioning the name of the person responsible for Acorn's "unravelling".
Good article, though.
If The Register has one habit that bugs me, it's that their prime articles, and the accompanying banner images, stick around on the right-side bar FOREVER!
I swear we've been staring at those two guys holding that PARIS plane for MONTHS now.
Please, don't tell me I'm still going to be looking at this article's banner pic when Christmas comes around!
(Great articles though)
Agreed, they stick around forever. I could swear they do it on purpose - there's nothing more annoying than having a photo of some lego characters regularly on your work screen over a period of months, people get the wrong impression ;)
Nice informative article.
If this article had anything resembling a "like" button, I'd press it now.
You could always use the "Rate this article ..." gadget underneath the link for the comments.
(Disclaimer: The only time I ever do that is by accident, when reading on my tablet.)
I do that all the time, it's so very annoying.
Oh no, you are not alone. I suspect web designers will learn to place important links right next to ads in the near future.
Almost brought a tear to my eye. A beautiful article about a beautiful piece of technology. My jaw dropped when I read that the ARM worked without Vcc applied :)
Very much enjoyed the article.
I might be wrong, but I think the Vcc issue was originally flagged up because the test unit would occasionally crash for no apparent reason, so they started to monitor just about everything and discovered the apparent lack of current. They then reasoned backwards that it was getting power via its inputs, and that the crash happened when all inputs (data & address) were zero.
I would just like to congratulate the author of this article. Beautifully written. A real credit to yourself, and The Register, sir. Thank you very much.
It struck me that Hauser, despite not being the technical guy, is as bright as they come. He clearly recognised the talent he had with Wilson and Furber.
A completely riveting read. Actually, the story of Acorn & ARM would make a very good book. Ditto Inmos, IMHO.
Have a good day all.
This is why I come to the Reg: so much nicer to read a well-informed, structured article as opposed to the usual "my dad's bigger than your dad" fanboy rantings on other sites.
When did Wilson have a sex change? The pic at the top of P2 looks different to the others.
Quite a few years back now, though evidently not that far back... Don't know the exact date, sorry.
He's been dressing as a woman since the 80's. Not sure when he had the chop though....
"He's been dressing as a woman since the 80's. [...]"
*She* has. Sorry to nitpick, it's a subject that's rather close to my heart.
Great article. Wow ... running on leakage.
The IBM ROMP chip (aka the 801) was never intended to be a general purpose RISC processor. It was intended to power an office automation product (think of a hardware word-processor like WANG used to sell).
As a result, although it could function as a general-purpose CPU, it was not really that suited to it. It was never a success because, at the time, IBM could not see justification for entering the pre-Open Systems UNIX world. The RT 6150 and 6151 were intended as niche systems, mainly for education, although they did surface as channel-attached display front ends for CADAM and CATIA run on mainframes (and could actually run at least CATIA themselves). This changed completely with the RIOS RISC System/6000 architecture, where IBM was determined to have a credible product, and invested heavily.
In comparison, the ARM was designed from the ground up as a general-purpose CPU. Roger Wilson (as he was then) greatly admired the simplicity and orthogonality of the 6502 instruction set (it is rather elegant IMHO), and designed the instruction set for the ARM in a similar manner. Because the instruction set was orthogonal (like the 6502, the PDP-11, and the NS320XX family), the instruction decoding was almost trivial. It also made modelling the ARM on an Econet of BBC Micros (in BBC Basic, no less) much easier, which allowed them to debug the instruction set before committing anything to silicon.
They had to make some concessions on what they wanted. There was no multiply-add instruction, which appeared to be a hot item in RISC design at the time; to keep things simple and within the transistor budget, all they could do was a shift-and-add via the barrel shifter, which, although useful and great for multi-byte graphics operations, was a barrier to ultimate performance.
It was also simple enough so that they could design the interface and the support chips (MEMC, VIDC and IOC) themselves, achieving early machines with low chip counts.
This is all from memory of articles in Acorn User, PC World, Byte and other publications. Feel free to correct me if my recollections are wrong.
If ARM and WANG had worked together they could have created the Jenerally Executable Reliable Kompiler, which would have been a major release.
"There was no multiply-add instruction."
The ARM processor in the original Archimedes had multiply and multiply-add instructions (MUL and MLA), though I seem to remember them being very slow. Perhaps the designers were disappointed because they didn't have the transistor budget for a fast multiply.
Ah...! You're thinking of the ShortARM™ project.
The ARM1 chip had no multiply (or multiply-add) instructions. It was used in the Tube coprocessor for the BBC Micro that the article talks about (the £4,500 one..). I actually got to use one briefly, long after it was obsolete...
The lack of multiply was discovered to be causing performance problems, so a slightly revised chip (the ARM2) was used from then on (i.e. in all the Archimedes series), which had MUL/MLA implemented, although it was a bit of a hack: every instruction took a single clock cycle *except* MUL or MLA, which could take up to 16 clock cycles (still way faster than emulating multiply in software).
@starsilk. Thanks for the correction. I certainly knew about the multiply-add being missing, but I deliberately avoided talking about the multiply instruction being missing, because I just could not remember.
MUL & MLA were indeed slow when both sides of the multiplication were variable, but lots of multiplies have a constant on one side, often sparse in bits (e.g. 2^N: 8, 16, 256; or 2^N+2^M: 10), and the great trick (of ARM assembler hackers like me, and the, at the time, brilliant Norcroft compiler) was to unfold the multiply into shift-adds (one per bit) using the barrel shifter, one cycle each.
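For anyone who never got to write ARM assembler, a minimal sketch of the trick (from memory; register use arbitrary): multiplying r0 by 10 (binary 1010, two set bits) takes two single-cycle instructions instead of a slow MUL:

    ADD r0, r0, r0, LSL #2   ; r0 := r0 + (r0 << 2), i.e. 5 * r0
    MOV r0, r0, LSL #1       ; r0 := r0 << 1, i.e. 10 * r0

One instruction per set bit of the constant, with the barrel shifter folding each shift into the data path for free.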
One of my most treasured possessions is an original ARM-1 dot-matrix instruction set description with CONFIDENTIAL scrawled over it in red ink...
ARM1 didn't have multiply instructions. ARM2 did have MUL and MLA, and used the ALU and shifter to perform multiplies 2 bits per cycle. It used early termination, so that multiplies by small values were much faster. When writing assembler for the ARM2 I always made sure that the smallest value was in the right place, as X * Y would use a different number of cycles than Y * X... Given the low transistor budget it was the right design, just a pain to optimize for. Today even the smallest ARM CPU has a single-cycle multiply.
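Concretely, a sketch of what that operand ordering looked like, going from memory that the early termination scanned the Rs operand of MUL Rd, Rm, Rs:

    MOV r1, #200         ; larger operand in Rm
    MOV r2, #3           ; smaller operand in Rs
    MUL r0, r1, r2       ; r0 := r1 * r2; terminates after a couple of
                         ; cycles because r2 is small. With the operands
                         ; swapped, the same product costs several more cycles.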
Thanks for this well presented and interesting article.
More of this kind of thing!
I had an Archimedes around 1987 - specifically an A3000 - and its power became apparent running some simple Mandelbrot fractal code in BBC Basic. I had the original code from some magazine listing and had applied it to my A3000's predecessor, a Sinclair QL.
The QL's SuperBasic was as wonderful as it was slow – the Mandelbrot set took 24hrs to draw. Even compiled from Pascal code, it still took 8 hours.
The Archimedes? 45 of your earth minutes. Astounding.
The A3000 wasn't out until about 1989/1990 - the 1987 models were the A310, A410 and A440, if I recall correctly. I had the A420/1, which came out in 1989, and the A3000 came out soon after. A great piece of miniaturisation, but hamstrung slightly by the lack of a hard drive; 2.5" drives that could fit inside appeared later.
As for the Mandelbrot drawing, I converted a BASIC program to ARM assembly and then hand-optimised it. The inner loop was 13 instructions long, and the rest of it was just dumping a value to the framebuffer. It could do 320x256 fractals at 5fps (although, to be fair, it mirrored one half, so 2.5fps). Such was the power of hand-optimised ARM code. You could bash out a program like that in half an hour, and then spend a week teasing out every extra clock cycle. And figuring out that not only was inline conditional execution faster than branching, but LT is faster than GE (by one cycle).
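To illustrate the conditional-execution point, a sketch (registers and values made up): clamping a value to a limit without a branch is just

    CMP   r0, r1         ; compare value (r0) against limit (r1)
    MOVGT r0, r1         ; clamp only if r0 > r1; otherwise this behaves
                         ; as a one-cycle no-op

whereas the branching version (CMP, then BLE round a MOV) pays a pipeline refill whenever the branch is taken.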
Still that was ARM2/2.5/3, so I guess things have changed since. Now please excuse me - I've been typing this whilst being assaulted by a 3-year-old...
This is the sort of thing I like reading on here. Something that's had some true care taken to make it worthy of its audience.
And you even managed not to refer to Apple as a Foxconn-rebrander when they were mentioned! Bravo.
It's staggering to think that the chip that powers billions of phones, cameras, hard drives, tablets, etc. started life as nothing more than an accident of design.
And I agree, I don't think failure is a word you can use to describe Acorn. It didn't fail, it just evolved beyond itself.
I agree, it wasn't Acorn and the ARM which failed.
But there is something about the British industrial and financial environment which seems to let the winnings from these works of genius drift away out of reach.
It's not just globalisation, and some factory that is so expensive to build that there can only be one on the entire planet. And we can't expect to spot the right investment choice every time. But what is it about this country that turns a successful entrepreneur into somebody fronting a TV show that tests how people can run a market stall in Essex?
I remember my first ARM chip, the ARM250 fitted to the Acorn A3010, bought for £500 from Dixons, which ran at a massive 12MHz - still a lot faster than the A3000s at school. I then went on to a Risc PC 600, which had an ARM610 chip running at 30MHz, later replaced the CPU board with the 40MHz ARM710, and eventually put a 203MHz StrongARM CPU in, which I later overclocked to 287MHz. I even put a deposit down on the cancelled Phoebe computer.
When that fell through and Acorn collapsed I eventually bought a Castle Iyonix PC which had an Intel XScale 80321 600 MHz.
I still have the Risc PC in my mother's loft, but the only ARM chips I use these days are in my Samsung Galaxy Nexus.
I think you probably have quite a few ARM cores around - iPods use them (I'm not sure about today's iPods, but early ones used three), your hard disk drives have them, in fact most of the major components of a PC have ARM in them - network cards, video cards, SSDs, even USB flash sticks - and then there's printers, routers, car dashboards, GPS, brake systems - ARMs are everywhere!
The sad thing is that Acorn should be occupying the same space that Apple is in now. Instead they let it fall apart, and so Apple is the biggest company in the world, not Acorn.
The reason the Acorn project fell to pieces was partly because so many journalists, politicians and other influential parties were vociferous in opposing the use of "non-standard" (that is, non-Wintel) machines, in education especially. That, and the domination of business by MS Office and the need to exchange documents, essentially led Acorn into an ever-declining market. There were some great applications written for the Acorn RISC machines (like Sibelius), but it was inevitable that it could not be sustained on that architecture. It's simply impossible to maintain a thriving development community of applications in such a narrow market based largely in one country.
Don't forget there were many other non-Wintel casualties in the US, and a whole raft of alternative processors. Apple only just survived as a manufacturer of an alternative architecture because of its dominance in some important niche areas, such as the "creative" sectors, along with somewhat grudging support from MS via a port of Office (grudging, because it was something of a sop to US competition authorities). Acorn was never able to do what Apple did with non-computing products, like the iPod. With all its troubles, Apple was much better financed, with a much more supportive investment sector and a larger market.
As it is, it was inevitable that Acorn would end up, as its name indicated, the seed for a number of small/medium enterprises specialising in niche areas. Competing with Wintel was always going to be near impossible. That ARM emerged from it is something of a miracle, but to keep things in perspective, the vast majority of the income from products using this architecture accrues outside the company. Essentially ARM does not compete just on the excellence of its low-power processor designs and associated ecosystems, but because it is very, very cheap. ARM is not Intel, which can command income per processor perhaps 100x the royalties ARM achieves.
I had heard (at the time) that one of the big problems Acorn had was that Apple would sue them out of existence if they released RiscOS systems in the US. This basically killed their potential for worldwide sales.
Does anyone have any links about this?
I'm pretty sure Acorn systems were sold in the US via Olivetti. I think the main reason that RISC OS never gained the staying power of Mac OS is that Apple did the graphical desktop four years before Acorn and so managed to grab niches in publishing and design that sustained them when Microsoft came along and did the GUI for everyone else. Acorn's educational niche wasn't sustainable because, as noted above, there's a lot of political meddling in education and it's easy to score points with 'business picked Microsoft, we should be training them on Windows'.
I guess it's a shame but the triumph of ARM makes it difficult to be very upset.
As I recall, Acorn tried selling to education in a small geographical area in the States. Then Apple rushed in and donated large numbers of computers to those specific schools, which of course killed the Acorn initiative.
As I recall, Arthur was out and about in 1987.
While MS's continued support of Office for Mac may well be given grudgingly, it's worth pointing out that the two central constituents of Office (Word and Excel) were both released for Mac in 1985 and weren't really what I'd consider ports. The first Word for Mac was actually the first graphical WYSIWYG version of that software, and the first Excel for Mac preceded the DOS/Windows version by two years (that not being released until 1987). That the Mac had Excel is often given as one of the contributing reasons (along with DTP applications) for its continued existence following a rather lukewarm couple of years after its launch.
Not sure who you mean here by "they". Acorn was mercilessly picked apart by beancounters, for want of a decent BOFH and a lift shaft! All that on the eve of the launch of the fabled "Phoebe", or RISC PC 2, which, from accounts of the lone prototype known to have existed beyond Acorn's end, was a pretty stunning machine by the standards of computers back then.
It's yet another example of short-sighted money men asset-stripping our industry, our inventiveness, heck, even our culture! The only bright point is that, on occasion, we can still shock these idiots with what we can do - I suspect that the Raspberry Pi, for example, must be giving some of the big corporates something to think about.
We don't seem to dominate much, as we don't produce products with a good reputation for build quality and engineering.
Acorn machines were the Linux of the day, less games and commercial software, more educational and development.
That's always been the problem. It doesn't matter how good the hardware is; if there's no software available that you wish to use, then it's no good.
This is what affects Linux, for instance. It may be great at many things, but it's not much good for running the popular tools people want to use: Office, Photoshop etc.
Even the Amiga and ST which did have many cool tools (ST was popular for Cubase in music studios) gave way to the PeeCee. So what hope did the Arc have when even the big US alternatives dried up?
More, please! I'm really enjoying all these retro-themed articles, especially the Acorn ones.
Acorn may live on in various other forms, but the day they went belly up was a seminal day in my computing life. I'd started off with an Electron, graduated to a Beeb and then taken a three-year time out in the form of an Atari ST, as the Archimedes was way too expensive for an 11-year-old schoolboy. I remember seeing it - finally - at a computer shop in York, and marvelling at it in wonderment.
In 1992, I finally got my RISC OS machine, in the form of an A3010, purchased from Dixons. Four years after that, I got a RISC PC 600. It was utterly brilliant. Friends of mine were predominantly PC users, and my RISC PC was light years ahead.
Then Acorn died. And I was sad. The workstations division closed down for purely financial reasons. I moved to a PC after that, something I was sure I'd never do, and moved to OS X six years back.
Nothing will ever recapture those golden Acorn years for me. Back then, computing was fun. Now, programming for a living, all the joy has gone.
I still have two Electrons, my A3010 and my RISC PC. One day I'll get the RISC PC out again and relive those golden days.
There are a couple of really good history documents on the creation and evolution of IBM's virtualization technology and the development of the IBM System/360 mainframe and its follow-on 370 architecture. The former was written by Melinda Varian, formerly of Princeton University; the latter by Jeff Gribbin, formerly of Rolls-Royce in the UK. Varian's contains some great pictures and some generally less-understood aspects of the evolution of virtualization.
RISC OS is still a going concern. There will supposedly be a version of RISC OS ported to the Raspberry Pi. I might try it, just because.
Back then I was a Spectrum fanboi and everything else was crap, but looking back the 'right way' of designing those 80s systems was arguably Acorn's way.
Can we have an 'old codger' icon, please Reg sir?