Picking apart the circuits in the ARM1 – the ancestor of your smartphone's brain

Ever since the silicon blueprints of the ARM1 – the grandfather of today's smartphone processors – were recovered in November, hardware guru Ken Shirriff has been poring over the layout and reverse-engineering the landmark chip. The ARM1, designed in 1985 by Acorn engineers, was a prototype 32-bit RISC CPU with 25,000 …

  1. Dwarf

    Mighty oaks

    Out of little acorns grew.

    1. BurnT'offering

      Re: Mighty oaks

      Also - Apples (see the AC post below re the Newton).

  2. A Non e-mouse Silver badge

    The simple RISC architecture of the ARM1 makes the circuitry of the processor easy to understand, at least compared to a [contemporary] chip such as the 386

    The ARM processor was a 32-bit clean-sheet design, with no legacy baggage to support, whereas the i386 processor carries a lot of backwards-compatibility baggage, being able to trace its roots back to the 16-bit 8086.

    1. Tom 7

      8086 was also 8080 compatible too IIRC

      In the sense that 8080 asm could be converted to (shitish) 8086.

      Not sure if that put any restrictions on the 8086.

      1. cnsnnts
        Thumb Up

        Re: 8086 was also 8080 compatible too IIRC

        You would set the segment registers the same so that the text and data spaces occupied the same 64K.
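The trick described above can be sketched in Python (a model, not original 8086 code): in real mode a 16-bit segment register is shifted left four bits and added to a 16-bit offset, so pointing both segment registers at the same segment makes the code ("text") and data spaces overlap in one 64K window, recreating the 8080's flat address space.

```python
def physical_address(segment, offset):
    """8086 real-mode address generation: a 16-bit segment register is
    shifted left 4 bits and added to a 16-bit offset, giving a 20-bit
    physical address (masked to the 20-bit address bus)."""
    return ((segment << 4) + offset) & 0xFFFFF

# With CS == DS, every 16-bit offset resolves to the same physical
# address in either space - a flat 8080-style 64K address space.
cs = ds = 0x2000
assert physical_address(cs, 0x0010) == physical_address(ds, 0x0010) == 0x20010
```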

    2. Anonymous Coward
      Anonymous Coward

      actually the *86 goes back to the 8008.

      In terms of evolution, if not backwards compatibility.

      More instructions instead of more processing power and more special purpose registers instead of general purpose ones.

      Evolution by bodge, ARM was a fresh start.

      1. Otto is a bear.

        Re: actually the *86 goes back to the 8008 .

        Wasn't the 4004 the first Intel MP? 740 kHz, 4-bit BCD – feel the power, feel the speed.

        1. Mike Pellatt

          Re: actually the *86 goes back to the 8008 .

          And one of the guys in my year at IC built a telephone switch using one as his final year project....

      2. Scott Wheeler

        Re: actually the *86 goes back to the 8008 .

        Intel tried a fresh start at least twice, with the iAPX 432 and Itanium.

    3. Anonymous Coward
      Anonymous Coward

      The 68000 had 32-bit sections already, they were thinking ahead too.

    4. cnsnnts

      And the 8086 was designed to make porting code from the 8080 relatively straightforward. So all the applications (WordStar, dBase, etc.) that ran on CP/M were quickly ported to CP/M-86 and MS-DOS/PC-DOS computers.

      For some reason the software on the Archimedes seemed to be far more advanced than that for Win/x86 machines. E.g. ImageFS - open an image file as a folder and drag and drop the multiple alternate formats that seem to be present inside; SparkFS - another installed filing system that made Zip archives behave like folders; ArtWorks - real-time updates of graduated fills as you dragged a pointer around the screen; TechWriter - takes too much description, but people seeing how it works would invariably get angry, saying "why doesn't my computer do that!"

      1. Andrew Hodgkinson

        Open source these days

        > For some reason the software on the Archimedes seemed to be far more advanced than that for Win/x86 machines

        It is open these days - I'm involved with RISC OS Open Limited, which manages it. Emulators available for those who want the nostalgia and it'll run on things like the Raspberry Pi for those that want something a bit more practical :-)

        https://www.riscosopen.org/

    5. Alan Brown Silver badge

      the i386 traces back past the 8086 to the 8008 and the 4004, if you look carefully.

      Yes, there really are vestiges of a 4-bit microcontroller in there.

      1. kenshirriff

        Strangely enough, the 8008 architecture is a copy of the processor in the Datapoint 2200 programmable terminal, which contained a CPU built from 7400-series TTL chips. The 8008 was intended to replace this hardwired processor, but ended up being sold as a separate processor. For some reason, Intel doesn't publicize this much. You can trace Intel's little-endian architecture back to the serial processor in the Datapoint 2200 - when you're adding one bit at a time, you want to start with the lowest. That's also why the 8008 and descendants generate a parity flag - it's very useful for a terminal.

        The 4004 has a totally different architecture, and the 8008 is not based on it at all. The 8008 of course led to the 8080, 8085 and Z-80. Going to the 8086 was a bigger jump - as an earlier post mentioned, 8080 assembly could be sort-of translated to 8086 code. One interesting thing about the 8085 is that it implemented instructions that were never documented, because Intel decided they didn't want to have to support them in the 8086. Well, enough random chip facts.
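The little-endian point above can be illustrated with a toy Python model (an illustration, not Datapoint circuitry): a bit-serial adder consumes one bit per clock and must start at the low-order end so the carry can propagate upward, which is why the low byte naturally gets stored first; a parity flag is equally cheap to accumulate as the bits stream past.

```python
def serial_add(a, b, width=8):
    """Bit-serial addition, one bit per 'clock', LSB first.
    The carry from each bit feeds the next-higher bit, so a serial ALU
    must see the low-order end first - hence little-endian storage."""
    carry, result = 0, 0
    for i in range(width):                  # LSB arrives first on the wire
        bit_a, bit_b = (a >> i) & 1, (b >> i) & 1
        result |= (bit_a ^ bit_b ^ carry) << i
        carry = (bit_a & bit_b) | (carry & (bit_a | bit_b))
    return result                           # final carry out is dropped

def parity_even(byte):
    """Even-parity flag over one byte - cheap to accumulate serially,
    and handy for a terminal checking characters off a noisy line."""
    return bin(byte & 0xFF).count("1") % 2 == 0
```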

  3. UK Jim

    Since Sophie Wilson has the circuit diagrams, why bother with the reverse engineering!? It's really not *that* long ago, so the people who did this are still around, know what they did, and can explain it if you ask!

    1. wolfetone Silver badge

      Maybe she doesn't want to talk about it, or divulge these things.

    2. GregC

      Re: why bother with the reverse engineering!?

      Because he can? For certain types of mind, this kind of exercise is interesting for its own sake.

    3. Robin

      As an aside, there's a great series of videos on the Computerphile YouTube channel, interviewing Steve Furber.

      First one is here.

  4. Torben Mogensen

    Dynamic logic?

    IIRC, ARM2 was fully static, so you could single-step through an instruction sequence or stop the clock indefinitely. So I was surprised to hear that the flags in ARM1 used dynamic logic.

    If the flag logic was the only dynamic logic on the chip, it makes good sense to change that in the redesign to get single-step capability, so it is perfectly plausible that this happened.

    1. Simon Harris

      Re: Dynamic logic?

      Dynamic logic was a carry-over from the 6502 that Acorn was so familiar with, and allowed the register transistor count to be greatly reduced. Both the 6502 and the ARM1 punched above their weight in computation per transistor compared to their contemporaries.

    2. Anonymous Coward
      Anonymous Coward

      Re: Dynamic logic?

      Not just single step. I suspect a major disadvantage of the ARM1 was that the range of clock speeds was limited by the need to keep the dynamic cells alive.

      In embedded designs it's very common to vary the clock speed according to the work being done, as this reduces power consumption. This is now, of course, standard operating technique in phones and laptops. For instance, one design I did back in the late 70s used an RCA processor which normally ticked over at about 10kHz waiting for keypad input. When it sensed a key it went to 2MHz until processing was complete. The whole unit ran for weeks off 3 C cells.

      Going to static for the ARM2 would have vastly increased the possible range of applications.

      1. bazza Silver badge

        Re: Dynamic logic?

        For instance, one design I did back in the late 70s used an RCA processor which normally ticked over at about 10kHz waiting for keypad input. When it sensed a key it went to 2MHz until processing was complete. The whole unit ran for weeks off 3 C cells.

        The RCA1802? Great chip. The original low power micro. Got used in all sorts of things - cruise missiles (where I think they did an entire terrain following radar guidance system on it, which would have been a monumental achievement), British Telecom payphones used them.

        Another thing I miss is 4000 series CMOS logic. Want to run it off 24V? No problem. Fiddly to use (don't dare leave an input undefined), not fast, but great power consumption and good noise immunity. I don't think the 1802 went quite that high, but it was good noise immunity that made it suitable for amateur satellites back in the 1970s.

        1. Anonymous Coward
          Anonymous Coward

          Re: Dynamic logic? - The RCA1802? Great chip

          Indeed. Slow but reliable as anything. However - 24V 4000 series logic? Nothing I ever used was safe above 15V. The 1802 had a version where the core logic ran at 10V and the interfaces ran at 5, and when I enquired what it was used for the sales guy declined to answer.

          While we're completely off topic, let me tell you a story (shut up grandpa). On a system I worked on there was a logic board which kept bringing down its 15V power line. There were, predictably, no schematics. And it did seem to draw a lot of power, with mysterious spikes.

          So although it was working we extracted it and worked out the schematic. It was a state machine implemented in 15V 4000 series CMOS. It was a very clever design. And every single unused input had been left open circuit. The board had then been varnished, so that each unused input was a little capacitor. It took me a whole day to work out which inputs should be tied to +15 and which should be tied to ground. After which it drew 90% less power and the glitch disappeared.

          In some circles 4000 series had a terrible reputation for unreliability, and I often wondered how much of this was simply failure to terminate properly.

          1. Down not across

            Re: Dynamic logic? - The RCA1802? Great chip

            In some circles 4000 series had a terrible reputation for unreliability, and I often wondered how much of this was simply failure to terminate properly.

            My vote is on the latter. I never had any issues with 4000 series. In fact the wide supply voltage range and high fan-out was very useful.

            They did tend to be somewhat sensitive to ESD, so perhaps the reputation in some circles was due to a combination of bad design and improper handling.

            1. Will Godfrey Silver badge
              Thumb Up

              Re: Dynamic logic? - The RCA1802? Great chip

              I still make regular use of 4000 series chips. Never had one fail, and brilliant for ultra low power control off a PP3 battery.

  5. Shonko Kid
    Thumb Up

    Genius

    The ARM is a work of art: the deeper you look at it, the more you appreciate the genius of its design.

  6. Chris Miller

    Proper boffinry

    Give that man a white coat and a pipe.

  7. Crisp

    I like looking at pictures of chip dies

    They are like little works of art all by themselves.

  8. Anonymous Coward
    Anonymous Coward

    Better than that ugly mess called Intel x86. Just a shame that it was Intel, and not ARM or Motorola, that won on the desktop.

    I recall that when they first tested the ARM1 they forgot to connect the power line to it, but miraculously it was working: it was being powered by the logic signals of the host BBC Micro (it was developed as a CPU expansion fitted to the BBC Micro). It was power efficient even then, although that wasn't a design goal.

    Also, give Apple credit for seeing what a great CPU it was when they were looking for a processor for the Newton. It was a deal with Apple that led to the formation of ARM as a separate company, since Apple was not going to be supplied by a competitor in the computer market.

    1. Anonymous Coward
      Anonymous Coward

      Re: Power line.

      It's always been possible to power CMOS processors (and CMOS logic) from glue logic via their I/O port ESD protection diodes; it's just a feature of ESD protection in CMOS chips, so it's not specifically a shining example of low-power chip design.

      Especially when you're hooking them up to logic chips that are capable of supplying tens of milliamps per pin like those used in the Beeb, even the venerable 6502 could be run that way if by some oversight you managed to forget to connect the power pin.

      1. Tom 7

        Re: Power line. ESD protection diodes???

        Did we have them then? ISTR shorting the world to earth around then.

        1. Anonymous Coward
          Anonymous Coward

          Re: Power line. ESD protection diodes???

          We did, NMOS 6502 chips used protection diodes in the same way CMOS does. They're nice to have but not a guarantee, you could punch through them with a few thousand volts and it's really easy to generate tens of thousands of volts static.

    2. Faceless Man

      That's what always bothers me when people list the Newton as one of Apple's failures. The platform itself was more successful than most people acknowledge, but the key thing is that it is singularly responsible for ARM becoming the mobile processor of choice for everybody since then.

  9. kryptylomese

    They are going to be successful in the server market in the near future - might be worth a punt on a few shares!

    1. Anonymous Coward
      Anonymous Coward

      might be worth a punt on a few shares!

      Yes, except that the market value of a share is entirely unrelated to the real world value of the company behind it (e.g. Facebook...).

      best of luck anyway.

      1. kryptylomese

        Re: might be worth a punt on a few shares!

        Companies' stock normally increases in value when the company has predicted growth, so there is a link to the real world.

        1. Anonymous Coward
          Anonymous Coward

          Re: might be worth a punt on a few shares!

          "Companies stock normally increases in value when the company has predicted growth so there is a link to the real world."

          Might be that, might be "illogical exuberance" (helped by share buybacks and such, funded by money printed at taxpayers' expense).

        2. Michael Wojcik Silver badge

          Re: might be worth a punt on a few shares!

          Companies' stock normally increases in value when the company has predicted growth so there is a link to the real world

          Please provide evidence of a statistically significant correlation, in general, between predictions of growth and subsequent real-world results.

  10. Chris Evans

    No mention that the original design files were found and are visualised on the web!

    I'm surprised there was no mention of: http://www.theregister.co.uk/2015/11/28/arm1_visualized/

    "In the case of the ARM1, to celebrate the 25th anniversary of Brit chip architects ARM, the team have managed to lay their hands on the original designs of the 32-bit RISC processor core, and visualized it for the web" http://visual6502.org/sim/varm/armgl.html

    1. kenshirriff

      Re: No mention that the original design files were found and are visualised on the web!

      Yes, the Visual 6502 team should definitely get credit for obtaining the ARM1 layout and building a simulator - my reverse engineering is based on their data. Dave Mugridge has also been doing a lot of ARM1 reverse engineering, and his website (daveshacks.blogspot.com) is worth visiting.

    2. diodesign (Written by Reg staff) Silver badge

      Re: No mention that the original design files were found and are visualised on the web!

      The visualization effort is linked to in the very first sentence of the article :-(

      C.

  11. Stephen 24
    WTF?

    I've always wanted to know

    Can an experienced engineer look at a chip design photo/schematic and see how it "works" or do they have to drill down to the details and follow the paths? Are there common "patterns" that repeat across chips & manufacturers or is each chip unique?

    1. Tom 7

      Re: I've always wanted to know

      In the early days - when there were fewer than about 5k devices on a chip - there was not a lot of automation available, and many hours were spent laying things out by hand, so you would learn what the various bits looked like. Worse than that, I used to lay out NMOS on a Tektronix storage scope, which wrote on a green screen with a very bright stream of electrons that left a glow a bit brighter than the dark green background. After a long day on one of these you could stand in the pub and whole parts of the circuit would literally flash before your eyes!

      Towards the end of the eighties a huge amount of CAD had been developed, and you would only hand-craft (and hence recognise) repeated parts to get the maximum utilisation of space, and these parts would stay with you for a long time.

      I worked on some (for then) ultra-high-speed bipolar, and the (relatively) high power consumption meant chips didn't have too many components and were largely pad-limited. The pads the connecting wires were attached to were on a 100µ 'grid', so you could spend a month or more trying to shrink some part of the circuit by 10 or 20µ so the whole chip size would drop by 100µ and you could get another 15 devices from a 4" wafer. When you spend that amount of time on something, you remember it.

      Not sure what it's like these days - I wanted to get into it again, but they wanted £40k just to see the process details, FFS.

    2. Michael Wojcik Silver badge

      Re: I've always wanted to know

      Even an undergrad student who's done a little chip-design work in class can often recognize some common structures in relatively simple architectures - things like barrel shifters (note the one in the ARM1 picture in the article). I remember looking at chip layouts with friends in the CS and EE programs, guessing what various sections were.

      Someone with more experience can no doubt do better.

      But it's just large-scale stuff. It's like looking at an aerial shot of a city: you can say, oh, here's a park, and here's a high-rise office district, and this is a residential neighborhood. But you can't read the street signs without getting a closer look.
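The barrel shifter mentioned above is recognisable on a die precisely because its wiring is a regular diagonal crossbar. A minimal Python model (an illustration of the concept, not ARM1 netlist behaviour) of what that block computes - a full-width rotate in a single step, rather than one bit position per clock:

```python
MASK32 = 0xFFFFFFFF  # model a 32-bit datapath

def barrel_ror(value, amount):
    """32-bit rotate-right in one step. A barrel shifter is a grid of
    multiplexers routing every input bit directly to its output
    position, which is why it appears as a distinctive diagonal
    criss-cross pattern in die photos."""
    amount &= 31                       # rotation is modulo the word size
    if amount == 0:
        return value & MASK32
    return ((value >> amount) | (value << (32 - amount))) & MASK32
```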

  12. Mike 16

    Maybe

    An experienced engineer can learn a lot by looking at a chip plot, less by looking at a die photo. Much less these days of dizzying numbers of layers. It is not unknown to have "traps", structures that look like one sort of transistor but in fact (due to implant or doping) behave differently than expected. The chip plot will probably be more "honest", but also less likely to be available to an adversary.

    Now, if the question is about non-adversarial (e.g. scholarly) study, again, the increasing complexity of modern chips means things don't just "jump out at you", and much of modern (or not so modern) chips is generated rather than hand drawn. One would need some familiarity with the idioms of the designer, or the compiler.

  13. Anonymous Tribble

    I managed to get a look at some of the Acorn internal memos concerning the ARM2 and its support chips in the original Archimedes design. Very interesting. Discussions about which audio capabilities (stereo output and/or input) to put on the IO chip, and "how much sand it would require" :)

    I found the ARM1 to be a great chip to use on my BBC Micro and went on to create stuff on the ARM2 and 3 on my Archimedes.
