Legend of Zelda cracked with 6502 assembly language glitch

A 30-year-old bug in the iconic Nintendo game Legend of Zelda allows players to finish the game in minutes. A video posted to YouTube shows that, beneath what looks to be a fun game glitch, there is a fascinating bit of code manipulation in 6502 assembly. While executing the procedure requires a tediously …

  1. Lord Elpuss Silver badge

    Wow. Just - wow. I am in constant awe of people who figure things like this out.

  2. Mike 125

    Things have moved on.

    >>6502 assembly language glitch...

    >>...and in turn to overwrite memory used for other game functions, in effect "breaking" the game

    These days, we use advanced, high level languages like C to do exactly the same thing...

    1. Tromos

      Re: Things have moved on.

      "These days, we use advanced, high level languages like C to do exactly the same thing..."

      And we need to spend less than half as much for the memory to run it. Mind you, that is 8 gigabytes now as opposed to 8 kilobytes then.

      1. Anonymous Coward
        Anonymous Coward

        Re: Things have moved on.

        Well C was around then too and fairly popular. I used to write in 6502 assembler.... and it seems I can remember more of it than is healthy, just a pity I can't remember where I left my car keys....

        1. Mage Silver badge

          Re: Things have moved on.

Well, C was invented as a sort of portable macro assembler to make it easy to port UNIX. It's still too dependent on macros. In that sense, C isn't a modern high-level language the way C++ is, though C++ suffered from imposed C compatibility.

Still, 40 years later we are getting vulnerabilities due to array bound violations (often strings) and bugs due to unexpected side effects of macro expansion. All too often, C++ programs use C-style strings, which are insecure.

          ARM was inspired by 6502.

          1. Frumious Bandersnatch

            Re: Things have moved on.

            > ARM was inspired by 6502.

            Yes and no.

            http://www.theregister.co.uk/2009/06/11/pcw?page=2:

            Sophie Wilson, the best 6502 programmer ever, became disappointed with what she could do with the BBC Micro, and went off on her own to design a RISC processor that would do all the good things she liked about the 6502, and all the other things which she wished the 6502 could do.

            So apparently the nice thing about 6502 was the simplicity of it, but they were determined to build something completely different (a RISC processor with no real architectural heritage from the 6502 itself):

            https://people.cs.clemson.edu/~mark/admired_designs.html#wilson

I can still write in hex for [the 6502] - things like A9 (LDA #) are tattooed on the inside of my skull. The assembly language syntax (but obviously not the mnemonics or the way you write code) and general feel of things are inspirations for ARM's assembly language and also for FirePath's. I'd hesitate to say that the actual design of the 6502 inspired anything in particular - both ARM and FirePath come from that mysterious ideas pool which we can't really define (it's hard to believe that ARM was designed just from using the 6502, 16032 and reading the original Berkeley RISC I paper - ARM seems to have not much in common with any of them!)

          2. Lennart Sorensen

            Re: Things have moved on.

Well, if by inspired you mean "definitely don't do it that way". Acorn wanted a 32-bit chip, and the 6502's makers were planning a 16-bit chip next; the Acorn guys thought what they were doing looked easy and decided they could do it themselves. Someone also happened to be reading an IBM document about RISC at the time, as far as I recall, and thought that looked easy too. It probably wasn't actually easy, but clearly they were in fact that good.

    2. Michael Strorm Silver badge

      Re: Things have moved on.

      Thing is, C *isn't* really that high level a language by modern standards. It's been around since the early 1970s (i.e. 45 years).

      The version I know best is ANSI C (C89). (I've no idea what the newer versions (C99 and C11) are like, but I assume they haven't fundamentally changed.)

      A lot of the features and design are still very close to the underlying machine. In that sense, it's been compared to a high-level assembler. This is a little unfair, as it's miles more advanced than that, and far more powerful and advanced in terms of programming structures than old-school BASIC. (#)

It's also- at least in its C89 form- not a bloated language, having virtually no "fat" and being elegant and logical.

      As a combination of high-level and low-level requirements, I don't think it could have been done any better.

But it's still ultimately quite close to the underlying architecture. You can (could?) easily get a segmentation fault or buffer overflow vulnerability if you didn't (e.g.) allocate and keep track of your string buffers- i.e. memory- correctly. (##) But that's because C- or at least the core language- isn't, and never claimed to be, providing (e.g.) a fancy, abstract garbage-collection model.

      Regardless, it's surprisingly advanced for something that is 44 years old, but it's still not really a fancy modern language any more.

      (#) In fact, the core syntax and general design are the basis for countless newer languages such as C++, Java, C#, etc.

(##) In fact, limited string handling and the fact that you have to manually keep track of allocated memory are probably its two most notable weak points
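To make point (##) concrete, here is a minimal sketch (my own illustration, not from any game's code) of the classic pitfall: `strcpy()` does no bounds checking, so a source longer than the destination silently overwrites adjacent memory. A bounded wrapper avoids it:

```c
#include <string.h>

/* Bounded alternative to strcpy(). strcpy() performs no bounds
 * checking, so a source longer than the destination overwrites
 * adjacent memory (a classic buffer overflow). This version never
 * writes more than dstsize bytes and always NUL-terminates. */
void safe_copy(char *dst, size_t dstsize, const char *src)
{
    if (dstsize == 0)
        return;
    strncpy(dst, src, dstsize - 1);  /* copies at most dstsize-1 chars */
    dst[dstsize - 1] = '\0';         /* strncpy may leave dst unterminated */
}
```

With an 8-byte buffer, copying a long string yields just its first 7 characters plus the terminator, instead of trampling whatever lives next to the buffer.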

      1. Frumious Bandersnatch

        BASIC. (#) ... correctly. (##) But

        1. You're using stringification (#) outside a #define

        2. (##) evaluates to (), which isn't allowed outside function declarations
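For anyone who missed the joke: `#` and `##` are preprocessor operators that only have meaning inside a `#define`. A tiny demonstration (macro names here are my own, chosen for illustration):

```c
#include <string.h>

/* '#' stringifies a macro argument; '##' pastes two tokens into one.
 * Outside a #define, neither operator means anything - hence the gag. */
#define STRINGIFY(x) #x
#define PASTE(a, b)  a##b

/* Exercises both operators; returns 1 if they expand as expected. */
int macro_demo(void)
{
    int PASTE(var, 1) = 42;  /* a##b pastes into the identifier 'var1' */
    return strcmp(STRINGIFY(hello), "hello") == 0  /* #x -> "hello" */
        && var1 == 42;
}
```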

  3. Anonymous Coward
    Anonymous Coward

    Wow...

    This changes the no sword challenge a tad.

  4. David Austin
    Unhappy

    Did you have to use Iwata as the picture?

    I'm going to be sad all day now - I miss that man so much.

  5. Peter X
    Happy

    This is fantastic!

    Fantastic this is!

Was this discovered by (a) reverse engineering the entire code base by hand, (b) noticing an occasional crash in the graveyard and wondering why, or (c) something completely different?

    I'd imagine (a). Very impressive whatever the reason though!!

  6. Mephistro
    Happy

    This reminds me of a funny exploit ...

    ... I found in the Donkey Kong JR arcade machine.

In short, you could input some text in the high-scores board after beating the game and then delete more characters (I think it was 10) than you had input. The possible results included the main character's sprite inverted, random scan lines, inverted sprite movement and random colour blocks on the screen. All this suffered by the poor sod playing after me, of course! :-)

I usually did this only when the guys waiting to play after me were a little too pushy. Some free education for them!

I checked a few years ago and the exploit also works when the game is run under the MAME emulator, so you can try this yourselves.
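The bug class described above can be sketched in a few lines of C. This is a hypothetical reconstruction (the struct, names and layout are my own, not the actual Donkey Kong Jr. code): a name-entry routine whose delete key decrements a cursor with no lower bound, so deleting more characters than were typed drives the index negative and the blanking writes land in whatever memory precedes the buffer - sprite data, colour bytes, etc.

```c
/* Hypothetical sketch of the underflow, not the real arcade code. */
enum { NAME_LEN = 10 };

struct entry {
    char name[NAME_LEN];
    int  cursor;           /* next free position in name[] */
};

/* Buggy delete: no bounds check, so cursor can go negative and
 * name[cursor] then indexes before the buffer (undefined behaviour,
 * corrupting adjacent memory on real hardware). */
void delete_char_buggy(struct entry *e)
{
    e->cursor--;
    e->name[e->cursor] = ' ';
}

/* Fixed delete: clamp the cursor at the start of the buffer. */
void delete_char_fixed(struct entry *e)
{
    if (e->cursor > 0)
        e->name[--e->cursor] = ' ';
}
```

With the fixed version, extra presses of delete on an empty name are simply ignored rather than scribbling over neighbouring game state.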

    1. Anonymous Coward
      Anonymous Coward

      Old meme is old

      I guess you could say they were being forced to play...

      (puts sunglasses on)

...Wonky Kong.

      YEEEEEEAAAAAAAAAAAAH!

  7. RichText
    Pint

It's not the only Zelda exploit, either...

A saved-game exploit allowed homebrew software installation onto the Nintendo Wii back in the day.

    http://zelda.wikia.com/wiki/Twilight_Hack

  8. Mike 16

    Nit Pick

    "Well Actually..." :-) the Famicom/NES does not use a 6502. It uses a "6502-like" processor made by (IIRC) Ricoh. A real 6502 (or licensed version) would have the decimal addition feature, but that would require paying a license fee, since MOS Technology had a patent on it.

    I do have to wonder if anybody ever did DRM based on processor detection.
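The missing feature is the 6502's decimal (D) flag, which makes ADC treat its operands as packed BCD. A rough C sketch of the difference (my own illustration, ignoring carry-in and carry-out for brevity) shows why comparing results would let code tell the chips apart, as the comment above wonders:

```c
#include <stdint.h>

/* What ADC does on a real 6502 with the decimal flag set: operands
 * are packed BCD, and each nibble is corrected so the sum is also
 * BCD. The NES's Ricoh 2A03 omits this circuitry, so the same opcode
 * performs a plain binary add there. Carry in/out ignored here. */
uint8_t adc_decimal(uint8_t a, uint8_t b)
{
    uint8_t lo = (a & 0x0F) + (b & 0x0F);
    uint8_t hi = (a >> 4) + (b >> 4);
    if (lo > 9) { lo -= 10; hi++; }  /* decimal-correct the low digit */
    if (hi > 9) { hi -= 10; }        /* high digit wraps; carry dropped */
    return (uint8_t)((hi << 4) | lo);
}
```

So $09 + $01 gives $10 in decimal mode, where a 2A03 would produce the binary sum $0A - a one-instruction litmus test between the two processors.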

    1. Brewster's Angle Grinder Silver badge

      Re: Nit Pick

      I didn't know that about the 6502. I never did much 6502 programming and just thought DAA was another thing that had been RISCed away.

  9. razorfishsl

The same bug existed in Manic Miner on the Z80.

One of the sprites went off through memory, corrupting the other screens and making them inaccessible.

In those days we did not have "C" or even reasonable development environments. We also had to ensure the games were as bug-free as possible, because we had no "update" mechanism.
