Re: That's cheating - he should do it in Minecraft.
Minecraft? It would be cheating to do it as a simulation.
Why not get back to reality and do it with Lego?
A bloke in Cambridge, UK, is building a computer processor using 14,000 individual transistors and 3,500 LEDs – all by hand, piece by piece. James Newman said his Mega Processor relies almost entirely on the hand-soldered components, and will ultimately demonstrate how data travels through and is processed in a simple CPU core …
I don't live nearby and don't have the time, but I would love to help out if I could. I have joked for several years about buying replacement gates/pixels/CCD cells etc. for stuff at work [TV engineering] (no one laughs for some reason… I know, I know).
I wonder if you can single-step the clock for debugging?
As a German engineer, and therefore a rather lazy person, I have to point out that there's a way to build such a computer with _much_ fewer parts at the expense of speed.
The idea is that you build a bit-serial computer. This means that lots of parts suddenly become a lot simpler. You can still have 16-bit words, but your ALU, for example, will just process one bit at a time. Your registers become shift registers with a one-bit input and a one-bit output. All your buses will also be one bit wide and clock in their values serially.
There's a book describing such a system; I think it's called "Elektronische Rechenmaschinen", and I think it describes a 20-bit machine working bit-serially. Back in the early days of building computers, reducing complexity was essential for many teams. Trading a factor of n in speed for a factor of n in complexity seemed a _really_ good idea back then, particularly since, then as now, computers were rarely fully utilized.
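For anyone wondering what "one bit at a time" means in practice, here's a minimal sketch (my own illustration, not taken from the book) of a bit-serial 16-bit add: a single full adder plus one carry flip-flop, clocked once per bit, LSB first.

```python
# Minimal sketch of a bit-serial add: one full adder plus a carry
# flip-flop, clocked once per bit, LSB first. Registers A and B behave
# as shift registers feeding out one bit per clock tick.

def bit_serial_add(a: int, b: int, width: int = 16) -> int:
    carry = 0
    result = 0
    for i in range(width):             # one loop iteration = one clock tick
        bit_a = (a >> i) & 1           # bit shifted out of register A
        bit_b = (b >> i) & 1           # bit shifted out of register B
        total = bit_a + bit_b + carry  # the single full adder
        result |= (total & 1) << i     # sum bit shifted into the result
        carry = total >> 1             # carry stored for the next tick
    return result                      # carry out of the top bit is dropped

assert bit_serial_add(12345, 54321) == (12345 + 54321) & 0xFFFF
```

One adder instead of sixteen, at the price of sixteen clock ticks per word: exactly the speed-for-complexity trade described above.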
>The idea is that you build a bit-serial computer.
That is exactly how most tube/valve computers were designed. It saved enormous amounts of circuitry, although a magnetic drum memory was mandatory to hold all the data and registers. And damn were they slow.
There's also bit-serial parallel computing: SIMD, with one instruction at a time broadcast to an array of one-bit processors. The ICL DAP, for instance, if there's anyone else out there who remembers that ill-fated project. I had great fun one summer learning to program it in assembly language.
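To make the contrast with the bit-serial idea above concrete, here's a toy model (my own illustration; it bears no resemblance to the DAP's actual instruction set) of one instruction being broadcast to an array of one-bit processing elements:

```python
# Toy model of bit-serial SIMD: one instruction broadcast to an array of
# one-bit processing elements (PEs), each with its own one-bit accumulator,
# one-bit carry and bit-addressed local memory.

N_PES = 64
acc   = [0] * N_PES                        # one-bit accumulator per PE
carry = [0] * N_PES                        # one-bit carry per PE
mem   = [[0] * 256 for _ in range(N_PES)]  # per-PE bit-addressed memory

def add_bitplane(addr: int) -> None:
    """Broadcast 'add the bit at addr' to every PE at once."""
    for p in range(N_PES):                 # in hardware: all PEs in one tick
        total = acc[p] + mem[p][addr] + carry[p]
        acc[p] = total & 1
        carry[p] = total >> 1

# A 16-bit add across all 64 PEs is then 16 broadcasts of add_bitplane,
# one per bit plane: slow per word, but 64 words advance in parallel.
```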
Even though I am a geek, I officially don't get why anyone would spend so much time/money/space doing this when you can just go out and buy a functionally much better CPU for pennies that would fit easily into a matchbox.
Heck, if he just wanted to see his processor design implemented, he could/should have just programmed it into a PLD or something.
"Even though I am a geek I oficially don't get why anyone would spend so much time/money/space doing this as you can just go out and buy a functionally much better CPU for pennies that would fit easily into a matchbox."
It's quite tricky to teach visually about CPU logic design using a CPU in a matchbox. It would be like training a chef to prepare food by giving him a fiver and telling him to go and buy a big mac.
And you matchbox wouldn't get you publicity, a shining CV, unlimited admiration.
Or have I been whooshed?
I can think of no nightmare worse than hand-wiring 14,000 of the TO-92 transistors pictured. They wiggle around unless you bend the leads, and bending the leads causes solder bridges. After all that, you'll find that one is in backwards and the slightly bent leads have anchored it down with such incredible strength that molten solder spatters everywhere when you pull it out.
The 6502 is not a stack-based machine. It does support a stack for push, pop, and subroutine-call instructions, but so does almost every other microprocessor. I think the 6502 is best described as an accumulator machine, where all arithmetic and logical instructions require one of the operands to be in the accumulator (A) register. (It was the first CPU I ever tried programming in machine language, on the Oric-1, which is why I go on about it...)
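For the curious, a rough model of what "accumulator machine" means (my own Python sketch of pseudo-6502 behaviour, not real 6502 semantics; in particular, ADC's carry flag is ignored here for brevity):

```python
# Sketch of the accumulator pattern: every ALU instruction combines the
# accumulator A with a memory operand and leaves the result in A.

A = 0
memory = {0x10: 7, 0x11: 5, 0x12: 0}

def lda(addr):                      # LDA: load accumulator from memory
    global A
    A = memory[addr]

def adc(addr):                      # ADC: add memory to accumulator
    global A                        # (real ADC also adds the carry flag)
    A = (A + memory[addr]) & 0xFF

def sta(addr):                      # STA: store accumulator to memory
    memory[addr] = A

lda(0x10); adc(0x11); sta(0x12)     # memory[0x12] is now 12
```

There is no memory-to-memory add: everything funnels through A, which is what keeps the instruction set and the silicon so small.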
If anyone is tempted to do something similar, a nice compromise might be to start off looking at the AMD 2900 family of chips: https://en.wikipedia.org/wiki/AMD_Am2900. This stuff starts at the ALU level. But being interested in this stuff, does anyone know if he went for a RISC setup with microcode?
--That is an amazing undertaking, and it takes a smart guy to build a processor from scratch.
--He obviously has a lot more free time and ambition than I have.
--Troubleshooting shouldn't be that bad as long as he has a diagram. He obviously understands the principles of operation, so he would just need to put a scope on the correct test point and make sure the signals are present. The behavior will tip you off as to what's wrong. Many years ago when I took electronics, we had a donated DEC PDP-11/70. Our instructor would sabotage it, and armed with blueprints and a scope, we'd have to find the faulty component or failed wiring.
--Looking at the MIPS figures in the processor comparison, it's obvious the MOS 6502 was the greatest bang for the buck in its day: half the transistors and half the clock speed of its competitors, and still just as fast.
That is all :)
All the logic that does computation is built transistor by transistor.
The only place LSI has been used is on the 7-segment display boards - apparently decoding hex (octal?) digits for a 7-segment display was too complex to do in transistors for something that is only there 'for debugging purposes'. To quote Newman...
"I spent a bit of time trying to work out how to do the 7-segment display using discrete transistors but the answer is vast. Really, really big. It would have near doubled the size of the thing and the circuitry for the display would have obscured the circuitry for the processor which would have undermined what I was trying to do. As its only for debug and not proper function I went for chips. This is definitely NOT cheating, it is just for debug. It is irritating though."
To me it's an amazing triumph of education over rationality. As far as I'm concerned, if "James" doesn't at LEAST get awarded a Masters (even an honorary one) for this, I'll be annoyed with the universities. I'm not sure it's worthy of a doctorate, although perhaps he deserves one in education. That said, many people have received doctorates for a lot less. Even in computer science.
Assuming the poor fucker doesn't already have one, which is why he's doing this in the first place.