I'm just glad the Spectrum Next turned out to be a real thing, and not a damp squib like the Vega+ etc...
If you approach a problem with the right tools, right people and right discipline, it will be a success.
The twin planets of business and consumer technologies have been locked in a game of Pong for decades. The Apple II was aimed at hobbyists, but catalysed the revolution that put a PC on every office desk. The GUI needed hardware so expensive it could only come in boxes with corporate-sized price tags, until the Atari ST and …
I backed the 'accelerated' version, so I'm still waiting — patiently, and with an increasingly happy and lunatic smile — for its arrival. I'm really looking forward to getting mine 'in the flesh'. It's been great hearing about others receiving theirs in the meantime. There's been a great buzz in the community about it. I hear the 1980s calling me back :-)
As the MiST project brought the Atari to the FPGA, the MiSTer project is worth looking into. Some people are now selling assembled and pre-programmed units complete with expansion boards... all built on an FPGA to run arcade systems as well as vintage consoles and home computers. It's breathtaking to see what's being done by some dedicated people.
I think we all (1980s techie teens) like to imagine improvements that we could have made to our home computers of the time. I often imagine what the Dragon 32 would have been like with a better video chip and a memory controller. It was possible to write code for it that could run at any memory location without a linker.
I've repeatedly said that the 68008 should have been the CPU for the original IBM PC. The whole of 80s programming was held back by the 8088 and the use of MS-DOS.
I often wonder if the Dragon would have taken over the world if the 68008 had been used in it, along with CP/M-68K, which I think was available when the Dragon came out. I'm not sure how much use Kildall's XLT86 got but he really shot himself in both feet with that!
The 68008 wasn't ready in time. It wasn't released until the year after the IBM PC was launched, and IBM engineering needed lots of samples of the chosen CPU in order to validate it. The 8088 may have been crappy and slow, but it was available, and similar enough programmatically to the 8080 the software world was used to, to make it an attractive choice at the time.
For devices where the logic configuration is stored in non-volatile memory that is the case (at the extreme there are one-time-programmable devices), but for those using static RAM to store the configuration there is no reprogramming limit.
E.g. in the Next it's a Xilinx Spartan 6, which does indeed "store the customized configuration data in SRAM-type internal latches ... The configuration storage is volatile and must be reloaded whenever the FPGA is powered up." per its data sheet.
But even if the FPGA itself isn't a concern, surely that just moves the failure to the external flash memory that contains the FPGA's bitstream? There is an update procedure for a Next with a progress bar, so I don't think it's streamed from the SD card at every launch.
This is a shame, as I own a Spectrum Next and justified the purchase to myself as wanting to play around with FPGA development. "I promise, it's for school" I told myself.
Not familiar with the Next but it may well be transferred to an onboard chip for initial loading, some FPGAs have only partial support for the serial memory protocols and lack e.g. clock stretching if the data source can't keep up with the device. I wouldn't worry about it "wearing out" though, some of the later chips go as far as supporting partial re-programming: you can reconfigure half the chip while the other half is still running whatever it is already programmed with.
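For what it's worth, a back-of-the-envelope sum suggests wear is a non-issue. Both figures below are assumptions (a typical NOR flash endurance rating and a deliberately generous update cadence), not Next specifics:

```python
# Rough flash-endurance sanity check. Both numbers are assumptions:
# ~100k erase/write cycles is a typical NOR flash rating, and one
# firmware update a week is far more often than any Next will see.
endurance_cycles = 100_000
updates_per_week = 1

years_to_wear_out = endurance_cycles / (updates_per_week * 52)
print(f"~{years_to_wear_out:.0f} years of weekly updates")  # ~1923 years
```

Even if the real endurance were ten times worse, you would still be looking at a couple of centuries of weekly rewrites.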
The F in FPGA comes from Field: originally the metal connection layer was the bit you programmed to set the functionality on a field of components. A chip back then would be made with about 10 expensive quartz masks, so for small production runs the original FPGAs were basically a shit load of components where you changed only the wiring layer and connections - two expensive masks to prototype on a relatively mass-produced blank set of components, where the cost of the other 8 became insignificant.
Newer versions are a lot more flexible as explained elsewhere but are still considerably more expensive than something that could be produced in the millions using custom layout of the whole chip.
"The F in FPGA comes from Field where the metal connection was that bit you programmed to set the functionality on a Field of components originally . A chip then would be made with about 10 expensive quartz masks and so for small production runs the original FPGAs where basically a shit load of components that you changed the wiring layer and connections - so two expensive masks to prototype on a relatively mass produced blank set of components where the cost of the other 8 became insignificant."
What you are describing there is an uncommitted logic array or ULA, and they are still very much a thing. FPGAs are not mask programmed by definition: the "field" is where they are programmed, i.e. after leaving the fab.
You can build an entire Galaxian machine in logic and then reprogram it to be a Defender machine. Not just the glue logic, but the 6502 and the video chip in programmable logic. There is a whole geekdom where they do this sort of thing.
One of the problems is that the ZX Spectrum was designed as a stand-alone computer. It has no networking, it has no security, and it is used by one user at a time.
The problem with legacy corporate IT is that it is generally used by many people at diverse locations, whether it be in an office block or at separate physical locations in other cities or countries. If the system is easy to isolate, there are no problems, but for systems that have to stay online, but are no longer secure, an FPGA won't help there.
On the other hand, FPGAs are a great way for modelling problems going forward. Doesn't Azure already offer FPGA instances?
Non-networked desktop machines were certainly a legacy thing; e.g. the Apple II, Commodore PET, plenty of CP/M machines and many early desktop PCs were stand-alone.
Networks came in pretty quickly though, but in that era much of it was terminals into mainframes. I guess it's all about which aspect of legacy IT.
The ZX Spectrum did have networking quite early on via Interface 1, BBC machines had Econet, and many also dialled into central servers, e.g. BBS-type services and MUDs.
I'm pretty sure Intel have developed CPUs with FPGA components on die for specific compute too.
"I'm pretty sure Intel have developed CPUs with FPGA components on die for specific compute too."
The Intel Agilex SoC FPGA range has quad-core 64-bit Arm Cortex-A53 CPUs on board - not sure if that's a case of a CPU with an FPGA on board, or an FPGA with a CPU on board.
Networks came in quickly in some places. A lot of floppies went around before the price of the cards and cabling dropped to the point where it was better than wandering up a couple of floors to drop off the data/code. In fact ISTR it coincided with MS Word producing massive files, where a page of A4 with a couple of different fonts wouldn't fit on a single floppy and management types couldn't work out which order to put Disk1 and Disk2 in, even if they could remember how to concatenate files.
I don't think the point of this article is that it's a ZX Spectrum; it's that it's an FPGA programmed to act precisely like a legacy system, but with improvements, which could have great applications in the bigger IT world. Knocking a Speccy for not being corporate IT is not the aim of this game (which is not played via a Kempston joystick).
It's always the support. Those legacy systems aren't a problem because they're old or because they're slow, they're a problem because the hardware supplier has end of life'd them, the OS won't get any more patches and the last person in the company who understands the software is due to retire in 18 months.
Imagine a mid-80s DOS machine running something; both DOS and the software came with lifetime licences (as was done then), so why would they need any more patches?
Now put this esoteric software on an emulator or a VM on a Windows 10 machine. Now you do have a problem with updates and licences and the possibility that the emulator or VM won't replicate everything 100%.
The last person in the company who understands it has 18 months to write and rewrite a document until the PFY can follow the instructions. That's not an insurmountable problem.
Explaining to management that the senior engineer needs to spend time writing documentation... Is a harder problem. We have heard time and again the tales in these halls of retired engineers being called back at £$€¥ cost.
Having both the PFY(s) and the BOFH out of action for training (cus y'know... together) is, if not insurmountable, going to be like pulling teeth.
Back to the article, the simplicity of the machine has a lot going for it in terms of software reliability, instead of the gigabyte-sized monstrosities which constantly update and pull the rug out from under your feet.
I think this is the droid you're looking for (might need some Jedi magic though).
Maybe initially in the USA?
Anyway, it was business people, thanks to VisiCalc, that made the Apple II a success. Only very rich hobbyists could afford it, and knowledgeable ones assembled S100-based systems.
The Apple II had a built-in keyboard, colour only in the USA, and a 40-column display (maybe upper case only).
I bought one for our business, falling for the hype. We spent more on accessories and upgrades than on the computer.
"The Apple II was aimed at hobbyists, but catalysed the revolution that put a PC on every office desk"
Putting your experience together with the above quote, it was still a small minority. It was the IBM PC's 8088 and then MS-DOS that "catalysed the revolution that put a PC on every office desk".
I'd like to see the size of the chip that could do Windows in hardware.
On another matter, "GPUs grew out of gaming". Globally I have to admit that that is likely true, but the very first graphics card I bought was an Orchid Fahrenheit 1280 in 1992. It had an entire MB of RAM!
I bought it because it promised accelerated performance for Windows 3.11. For Windows!
Of course, the next graphics card upgrade I purchased was a Diamond Stealth in 1995 (not entirely sure it's that exact version). That was not for accelerating Windows, I'll admit, although by that time, accelerating Windows was, apparently, par for the course.
It wasn't until 1997 that my hunt for performance started, badly, with the Matrox Mystique. Needless to say, I went Voodoo 2 in the year that followed, and then it was Nvidia that reigned supreme in my PCs.
But I'll never forget that Orchid card.
"I'd like to see the size of the chip that could do Windows in hardware."
Seriously, why not? Then again, our computers use the Von Neumann architecture. Basically the CPU is the pump at the centre, the memory and hard drive are tanks, and the data is the liquid.
What the FPGA offers is to make everything in the computer pumps and tanks.
So instead of machine code waiting in RAM for its turn through the CPU, it will run itself.
Matrox excelled in CRT driving - they had much better DACs than anyone else back in the day. You needed to pair them with something like a Sony Trinitron monitor (if you could stomach the ghost lines). From the G200 on they could drive twin monitors. Nowadays I think 6-10 are possible.
After the Mystique (and later, to an extent, the G400) they dropped out of the consumer market. The Parhelia was no use for gaming - it was uncompetitive when it came to market.
GPUs are the co-processors of our time. I remember Intel sold the maths chip separately and it made a big difference. GPUs don't integrate as cleanly into the computer as co-processors did. The CPU had hooks for calling either software or hardware if present. The GPU has to be specifically loaded with a procedure and data and set on its way. The load/unloading process is more work than many of the tasks, so it has to be a really heavy task to make it worth the effort.
I can see AMD more closely integrating GPU tasks into the CPU now they are putting mini VEGA cores onto RYZENs.
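That load/unload trade-off can be sketched with a toy cost model. Every number here is made up for illustration (rough orders of magnitude for per-element compute, bus bandwidth and launch latency), not a benchmark:

```python
# Toy offload model: GPU work only pays off once the compute saved
# outweighs the fixed cost of shipping data there and back.

def cpu_time(n, secs_per_elem=10e-9):
    """Pretend the CPU takes ~10 ns per element."""
    return n * secs_per_elem

def gpu_time(n, secs_per_elem=0.5e-9, bus_bytes_per_sec=10e9, launch=20e-6):
    """GPU is 20x faster per element, but pays transfer + launch overhead."""
    transfer = 2 * (n * 4) / bus_bytes_per_sec  # 4-byte floats, both ways
    return launch + transfer + n * secs_per_elem

for n in (1_000, 100_000, 10_000_000):
    print(n, "offload wins" if gpu_time(n) < cpu_time(n) else "CPU wins")
```

With these made-up numbers the crossover sits at a few thousand elements: below that the fixed overhead dominates and the CPU wins, which is exactly the "really heavy task" point above.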
Mostly an FPGA is used when the volume is too low for an ASIC.
Almost none are in applications where the configuration is changed during use. They simulate a HW design, which obviously can have a custom-designed CPU core, a real CPU core or no CPU.
Almost all are only seen during product development. The same files can produce an ASIC design, which will use a fraction of the power and may have fewer pins.
You'd only NOT use an ASIC if:
1) You are using initial customers as beta testers
2) You are not quite sure of customers requirements (see 1)
3) Volume is too low.
4) The EXTREMELY rare case where the FPGA functions will change during use and this can't be done by loading a table into Flash or RAM paired with an ASIC or inside the ASIC.
Also Spectrum, Apple II, Amiga etc have generally nothing to do with Legacy IT.
It was Spreadsheets, then databases and wordprocessing that made PCs a success in business. Hobby/Home computing was more about games and quickly diverged apart from niches like MIDI.
Perhaps the PCW series was the first and last home computer mainly NOT for games, till laptops & PCs were cheap enough for home users who were neither hobbyists nor gamers.
Now PCs bought for games are a niche, compared with the phones, tablets and consoles bought for games.
"It was Spreadsheets, then databases and wordprocessing that made PCs a success in business. Hobby/Home computing was more about games and quickly diverged apart from niches like MIDI."
There were people crowbarring spreadsheets, databases, and word processors into all sorts of computers before the x86 PC took over. Also, there was CP/M, and towards the end of CP/M's life many home computers could run it (BBC Micro, Amstrad CPC, Spectrum +3, Commodore 128).
Also, real hardware is better than emulation for computers up to about the mid 90s. If you want an example, look at this (LGR, 20 minutes). Not quite FPGA (but FPGAs for the 8088 and x86 do exist), but it's still a new version of old hardware.
You could replace one 35-year-old box which is just about to die with brand-new hardware which maintains 100% compatibility with the old software and leave it running another 35 years should you need to, only hopefully with better hardware maintenance this time around. You'd have to be brave to claim you'd get the same lifetime out of a modern PC running Windows 10 and emulation software or a VM.