@ jason 7
You're absolutely right - I had my start trying to get games to play on an 8086, and its successors through the nineties! I was also exposed to Acorn Archimedes machines and later networked (beige) Apple LC IIIs at school, and to Atari STs owned by people who had coveted their MIDI ports.
I fell asleep before I developed my point, which was that at the time, faster CPUs and more RAM dramatically improved the user experience. Buyers would therefore buy a new machine on spec against price, and money spent elsewhere was a 'waste' - so PC vendors would naturally put together the best components and sell them in the cheapest box. This meant there wasn't any incentive to make machines with the 'rough edges' taken off.
And that is fine. There is no reason for a company to make machines priced higher than their competitors', unless they are adding something that helps sell them. In the PC market at the time, these things would happen but required collaboration from various parties. Therefore, genuinely helpful technologies (remember when 'Plug and Play' was a selling point? Amiga and Apple already had it) took a while to filter through.
The disorganised PC market had advantages, though... what became standards tended to start as proprietary solutions to genuine user needs. Lots of sound cards were sold as being 'Sound Blaster-compatible', for example, and later 3D video card makers had to convince game developers to develop for their platform.