For a regular user, a computer is basically a toaster or microwave oven that shows information instead of toasting bread or heating food.
Seriously, how many of you know what components are inside a microwave and how they interact? I bet precious few know what the "power" setting actually does, or how it modulates the power. Okay, some may know that a magical device called a magnetron has something to do with it (seeing as how we're all a bunch of nerds), but how many understand how the magnetron actually works? (Hmmm... is that a traffic spike on howstuffworks.com that I see?)
More to the point, does it help?
Up to a point, maybe (understanding the power modulation lets you heat food without getting that disgusting microwaved surface texture, if you have the patience). But most people won't care about those little differences, and they shouldn't have to. They spend their days perfecting their skills as auto mechanics, doctors, store attendants, or possibly rocket scientists. The computer is a tool they use to reach a goal: assisting with their jobs, relieving tension by playing a game (or hardcore pornography...).
The regular user only wants an information access and manipulation tool. Getting it in one package, working, with enough of everything to run well is all they care about. 1GB RAM Vista (or 7) machines are NOT what people want. Cables and crap? Nope. Upgradeability? Some people THINK they need or want that, usually so they can buy a wildly underspecced machine with the aim of upgrading it in a year or so. They come back in two or three years, by which time the CPU socket on the motherboard they bought isn't supported anymore, so no new CPUs are available; the best I can do is sell them an old one that's 15% more powerful, on the cheap. That still leaves their computers old and weak, often with cheap PSUs, marginal HDDs, and the like. The upgrade bill (with labor) comes to half the cost of a new machine. Possibly more.
This is not economical for most users. It results in horrible computers and a horrible user experience: the user can never do what they WANT to be able to do, because even when new their machines are horribly underpowered thanks to their misguided attempts at saving money. I'm not saying that cheaper machines are inherently bad, only that for the general user they usually are. I run a 9-year-old small form factor machine (a Shuttle) running Linux, and it runs great AND is pretty. My mother runs a computer that is at least twice as powerful and hates every minute of using it, because it won't support what she's trying to do. Cheap machine, didn't listen to advice...
Face it: the only people for whom easy upgradeability is really an issue are a few nerds like us. We are something like 2-5% of the market. And a good portion of those nerds are experienced enough to rip apart even the toughest all-in-one. I mean, hell, we change our own iPhone/Android phone screens. I've replaced capacitors in two failing Samsung LCD monitors.
We, the tinkerers and nerds, will fiddle with stuff no matter what, and figure it out. The rest of the users would be best served by a decently powerful all-in-one. What Apple have done well is market their particular AIO machines as being much more than good enough for any normal user. Those users want an information tool in their home, and they think of it the way they think of their food-cooling device (the fridge), their food-warming devices, their audio/video playing and displaying machines, and their transportation machines.
Users are task-focused; that is, they want to complete tasks of their own choosing. They have no interest in the process that makes a task possible, only in whether the task is or is not possible (a binary approach to understanding the problem).