Re: Missing the point, I think...
10^9 core MPP systems are now viable. At that scale we can drop the Turing Machine model,
Not really. We're still stuck with the Turing model in an abstract sense and the von Neumann model in more practical terms. We just have to adapt them to be more aware of multi-core and multi-processor systems. And in fact, we pretty much did that years ago, and there hasn't been any great paradigm shift.
and progress to an Object Machine, where every element of data is an active processing element
It sounds like you're talking about agent-based programming. Again, it hasn't caught on, except in writing botnets and perhaps back-ends for massively-multiplayer online games.
generating random sequences of logic and see if they do anything useful
And just how do you decide what's "useful"? Or as Robert Pirsig put it in Zen and the Art of Motorcycle Maintenance, "And what is good, Phaedrus, And what is not good—Need we ask anyone to tell us these things?" You'd probably enjoy reading that, since it's really about philosophy, not hard computer science.
because all software will be written by software(*)
Of course. And the Singularity will arrive and bathing in Unicorn Milk will keep us young forever.
do not just binary digital processing but higher-base hardware processing ... feed a base-3 digital processor a compatible pair of base-2 & base-3 instructions
Hmm... Are you really amanfrommars in disguise? If so I claim my £5.
But seriously, do you even know what Turing-complete means? In particular, a Turing machine can be re-expressed in terms of Gödel numbers, which in turn can be mapped onto the set of natural numbers. Crucially, all practical number bases are isomorphic to each other, so binary, ternary, base 10 (or balanced ternary, or whatever) all have the same expressive power, and there's no theoretical reason to favour one over another. It only comes down to issues of practicality. For most purposes binary is good enough; it's only if you want to represent certain numbers with a finite number of digits that you might want to consider other bases (the string representing 1/10 is infinitely long in binary, for example, while it's just "0.1" in decimal or binary-coded decimal).

And in case you're wondering, going from the natural numbers to the reals doesn't magically grant your computer new powers either: the naturals are perfectly sufficient for "universal" computation, so, eg, a phinary-based computer can't do anything more than a binary one can, except be a pain to build and program. Another book recommendation for you: you might like Gödel, Escher, Bach: An Eternal Golden Braid...
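If you want to convince yourself of the finite-representation point, here's a quick sketch (function name and example are mine, just for illustration): a fraction has a terminating expansion in base b exactly when every prime factor of its reduced denominator also divides b.

```python
from fractions import Fraction
from math import gcd

def terminates(x: Fraction, base: int) -> bool:
    """True iff x has a finite digit expansion in the given base."""
    d = x.denominator
    # Repeatedly strip out any prime factors d shares with the base.
    while (g := gcd(d, base)) > 1:
        d //= g
    # Only factors not dividing the base remain; none means it terminates.
    return d == 1

print(terminates(Fraction(1, 10), 2))   # False: 1/10 never terminates in binary
print(terminates(Fraction(1, 10), 10))  # True: it's just "0.1" in decimal
```

The same test shows that no base is privileged, only better or worse suited to particular denominators (1/3 terminates in ternary but not in binary or decimal).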
(*) Actually, there is one kind of "program that writes programs" that can benefit from having a massive number of cores to work with, though I mean "program" in the kind of mathematical sense that Turing did, rather than the way you think of it (eg, a word-processing package). I'm thinking of something like Turbo Codes, which are effectively bit-level programs that tell a receiving computer how to reconstruct some embedded data even if some of the bits are dropped or corrupted in transit.
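Turbo codes themselves are far too hairy for a comment box, but a toy stand-in (my choice, not anything the grandparent post mentioned), the triple-repetition code with majority-vote decoding, shows the same "redundant bits as instructions for the receiver" idea in miniature:

```python
def encode(bits):
    """Triple each bit so the receiver can outvote a single corruption."""
    return [b for b in bits for _ in range(3)]

def decode(received):
    """Majority vote over each group of three received bits."""
    return [int(sum(received[i:i + 3]) >= 2)
            for i in range(0, len(received), 3)]

msg = [1, 0, 1, 1]
sent = encode(msg)
sent[4] ^= 1                    # one bit gets flipped in transit
assert decode(sent) == msg      # ...and the decoder recovers anyway
```

Real codes (Turbo, LDPC, Reed-Solomon) achieve vastly better rates than tripling every bit, but the principle is the same: the extra bits tell the decoder how to rebuild the payload.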
Another, similar type of application is data compression, since you can treat the compressed data as a "program" that tells the decoder how to unpack the message. I think that's the most interesting possible application in this realm: given enough computing power, we should be able to try out many different ways of compressing some given data and output a compressed string and a decompressor. Obviously, this still isn't going to be able to magically compress incompressible data, and it's quite impractical as a replacement for general-purpose compression schemes like gzip, bzip2 and so on (since there is an infinite--or worse, transfinite--number of "languages" to consider, and the best compression ratio possible is sensitive to the choice of language), but it could still be quite useful for discovering good compression schemes for certain types of data. See Kolmogorov Complexity for background details.
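A degenerate version of that search is easy to sketch with the stock Python codecs (this is nothing like a real Kolmogorov-style search over all programs, just an illustration of "try several schemes, ship a tag that tells the decoder which one won"; all names here are mine):

```python
import bz2
import lzma
import zlib

# One-byte tag -> (compressor, decompressor). The tag is the crudest
# possible "decompressor description" shipped alongside the data.
SCHEMES = {
    b"z": (zlib.compress, zlib.decompress),
    b"b": (bz2.compress, bz2.decompress),
    b"x": (lzma.compress, lzma.decompress),
}

def compress_best(data: bytes) -> bytes:
    """Try every scheme and keep whichever yields the shortest output."""
    return min((tag + comp(data) for tag, (comp, _) in SCHEMES.items()),
               key=len)

def decompress(blob: bytes) -> bytes:
    """The leading tag byte tells us how to unpack the rest."""
    _, decomp = SCHEMES[blob[:1]]
    return decomp(blob[1:])

data = b"abracadabra " * 1000
packed = compress_best(data)
assert decompress(packed) == data
assert len(packed) < len(data)
```

With three fixed schemes the search space is trivial; the interesting (and intractable, in general) version is when the set of candidate "languages" is open-ended, which is exactly where the Kolmogorov-complexity caveats above bite.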