Happy Birthday, Turing's universal machine

It was just 71 years ago this month that a seminal paper by Alan Turing was published, one which helped pave the way to today's multi-billion-dollar IT industry and conferred on Turing the title of father of modern computer science. As is often the case in scientific endeavour, Turing was actually working on an entirely different …

COMMENTS

This topic is closed for new posts.
  1. Anonymous Coward
    Gates Horns

    The abelard.org link is IE-only

    Shame that.. MS just WISHES it was the Universal OS!

    On a more topical note, I'm halfway through reading "Alan Turing: The Enigma" - Andrew Hodges takes something interesting and makes it more boring than a VCR manual, and never lets you forget "this Turing guy, HE'S A FAG" to the point that it really gets in the way of the story.

  2. Name
    Stop

    Chronology

    > At the start of the twentieth century deep questions were being asked about science and the tools used to think about scientific problems. In the same way that Einstein's work had challenged orthodox views of physics, so Hilbert and others questioned mathematics.

    Hilbert's list of 23 questions came in 1900, a year before Einstein's first published work.

  3. Robert Ramsay
    Boffin

    <puts Goedel hat on>

    Technically, Goedel proved that mathematics could not be both complete *and* consistent. We went for the second one.

  4. Wayland Sothcott

    Computer Emulation

    Turing simplified the computer down to a one-bit machine that travels up and down miles of ticker tape reading and writing dots.

    He proved that any computer can emulate any other computer given enough time, enough memory and the correct software. All the I/O would just be memory, so you would have to imagine the VGA screen if the real computer did not have one.
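
    Roughly, that picture can be played with directly. Here's a minimal sketch of a Turing-machine interpreter in Python - the transition table, tape format and the little bit-flipping machine below are invented purely for illustration, not taken from Turing's paper:

        # A toy Turing machine: a dict-based transition table, a tape that
        # grows on demand, and a head that reads/writes one symbol per step.
        # The example machine flips every bit it sees, then halts.

        def run(transitions, tape, state="start", head=0, max_steps=1000):
            tape = dict(enumerate(tape))            # sparse tape, blank = "_"
            for _ in range(max_steps):
                if state == "halt":
                    break
                symbol = tape.get(head, "_")
                write, move, state = transitions[(state, symbol)]
                tape[head] = write
                head += 1 if move == "R" else -1
            return "".join(tape[i] for i in sorted(tape))

        # (state, read) -> (write, move, next_state)
        flip = {
            ("start", "0"): ("1", "R", "start"),
            ("start", "1"): ("0", "R", "start"),
            ("start", "_"): ("_", "R", "halt"),
        }

        print(run(flip, "10110"))   # prints 01001_

    A universal machine is then "just" one whose tape also holds the transition table of the machine it is imitating.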

    With the current generation of computers we are making practical use of this through virtualization.

    Perhaps the most startling thing is to see that a computer is just a simple binary logic machine with loads of memory and loads of software. No matter how smart the software the computer can only do stuff that binary logic can do. This means that artificial intelligence using binary computers is a huge waste of time no matter how big the computer is.

    AI needs a computer that does not resolve down to a simple Turing machine.

  5. Anonymous Coward
    Joke

    Wow

    The 71st anniversary!

    Does that mean we celebrate this event annually then? I seem to have missed the big 70th celebration.

  6. Anonymous Coward
    Gates Halo

    @AC 00:30

    Windows *IS* the universal OS. It is on servers, desktops, laptops, mobiles, embedded systems and even netbooks/SCCs. Linux is just a minor curiosity in the IT world (sub-2% penetration) and mostly used by amateur dabblers rather than true professionals.

    It's Drawinism at work - the best of breed succeeds.

  7. DavCrav
    Boffin

    Answered "consistent" negatively...

    This is slightly wrong: what Godel actually proved is that you can't ever *prove* that it's consistent. If set theory (the branch of mathematics in some sense underpinning all others) is inconsistent, then we should be able to prove that, by writing down a logical fallacy. (This has happened before to some attempts that people made.) However, if it is consistent, and there is no contradiction, you won't be able to prove that.

    I suppose that for the purposes of journalism 'negatively' is sufficient...

  8. Anonymous Coward
    Anonymous Coward

    @The abelard.org link is IE-only, AC

    I think I tried to read that once. Is it the book with about 30 references per page to notes and figures at the back of the book? I got really fed up with it before getting halfway through...

    I hope it is that one, as now I know which name to avoid!

  9. Alistair

    An inspiration to the rest of us lowly keyboard botherers

    So let that be a lesson to anyone who says something like "pure mathematics has nothing to offer the rest of us - it's just a bunch of f------ w----- t------ o-- into a conceptual hyper dimensional reality bubble". NURSE!

  10. Destroy All Monsters Silver badge
    Boffin

    Ho-hum

    I always had a problem with the idea that Turing provided a "blueprint" for computers. Even before him, general-purpose programmable state machines were conceived of and attempts to build them were made, as Babbage's unfinished Analytical Engine shows. What Turing did was propose a concrete definition of "Mechanical Procedure" and anchor it to mathematics. This is what Hilbert was asking about -- whether there was a "definite method" to prove any given theorem -- leaving "definite method" undefined and an exercise for the reader. Turing came in and laid pipe between "definite" and "mechanical", and the rest is history.

    The Turing Machine is an eminently mathematical idea and it pays off big immediately: it yields the theorem that there is a "Universal Turing Machine" which can (be programmed to) emulate any other; that most mathematical functions are not computable -- deciding in a finite number of steps whether any given Turing Machine will ever stop is impossible -- and that Hilbert's question can be answered in the negative because of that.

    Moreover, this mathematical idea is chained firmly to our physical world. Thus, in the same way that taking as given that motion is relative but not additive leads one to the conclusion that life is governed by an absolute maximum speed and that space and time are mixable quantities, taking as given that "mechanical procedures" are Turing Machines leads one to the conclusion that we will have to slum it on a small "easily computable functions island", as most functions/problems are unsolvable, while most solvable functions/problems are unreachable, costing exponentially many "mechanical steps" -- and life is short.
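
    The "will it ever stop" part is the famous halting problem, and the proof is short enough to sketch in code. Below is an illustrative Python rendering of the diagonal argument - the function names are made up, and halts() is a hypothetical oracle that the argument shows cannot exist:

        # Suppose some oracle halts(f, x) could always decide whether f(x)
        # eventually stops. (Hypothetical - the point is that it can't exist.)

        def halts(program, argument):
            raise NotImplementedError("no such total decider can exist")

        def troublemaker(program):
            # Do the opposite of whatever the oracle predicts for the
            # program when fed its own source as input.
            if halts(program, program):
                while True:          # oracle says "halts", so loop forever
                    pass
            return "done"            # oracle says "loops", so halt at once

        # Now ask what troublemaker(troublemaker) does: if the oracle says
        # it halts, it loops; if the oracle says it loops, it halts.
        # Either answer contradicts the oracle, so no such oracle exists.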

    It helps that the idea of the Turing Machine can be shown to be equivalent to other ideas about what a "definite method" might be - among them Alonzo Church's, wherein "effectively calculable functions" were understood to be the set of general recursive functions. No better (in the sense of "more powerful") idea of "definite method" has ever been found. Hence the Church/Turing thesis (NOT a theorem, as it is not provable -- Special Relativity is not a theorem either), which just says that "effectively calculable" is equivalent to "calculable by Turing Machine" and that there is nothing beyond. This rapidly leads to the idea that human minds are indeed equivalent to some Turing Machine etc. etc. etc.
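
    To give a flavour of how different the equivalent models look, here's a toy Python sketch of Church-style "numbers as functions" (Church numerals) - an illustration of the lambda-calculus side of the coin, not anything lifted from Church's or Turing's actual papers:

        # Numbers and addition encoded purely as functions, then decoded
        # back into ordinary ints. Two very different-looking notions of
        # "calculable" computing the same thing.

        zero = lambda f: lambda x: x
        succ = lambda n: lambda f: lambda x: f(n(f)(x))
        add  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

        def to_int(n):                    # decode a Church numeral
            return n(lambda k: k + 1)(0)

        two   = succ(succ(zero))
        three = succ(two)
        print(to_int(add(two)(three)))    # prints 5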

    "Your learning is simply amazing Sir Bedivere..."

  11. DavCrav

    @Alistair

    Speak for yourself: some of us are pure mathematicians!

  12. SpeakerToAliens
    Coat

    @@AC 00:30

    > Windows IS the universal OS.

    Windows isn't universal. You mentioned Linux but you forgot many others, Mac OS for instance. Isn't that around 5% penetration? Plus, there are several versions of Windows. What's the %age penetration of Vista, Win98, NT, 2000, XP, 3.1, 3.0? Shall I power up my Atari ST or my TRS-80? How about my ZX81? None of those run Windows (OK, the TRS-80 runs MS Basic, but it's still *not* Windows).

    > It's Darwinism at work - the best of breed succeeds.

    So you're equating Windows to bacteria and viruses then. They've succeeded better than anything.

    Mine's the one with the roll of punched paper tape in the pocket.

  13. Kevin Whitefoot
    Boffin

    @Anonymous Coward Posted Tuesday 18th November 2008 09:54 GMT

    Where did you get the idea that evolution produces the best of breed?

    Darwinian evolution produces the breed most likely to survive in the conditions in which it finds itself. It also requires that there be a continuous series of steps from one type to another (because large mutations are almost invariably deleterious). See any of Dawkins' books, or indeed 'The Origin'.

    Evolution produced the human eye. The human eye has the light sensitive parts behind other essential parts. There are other animals (octopus?) that have the light sensitive parts in the 'right' place. Which is best of breed?

    So exactly how does Darwinian evolution apply to operating system design?

    Also I think your numbers are suspect, references?

  14. Anonymous Coward
    Coat

    @Wayland Sothcott

    > This means that artificial intelligence using binary computers is a huge waste of time no matter how big the computer is.

    I suspect that you're assuming (1) that whatever underpins our physical reality has infinite metric resolution, and (2) that even if it does, 'close will never be good enough'.

    Add more bits to a representation of some quantity, and you give that representation more precision. Is precision the only lynchpin of your argument?

    > No matter how smart the software the computer can only do stuff that binary logic can do.

    Tell me what those limits are...

    I suspect this is a troll post.

  15. Anonymous Coward
    Linux

    @Anonymous Coward Posted Tuesday 18th November 2008 09:54 GMT

    Troll

  16. Eddie Edwards
    Flame

    Spiritualism in computing?

    "No matter how smart the software the computer can only do stuff that binary logic can do. This means that artificial intelegence using binary computers is a huge waste of time no matter how big the computer is. AI needs a computer that does not resolve down to a simple Turing machine."

    Ah yes, the old "you can't have AI without a soul" argument, favoured by Weizenbaum and others; the last bastion of received religious wisdom in mathematics.

    Actually your brain isn't even as powerful as a Turing machine. It only has a finite tape.

  17. Tom Chiverton
    Boffin

    More than binary

    "No matter how smart the software the computer can only do stuff that binary logic can do. "

    Err, no? It's perfectly possible to have it emulate a non-binary system such as trinary, for instance, or analogue (just ask Reason :-) )
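
    For what it's worth, the trinary case is a one-screen exercise. A rough Python sketch (the representation and function names are just made up for illustration): base-3 digits held in ordinary binary-backed ints, with ternary addition done digit by digit:

        def to_trits(n):                       # least significant trit first
            trits = []
            while n:
                n, r = divmod(n, 3)
                trits.append(r)
            return trits or [0]

        def add_trits(a, b):
            out, carry = [], 0
            for i in range(max(len(a), len(b))):
                s = (a[i] if i < len(a) else 0) + (b[i] if i < len(b) else 0) + carry
                carry, digit = divmod(s, 3)
                out.append(digit)
            if carry:
                out.append(carry)
            return out

        print(add_trits(to_trits(17), to_trits(5)))   # 22 = 211 in base 3, so [1, 1, 2]

    Analogue emulation comes down to the same trick, with "enough bits of precision" standing in for a different radix.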

  18. Tim Ardizzone

    Gödel's Theorem

    We have to assume that mathematics IS consistent but not complete. Otherwise every mathematical statement would be provable (such as 1 = 0).
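
    That "everything would follow" step is the principle of explosion, which is small enough to state formally. A minimal Lean sketch, offered only to illustrate the step (P and Q here are arbitrary placeholders):

        -- From a contradiction, any statement follows: if P and ¬P were both
        -- provable, then any Q (1 = 0 included) would be provable too.
        example (P Q : Prop) (hp : P) (hnp : ¬P) : Q :=
          absurd hp hnp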

  19. Anonymous Coward
    Anonymous Coward

    Drawinism?

    What exactly are you drawing in?

  20. Frumious Bandersnatch
    Flame

    @Answered "consistent" negatively...

    You're slightly wrong too... briefly, you can't prove your formalism (i.e., "maths") both complete *and* consistent. It is quite simple to prove some formalism complete or consistent (or to design it for one or the other), but Godel *proved* you can't have both. One "horn" will always be able to poke holes in the other one.

  21. Anonymous Coward
    Anonymous Coward

    @@AC 00:30

    > Windows *IS* the universal OS FOR VIRUSES AND MALWARE.

    FTFY

  22. jon

    "true professionals" use unix - without which internet would not exist...

    "true professionals" use unix - without which internet would not exist. ( truncated 'title' entry??)

    http://www.abelard.org/turpap2/tp2-ie.asp looks ok in Opera 6.2...

  23. Anonymous Coward
    Anonymous Coward

    @DavCrav

    > Speak for yourself: some of us are pure mathematicians!

    By which I assume that you're calling yourself a "pure mathematician." Must be a bit galling for you to make such a stupid mistake in describing Gödel's Incompleteness Theorem, then.

    To paraphrase an old joke: what do you call a guy who graduates last in his mathematics course? A mathematician.

  24. J
    Coat

    @AC MS fanboy troll

    "It's Drawinism at work - the best of breed succeeds."

    Drawinism? Is that a competition between, say, Paint and The GIMP, or something?

  25. Charlie van Becelaere

    Pure Mathematics and Binary Logic

    My strength is as 1010, for my mathematics are pure binary!

  26. Anonymous Coward
    Anonymous Coward

    @Windows? It's Drawinism at work - ...

    the best of breed succeeds.

    HAHAHHAHAHAHAHHAHAHHAA

    Sorry, not meaning to offend, but that's funny.

  27. DavCrav

    @people saying I got Godel's Incompleteness Theorem wrong

    OK, so I was assuming that the axiom system for which you have your model is complicated enough to embed arithmetic, because otherwise why bother?

    And I am a pure mathematician, just not someone studying the foundations of maths, because they are silly and we laugh at them. They do occasionally produce interesting results, and I have had to use model theory every now and then.

    In conclusion, I could fish out my set theory notes from my bookcase and write down the theorem, but I was offering a more accurate version than the article's. We can keep making subtle changes, as I didn't describe what 'embed arithmetic' is, so someone will blah blah blah bored. I'll go and do some actual maths now.

  28. Anonymous Coward
    Anonymous Coward

    AI and Windows

    On AI - go off and read a series of tomes on emergent behavior. I've no idea how the research has moved on since the early 90s, but even then loosely 'intelligent' behaviors could be seen to emerge from simple sets of rules running on binary machines.

    I believe you're mistaking intelligence for (1) 'free will' and (2) understanding human natural language - which is stymied by the fact that we don't fully understand how either works, and by the intractable association of language with knowledge of the world.

    On Windows - if you're not simply a troll (and the irrelevance to subject says yes) you have no idea about the IT world. Windows does not dominate in the server space, and certainly not in the enterprise server and web apps space - i.e. big boys IT.

  29. J.Wild
    Coat

    @Tim Ardizzone

    Actually 2+2=5 and not 4 because some(one) has to be doing the counting.

    It's quite possible that consciousness is Universal, therefore without it nothing would exist.

    Mine's the one with The Cosmic Doctrine in the pocket!

  30. Nanki Poo
    Linux

    @AC - Darwinism

    Sorry, I am by no means an OS expert, but "on servers, desktops, laptops, mobiles, embedded systems and even netbooks/SCCs" there is only 5% Linux penetration..?

    IIRC Linux is of the same lineage as Unix, just as Fista is of the same lineage as Win 1.0.

    Ah, let's save the time in going into detail... basically you're talking a load of big hairy dangly ones.

    Ahh, coot, look... a pinguin!

    nK

  31. Tony Hoyle

    Is it still true?

    At school I was taught that a Turing machine could do anything a modern computer could do... but is it still true? It seems to me to work only for a simple procedural computer. Multiprocessor architectures, event-driven programming... even interrupts don't seem to work for a machine based on a ticker tape. I can get the idea of a VGA screen being imagined (although the Turing machines we were taught about had no memory, they were just decision trees), but how then to handle the user doing something as complex as dragging a window if they can't see it?

    OTOH I'm prepared to believe that what I was taught at school as a Turing machine was not in fact the whole story, and that he had already thought about all this and come up with reasonable answers.

  32. Anonymous Coward
    Paris Hilton

    yes, it is still true

    Yes. It is still all ones and zeroes. A hell of a lot of them to be sure, so yes, the model still applies, no matter how fine your GUI. And a TM can have states that are connected to every other state and branched to on particular transition conditions ... so your events are there as well ... and your massively parallel supercomputer.
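
    The usual trick for the "parallel" part is interleaving: one sequential machine takes a step of each task in turn. A toy Python sketch of the idea (the task and function names are invented for illustration):

        # Two independent tasks written as generators, interleaved one step
        # at a time by a plain round-robin loop on a single sequential machine.

        def counter(name, limit):
            for i in range(limit):
                yield f"{name} step {i}"

        def run_round_robin(tasks):
            tasks = list(tasks)
            while tasks:
                task = tasks.pop(0)
                try:
                    print(next(task))
                    tasks.append(task)        # not finished: back of the queue
                except StopIteration:
                    pass                      # finished: drop it

        run_round_robin([counter("cpu0", 2), counter("cpu1", 3)])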

    The interesting question is whether a quantum computer is still a Turing machine of the same kind ... the jury still seems to be out on that one.

This topic is closed for new posts.