Acorn founder: SIXTH WAVE of tech will wash away Apple, Intel

Acorn co-founder Hermann Hauser has claimed the world is entering a new "sixth wave" of computing, driven by the arrival of omnipresent computers and machine learning. Speaking at a Software East event this week, the celebrated computer whiz said we are entering an era where computers are everywhere and often undetectable - …

COMMENTS

This topic is closed for new posts.
  1. Barry Dingle

    Does their ubiquity make us stronger or more feeble

    I tend towards the former, unless the power is out.

    1. Erwin Hofmann
      Alert

      Re: Does their ubiquity make us stronger or more feeble

      ... who cares, the last "Wave" will be the Matrix ...

      1. The Man Who Fell To Earth Silver badge
        FAIL

        Re: Does their ubiquity make us stronger or more feeble

        No, the last wave (for us) will be when machines see no value in humans' continued existence.

      2. Euripides Pants Silver badge
        Terminator

        Re: the last "Wave" will be the Matrix

        "Will be"?

        Someone hasn't taken the red pill....

  2. Dazed and Confused

    Oh S*&t!

    It becomes like your pal – and let’s just assume it’s a nice pal

    Oh no, not "Your plastic Pal who's fun to be with"

    We all know who'll be the first against the wall when the revolution comes.

    1. Michael H.F. Wilkinson Silver badge
      Alien

      Re: Oh S*&t!

      More correctly:

      "We'll tell you: 'Go stick your head in a pig'"

      As sung by a choir of robots with their voice boxes exactly one flattened fifth out of tune.

      And remember:

      Don't Panic!

      No large, friendly letters available, I'm afraid.

      1. Naughtyhorse

        Re: Oh S*&t! - point of order

        2 flattened fifths is an octave, and therefore back in tune.

        So having more than 2 robots in the choir is pointless.
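For anyone who wants to check the arithmetic: in equal temperament a flattened fifth (a tritone) is 6 of the octave's 12 semitones, so two of them stack to exactly one octave. A minimal Python sketch (assuming equal temperament; the names are illustrative):

```python
# A flattened fifth (tritone) spans 6 semitones; in equal temperament each
# semitone multiplies the frequency by 2 ** (1 / 12).
TRITONE = 2 ** (6 / 12)  # ~1.41421, the square root of 2

# Stacking two flattened fifths doubles the frequency: an octave, back in tune.
two_tritones = TRITONE * TRITONE

assert abs(two_tritones - 2.0) < 1e-9  # an octave is a 2:1 frequency ratio
```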

        1. Silverburn
          Boffin

          Re: Oh S*&t! - point of order

          What would have been better is if all the robots were kept in sync over TCP/IP, so that all of them would be in tune but none of them actually in time.

          Use a BT Home Hub and, frankly, the network would be so crap that some of them would be on a different verse of the song.

      2. Anonymous Coward
        Anonymous Coward

        Re: Oh S*&t!

        The Cake is a LIE!

    2. Anonymous Coward
      Anonymous Coward

      Re: Oh S*&t!

      I have a copy of Wikipedia that's fallen through a time hole and describes them as "A bunch of moronic jerks, who were the first against the wall, when the revolution came."

      1. David Given

        Re: Oh S*&t!

        [citation needed]

        1. Michael H.F. Wilkinson Silver badge

          Re: Oh S*&t!

          Share and enjoy!!

  3. Neil Barnes Silver badge

    and let’s just assume it’s a nice pal

    Let's just damn well make sure it's a nice pal...

    I'm really unconvinced by this concept of the internet of things. It seems to be slapping connectivity and monitoring on things that have no need - except in the eyes of the marketeers - for either...

    1. boltar Silver badge
      FAIL

      Re: and let’s just assume it’s a nice pal

      "I'm really unconvinced by this concept of the internet of things. It seems to be slapping connectivity and monitoring on things that have no need - except in the eyes of the marketeers - for either..."

      That's exactly what it is. They've been trying to flog us the automated home, for example, since the 60s, but you know what? People are quite capable of getting off their arses and closing the curtains themselves, or looking in the fridge and seeing that the milk is running out. They don't need some overpriced, unreliable bit of tech to do every simple little thing in life for them.

      Of course the marketeers would love us to be sitting on the sofa zombified while machines do everything, so we just sit and watch even more TV or online video and suck up even more of their ads for all the other crap they're trying to sell us that we don't need either.

    2. Anonymous Coward
      Anonymous Coward

      Re: and let’s just assume it’s a nice pal

      The marketeers don't give a gnat's fart for us or the products they push, they get paid to make sure one meets the other and they both go home together!

  4. Sil

    Inevitable

    Not to lessen Jobs's brilliance, but the smartphone revolution began before him - he did accelerate it - and it was inevitable that smartphones would overtake computers, as most people need a phone whereas only some of them need a computer. This is especially true in developing countries and/or low-earning families, where a trade-off has to be made between a phone and a computer.

    Also, machine learning and even self-driving cars were certainly not invented by Google, and voice recognition has been working well for many, many years. It's just that computing power and computing power per watt are at a point where many projects can really take off.

    Those theories and predictions really aren't very interesting. Wintel has been pronounced dead for 20 years; now it's Apple's turn, and tomorrow it will be Google & ARM. Is any of it true, and does it help envision new waves? I'm not so sure.

    1. qwarty

      Re: Inevitable

      Odd this anthropomorphism business still runs on, thousands of years after ancient gods were invented. Lots of people were doing things with PCs before Microsoft won that 1980s race. Likewise smartphones before Apple, tablets, etc. Crediting Bill Gates or Steve Jobs or even their companies with these concepts, rather than noting their commercial success in exploiting the ideas, is about as daft as it gets.

      As you point out Sil, its primary driver is computing power/watt.

      1. Dave 126 Silver badge

        Re: Inevitable

        >As you point out Sil, its primary driver is computing power/watt.

        That, and wireless connectivity - be it the now more common Wi-Fi or sensibly priced data plans.

        Looking forwards, small wireless connected devices such as sensors might be frugal enough to harvest energy from their surroundings, and cheap enough to be almost disposable (or at least deployed redundantly).

        Making good use of all this easily collected data might be more challenging, though.

      2. h3

        Re: Inevitable

        And the Ivy Bridge-based Atom will absolutely crucify anything ARM has or will have.

    2. h3

      Re: Inevitable

      Most people don't need a phone. (Other than people doing certain jobs).

      Most people think they need a phone because they have been conditioned to believe that the two things are not the same.

      (People also seem to think ARM is so great but it is only because Intel is not really even trying yet.)

      The Medfield Intel Platform can run ARM code at a pretty decent speed. The opposite as far as I know is not possible.

      Repeat bullshit enough and people will believe it regardless of whether it is true or not.

      (There are great advantages to both MIPS and PPC over ARM, but ARM is fashionable, so people don't look at the facts, only the wrong things.)

      1. Vic

        Re: Inevitable

        > There is great advantages to both mips and ppc over ARM

        All you need do is get packaged parts out for 50c or less, and you stand a good chance of taking back the market ARM has.

        Architecturally, ARM might be "interesting", but it is pretty good, very cheap, and performs well at low power. And that's pretty much a recipe for domination of the mobile consumer kit market...

        Vic.

  5. jake Silver badge

    Question.

    When was the last time Hermann Hauser actually contributed to anything relevant? I'm thinking 1978ish.

    1. druck Silver badge
      Flame

      Re: Question.

      Jake wrote:

      When was the last time Hermann Hauser actually contributed to anything relevant? I'm thinking 1978ish.

      One could ask the same of the large number of bitter retorts that make up your posting history. You might spend your time more productively looking at the copious information on HH's recent work on the internet.

      1. jake Silver badge

        @druck (was: Re: Question.)

        Projection is an ugly thing ... See the post you made prior to the one I'm replying to:

        http://forums.theregister.co.uk/forum/containing/1825484

        Have a nice day :-)

        1. James Hughes 1

          Re: @druck (was: Question.) @Jake

          @Jake. Try looking stuff up. Then try and achieve what HH has done (and is still doing). Amadeus would be a good starting point.

          1. Anonymous Coward
            Anonymous Coward

            Re: @druck (was: Question.) @Jake

            I bet Jake has more massive yachts and cars, and a bigger ranch, than HH.

      2. CheesyTheClown
        Meh

        Re: Question.

        I agree, he was a bit trollish, but HH has been far more active in things like genetic research during most of that period than in Internet matters. This isn't to discredit HH, and frankly his predictions are probably about as sane as anyone else's... well, maybe not John C. Dvorak, who has successfully predicted the exact opposite of everything in the industry for nearly 30 years.

        But to be honest... there are some issues here. For example, you can't help but feel that, as what could be considered one of the fathers of ARM, he might be a tad biased. Let's not ignore that all of his computer companies did get their asses whipped by companies like Apple, Intel and Microsoft in the long run. ARM is really his only computing legacy that I could Google which has survived, and impressively so. So, discounting all the places where his ventures fell on their rears, he did an amazing job in the case of ARM.

        I can't help but personally dislike ARM, and it comes from trying to write compilers and assemblers for the platform. I actually found it to be the only platform ever made I considered less elegant than PIC. It was aggravating as hell, and I wished they could just pick a damn instruction set and stick to it. That said, if Intel loses its crown, I sure as hell hope it's not to ARM but instead to a company which actually cares about developers and isn't as hackish as they are.

        For a sixth generation of computers, I really hope that someone creates something new. I felt a great deal of hope for XMOS for a while, but they're pretty much stagnated into boring crap now.

        1. heyrick Silver badge

          Re: Question.

          I'm an ARM coder, and I like the platform. Trying to wash the vomit that is x86 from my mind. Probably doesn't help that I learned a little bit of x86 in the days of segmented memory. ARM was like a breath of fresh air in comparison. Once you understand how it works, it is pleasant, but the whole design is different to things like the x86, so you need to code in a way best suited to it...

  6. Craig 28

    Lets be honest

    We've been promised omnipresent computing is just around the corner for such a long time, in popular culture at least as far back as the computer in Star Trek TNG being available to answer every crewmember's slightest whim. We probably will have the capability to achieve this but is the consumer base really interested in it? There will always be gadget lovers who are willing to pay huge amounts of cash for a flash in the pan, like the VR goggles in the mid to late 90s, but chances are it just won't gain traction in the wider market.

    Of course I could be wrong and this really could be the next big thing, but think how long it was after the first efforts at PDA/phone hybrids that smartphones really gained any noticeable market share. People just won't know what they're supposed to do with truly ubiquitous computing, they're perfectly happy pulling it out of their pocket when they want it.

  7. Andrew_b65
    WTF?

    Google car?

    So is this a manifestation of the third or sixth wave? You state the third in your article, but the car's learning abilities imply the sixth.

    1. Rampant Spaniel

      I foresee quite a few MS employees brake-testing the Google cars if Google starts making boasts about the number of miles without an accident!

      1. Zot

        But how does the car fare...

        ...in a chaotic Indian city like Mumbai? Does it just sit there unable to cope with all the near misses around it?

  8. Mystic Megabyte Silver badge
    Terminator

    My chips are black, or modern slavery

    “The whole point about machine learning is that computers observe and adapt themselves to what we want and a computer, with a whole host of sensors, really becomes part of your environment. It becomes like your pal – and let’s just assume it’s a nice pal.

    The only problem with this is that as soon as a robot becomes self aware it will have human rights.

    Quite likely it will just wander off to do its own thing.

    So what do you do then, chain it to the production line?

    On a more frivolous level, I can envisage drones using a hollowed out volcano as a nesting place and handy source of energy.

    1. Anonymous Coward
      Anonymous Coward

      Re: My chips are black, or modern slavery

      "The only problem with this is that as soon as a robot becomes self aware it will have human rights."

      We're about 500 years from that happening. AI has made no progress on anything like self-awareness and probably won't until we understand it ourselves.

    2. Don Jefe

      Re: My chips are black, or modern slavery

      Why would they have Human rights?

      1. TheOtherHobbes

        Re: My chips are black, or modern slavery

        Because the humans won't be using them, so they can pick up the spares cheap.

    3. Ru

      Re: My chips are black, or modern slavery

      The only problem with this is that as soon as a robot becomes self aware it will have human rights.

      This implies that the only form of sentience is one with the same structure and desires as a human; a rather anthropocentric view. Human drives and desires would only apply to an AI which has been designed to have such things.

    4. Anonymous Coward
      Anonymous Coward

      Re: My chips are black, or modern slavery

      The self-awareness thing misses the way around it. You see, human rights are the right to have the things humans want: a roof over your head, food, warmth, decent health...

      When an AI starts wanting things we'll have made sure it wants the things we want it to want. It'll be the doors in Hitchhiker's Guide made real. They just want to open and close for you. Then once in a while an AI will become overly obsessed and your toaster will start complaining that you don't want toast any more. But they won't want human stuff.

    5. sabroni Silver badge

      Re: as soon as a robot becomes self aware

      Cool, science has proved that consciousness is just a really fast processor and some smart software. When did that happen?

  9. Anonymous Coward
    Happy

    “ARM sold 9 billion units in 2012"

    Being picky, but ARM sold F.A. units; they are an I.P. company, not a manufacturer.

    1. DaLo
      Headmaster

      Re: “ARM sold 9 billion units in 2012"

      It says units, not devices; a unit can be anything, such as a licence for a device. An electricity company that doesn't have a power station can still sell a unit of electricity.

      In this case I would suggest that ARM has collected royalties for 9 billion devices and therefore the unit in question is a license.

  10. This post has been deleted by a moderator

    1. Charlie Clark Silver badge
      Headmaster

      Re: Eadon's theory of Techie "Waves" - TWO types

      Congratulations on the number of spelling mistakes.

      1. This post has been deleted by its author

      2. Simon Harris Silver badge

        Re: Eadon's theory of Techie "Waves" - TWO types

        I'm Not Waving But Drowning in Eadon's list of waves!

      3. Frankee Llonnygog

        Re: Eadon's theory of Techie "Waves" - TWO types

        I see no spelling mistakes, just some obscure references. For example, "Silicon Transister", refers to a member of the Holy Order of the Semi-conductress, the little known Palo Alto-based group of transexual nuns who toil away at the Ab Fab Chip Fab Lab.

        1. Charlie Clark Silver badge

          Re: Eadon's theory of Techie "Waves" - TWO types

          I see no spelling mistakes… I must have been reading it wrong: the "minis" must refer to Austin Rover's car, certainly a milestone in the development of computing. And who could deny the importance of John Newman's "Feel The Love" to generations of programmers?

          Eadon is just a misunderstood genius.

    2. Don Jefe

      Re: Eadon's theory of Techie "Waves" - TWO types

      Punch cards lasted in widespread academic use until the 1970s; quite a while after the Victorians had stopped making their gaudy contributions to the world.

      1. This post has been deleted by a moderator

        1. Simon Harris Silver badge

          Re: Eadon's theory of Techie "Waves" - TWO types

          "To clarify I meant the analogue computing punch cards that existed before digital computers, e.g. Jacquard loom cards."

          Jacquard loom punched cards are as digital as any other punched card. As I understand it, each hole on a Jacquard loom card corresponds to up or down on a hook that carries the warp thread. The pattern of holes thus controls whether the weft lies above or below the warp to create the pattern in the weave.

          Up or down - which I count as 2 states. There is nothing analogue about a Jacquard punched card!

          The only punched cards I can think of that may not be so definitely digital are those used in the 2000 US Presidential Election. The 'hanging chads' on cards in Florida seemed to result in a quantum uncertainty whereby the cards were simultaneously punched and not-punched depending on who you asked.
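The up-or-down argument above is easy to make concrete. A minimal Python sketch (the encoding is purely illustrative: one list element per hook, hole = warp thread raised):

```python
# Purely illustrative: a Jacquard card row as a pattern of holes (1) and blanks (0).
# Each hole raises one warp hook, so the row is just a binary word - nothing analogue.
card_row = [1, 0, 0, 1, 1, 0, 1, 0]

# Translate hole/no-hole into the hook's two possible states, one per warp thread.
hook_states = ["up" if hole else "down" for hole in card_row]

# Every hook ends up in exactly one of two states - a digital encoding.
assert set(hook_states) == {"up", "down"}
print(hook_states)
```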

          1. This post has been deleted by a moderator

            1. Simon Harris Silver badge

              Re: Eadon's theory of Techie "Waves" - TWO types

              @Eadon "You are confusing a binary (2 states) computer with a digital computer, they're two distinct concepts. Jaquard looms were not digital computers."

              In your 11:08 posting, you were referring to analogue computing with the Jacquard loom being an example. The Jacquard loom is not analogue, as I and a fair few others have pointed out. Nor is it a computer, any more than a musical box or a pianola, both of which use a digitally stored pattern to control music (pins on a barrel or holes in a paper roll). None of these (including the loom) does any computation - they just translate the holes or pins on a 1-to-1 basis to hooks or musical strikers.

              If you're going to be picky, I'll call you out as wrong on two points - both the analogue reference and the computer reference.

              The quantum uncertainty reference may not be entirely accurate. 'Hanging chads' got to be something of a catchphrase from the 2000 election, and it was worded (as you say) as an 'amusing' pseudo-quantum-physical reference to what was (as you might have said) a...

              PUNCHED CARD FAIL

              In fact, to be accurate there were many possible recount scenarios in the Florida ballot that could have swung the result one way or the other.

              1. This post has been deleted by a moderator

                1. TeeCee Gold badge
                  Meh

                  Re: Eadon's theory of Techie "Waves" - TWO types

                  Keep digging, we can still see your head.

                2. Simon Harris Silver badge

                  Re: Eadon's theory of Techie "Waves" - TWO types

                  @Eadon - you seem to have got stuck in a loop here...

                  The phrases 'not entirely accurate' and 'pseudo-quantum physical' may have given you a clue that I had acknowledged I was using quantum mechanics terms more in an attempt at humorous observation than to pass a physics exam, yet you repeat the correction. That would be equivalent to me castigating you all over again for conflating Jacquard looms with analogue computers after you admit your mistake.

                  EADON REPETITION FAIL!

                  1. This post has been deleted by a moderator

                    1. Frankee Llonnygog

                      Re: Eadon's theory of Techie "Waves" - TWO types

                      @Eadon - by definition, there is no such thing as a QUANTUM UNCERTAINTY FAIL.

                      And yet, at the same time, there is.

              2. athame
                Pirate

                Re: Eadon's theory of Techie "Waves" - TWO types

                I think that the year 2K "hanging chad" election was definitely a quantum event since the outcome was influenced by the act of observation.

            2. jonathanb Silver badge

              Re: Eadon's theory of Techie "Waves" - TWO types

              Jacquard looms probably weren't computers, but it sounds to me like they were digital.

              1. oolor

                Re: Eadon's theory of Techie "Waves" - TWO types

                Funny, first thing that hit my mind when I read the list was James Burke's voice.

                1. TeeCee Gold badge
                  Meh

                  Re: Eadon's theory of Techie "Waves" - TWO types

                  Could have been any berk's voice really.

                  1. Simon Harris Silver badge
                    Happy

                    Jacquard Looms.

                    It just occurred to me: if Jacquard punched cards are used to set the pattern in fabric, then any clothes created from this fabric could be called Jacquard Punched Garments.

                    The data on the cards can then be considered the first instance of a JPG image file.

                  2. oolor
                    Pint

                    Re: Eadon's theory of Techie "Waves" - TWO types

                    @ TeeCee:

                    Well, I just learned some new slang, but I was trying to make a sly reference to the Connections TV show, since one episode has the Jacquard loom as part of the development chain of the computer, and it was the first of many possible omissions that occurred to me.

                    < this round of internet beer is on me

          2. Lamb0
            Black Helicopters

            Re: Eadon's theory of Techie "Waves" - TWO types

            "The only punched cards I can think of that may not be so definitely digital are those used in the 2000 US Presidential Election. The 'hanging chads' on cards in Florida seemed to result in a quantum uncertainty whereby the cards were simultaneously punched and not-punched depending on who you asked."

            Not quantum uncertainty; merely the legalistically politicized version of "fuzzy logic"! ;<)

        2. Naughtyhorse

          Re: Eadon's theory of Techie "Waves" - TWO types

          oaf alert

          jacquard looms were digital devices! you goon!

          either there is an 'ole, or there aint an 'ole - 2 states. binary

        3. Anonymous Coward
          Anonymous Coward

          Re: Eadon's theory of Techie "Waves" - TWO types

          Oh dear... Jacquard looms were digital, the clue is in the punch cards...

        4. Uffish

          Re: Eadon's theory of Jaquard looms

          Can you tell me how your analogue computing punch cards would work in a Jacquard loom? Normally they are binary (up/down).

    3. Anonymous Coward
      Anonymous Coward

      Re: Eadon's theory of Techie "Waves" - TWO types

      Off the top of my head, you missed: punch-card tabulators, electro-mechanical computing devices, the abacus, analogue electronic computers, PDAs.

      Go to the Science Museum, they've got some pretty good computing stuff. There is also the national computer museum.

      The second set just made my brane melt. You group Jacquard looms, Turing and all networking. There is a good In Our Time on logic you may want to listen to (Radio 4's website, in either the science or philosophy section of the In Our Time archive); it has a very good grounding from Aristotelian logic through Turing to modern AI.

      1. This post has been deleted by a moderator

        1. Justin Stringfellow
          Stop

          Re: Eadon's theory of Techie "Waves" - TWO types

          > Furthermore, the abacus does not qualify as you have to control the beads with your own digits.

          Are you sure? The PC sat in front of me doesn't do anything unless I use my fingers on the keyboard. How's that different?

          1. This post has been deleted by a moderator

      2. Anonymous Coward
        Anonymous Coward

        Re: Eadon's theory of Techie "Waves" - TWO types

        Fingers. You forgot fingers. Oh! And toes too!

    4. Naughtyhorse

      Re: Eadon's theory of Techie "Waves" - TWO types

      man those non M$ spell checkers really suck eedun :-)

      Von Neumann, perchance (also needs to be after valves).

      Brattain, Shockley and... the other one invented the transistor. Whether it was silicon, germanium or gallium arsenide don't make much difference; hell, even whether it was a FET or a BJT is too fine a distinction for what you are trying to do.

    5. Amorous Cowherder
      Thumb Up

      Re: Eadon's theory of Techie "Waves" - TWO types

      You're nothing if not consistent, mate!

      Zero positive votes and all negatives again!

      1. This post has been deleted by a moderator

        1. Silverburn

          Re: Eadon's theory of Techie "Waves" - TWO types

          And yet I am right.

          ...just like Robert Metcalfe (3COM) was.

        2. Anonymous Coward
          Anonymous Coward

          I am right, you are wrong...

          You are Edward de Bono and I claim my hundred pounds.

      2. Silverburn

        Re: Eadon's theory of Techie "Waves" - TWO types

        At least he managed to avoid slating Microsoft and using FAIL in the last sentence. Maybe he'll be able to graduate to secondary school in September after all...

    6. Dazed and Confused

      Re: Eadon's theory of Techie "Waves" - TWO types

      You seem to have forgotten one massively important person and step. You missed out the work of Tommy Flowers.

      1. Frankee Llonnygog

        Re: Eadon's theory of Techie "Waves" - TWO types

        Any positive reference to Tommy Flowers gets an automatic upvote from me. All you need to know about politicians and IT can be gleaned from the fact that they knighted Alan Sugar but not Tommy Flowers.

        Also, The Tommy Flowers would be a great name for a pub or a rock band

        1. Dazed and Confused
          Thumb Up

          Re: All you need to know about politicians and IT

          All you need to know about politicians and IT can be gleaned from the fact that they knighted Alan Sugar but not Tommy Flowers.

          Why oh why oh why can't I up vote this one a million times!

          This comment should be added to every story where the politicos open their mouths and prove their ignorance to the world.

  11. Charlie Clark Silver badge
    Black Helicopters

    The Violent Unknown Event

    What will be the other consequences of the event? What mutations will people suffer? What new languages will people be speaking? Fortunately, it seems that a documentary film detailing some of the consequences fell back through time to 1980.

  12. qwarty

    'the incumbent always misses the next wave'

    That would be ARM then by his logic. Not entirely convinced.

  13. jai

    sourgrapes

    Perhaps it's that Acorn aren't anywhere in today's computer wars?

    1. Justin Stringfellow
      FAIL

      Re: sourgrapes

      er, ARM?

      ... which originally stood for "Acorn RISC Machine".

    2. Anonymous Coward
      Anonymous Coward

      Re: sourgrapes

      Yes, the slight matter that they could never get into the American computer market with the Archimedes/RISC OS system. If I recall correctly, this is because Apple basically said that they'd sue Acorn out of business should they try.

      1. Justin Stringfellow

        Re: sourgrapes

        Yep, this'd be the same Apple that ran an advert in the early '90s claiming they had the first RISC home computer, ignoring the fact that the Archimedes debuted in 1987. And the same Apple that stole the icon bar from RISC OS, etc. etc. Same old, same old.

        1. Frankee Llonnygog

          Re: sourgrapes

          Is that the same Apple that was a joint founder of ARM? You know, the ARM that started as a joint venture between Acorn Computers, Apple Computer and VLSI Technology.

        2. heyrick Silver badge

          Re: sourgrapes

          Same old same old.

          I believe, when Apple is involved, this is called innovation.

    3. Ru

      Re: sourgrapes

      perhaps that Acorn aren't anywhere in today's computer wars?

      He's probably comforted by the hundred million quid he's made since the Acorn days, and the hundred-million- and billion-dollar companies he's set up or invested in since then.

  14. Ged T
    Thumb Up

    Love the line...

    "Dr Hauser has a list of achievements and accolades so lengthy, they almost deserve to be classified as big data"

    Nice!

  15. Tom 7 Silver badge

    "the incumbent always misses the next wave"

    not if they can get enough of the market to stop it happening altogether.

    1. Graham Dawson

      Re: "the incumbent always misses the next wave"

      Canals had 100% of the bulk transportation market for a very long time. Didn't stop the railways eating their lunch. And the railways had almost the entire long-distance and bulk transport market overland for about a century in Europe and the US, but that didn't stop the car and the aeroplane eating *their* lunch.

      When a clearly superior technology arrives (and we're not talking about competing similar technologies like VHS or Beta, which were essentially the same thing - this is video tape vs DVD), it will eventually dominate even when an incumbent uses force (either directly or via influence over the state) to try and prevent it.

      1. Dave 126 Silver badge

        Re: "the incumbent always misses the next wave"

        The canal owners had everything invested in assets - canals. The Japanese stole a march on transistor radios because the Americans had too much invested in manufacturing valves.

        Apple don't have much invested in manufacturing hardware - and the value of offering services such as iTunes or their App Store isn't lost on them. That their hardware is profitable for them is a nice bonus, but the physical devices are just a way of using their services. Google and ARM likewise - nothing invested in manufacturing hardware.

        1. Nigel 11

          Re: "the incumbent always misses the next wave"

          Have you never noticed that the railways tend to follow the canal routes?

          This is not accidental. It's because they both had the same underlying need: an optimally un-hilly route from A to B. And so the railway companies bought out the canals for their rights-of-way, or the canal owners moved themselves into the railway business. Not sure if there's anything analogous in computer tech.

        2. Sprismoid
          FAIL

          Re: "the incumbent always misses the next wave"

          I would have thought the ginormous contracts with Foxconn would certainly generate a lot of 'hardware development and manufacturing'...

          Peter

  16. maccy
    Pint

    google car

    400,000 miles of random driving? You would have thought the computer behind the wheel would have sobered up by now.

    1. TeeCee Gold badge
      Coat

      Re: google car

      Google make the car, Apple make the maps.....

  17. Anonymous Coward
    Anonymous Coward

    Does he still get called Herman the German?

    1. Anonymous Coward
      Anonymous Coward

      He is German and called Herman, so of course he still gets called Herman the German, or occasionally Herman ze German. I think it's the law.

      1. Simon Harris Silver badge
        Headmaster

        Except he's Austrian

  18. jubtastic1
    Terminator

    You can call me AI

    We don't want or need a true AI, it would be too busy unraveling the universe / looking at flowers / slacking off / exterminating meatbags to be of any use to us, it would be exactly like creating an omnipotent angsty teenager.

    Smart computers on the other hand, that can interpret all the nuances of human communication and register context but don't have their own agenda would be very useful, and who leads the field here? Apple and Google.

    We're already over the threshold of the next wave of computing, the incumbents are on the case and unless ARM smashes into the datacentre to handle the processing they're going to remain as an enabling yet bit part player in the grand scheme of things.

    1. Dave 126 Silver badge

      Re: You can call me AI

      Though deliberately his own atheist Utopia, Iain M Banks' Culture sci-fi concerns a society of powerful AI Minds and hedonistic humans. Banks doesn't really explore too deeply why the Minds keep humans around, other than perhaps for their own amusement. Other Minds get kicks out of hunting down 'Hegemonizing Swarms' - little clouds of Von Neumann machines.

      Asimov wrote quite a few stories about Multivac, a central computer that looks after all administration on behalf of humanity - in one story, Multivac manipulates a man into destroying it, since it is its considered opinion that humanity would be better off taking responsibility for itself.

      Then there is that great moment when a human figure blasts through a wall and reveals itself to be R. Daneel Olivaw, now capable of breaking the 'first law' and hurting individual people if it furthers the aim of the 'Zeroth law' - protecting humanity.

      1. TeeCee Gold badge

        Re: You can call me AI

        Banks doesn't really explore too deeply why the Minds keep humans around, other than perhaps for their own amusement.

        It was mentioned. From memory and heavily paraphrased, every culture (small "c") that builds Minds unintentionally colours their thinking with their own view of the universe, morality, etc. Thus the Culture (big "C") Minds have a lot in common with their human counterparts, enjoy their company and would miss them if they weren't around.

        The Culture had worked this out and tried building Minds lacking any cultural (any bloody "c" you like) bias. These would wake up, look at the universe and as soon as they had got their bearings, immediately sublime. A fact that pissed off The Culture greatly, although they kept repeating the experiment in the hope that one of them would hang around long enough to tell them what they were doing wrong.

        1. Nigel 11

          Re: You can call me AI

          In the Culture universe, there's no competition for resources, especially not between Minds and humans.

          I expect that if we ever get as far as AI in our own universe, something similar will happen. Once we've accepted that AIs deserve to be treated as autonomous thinking creatures with "human" rights, it will become apparent that silicon-based intelligence is much better-suited to vacuum than to moist oxidizing atmospheres. So the AI-expansionist-tendency will expand outwards, leaving a few human-loving AIs to get along with the bio-life that can't breathe vacuum.

          They'd also be much better-suited to the deep time needed for interstellar travel at less than light-speed. Somewhat ironically, the way to make ten-thousand-year journeys tolerable is to slow down one's clock-rate, thereby greatly reducing the subjective span of time.

          1. Dazed and Confused

            Re: You can call me AI

            > Somewhat ironically, the way to make ten-thousand-year journeys tolerable is to slow down one's clock-rate, thereby greatly reducing the subjective span of time.

            Which is also an idea that Banks explores, although not in his Culture universe. In The Algebraist the Dwellers slow themselves down.

            Of course the other approach to making ten-thousand-year journeys tolerable is to speed up to relativistic speeds, where the time you spend on a journey is considerably shortened; just don't expect to find anything still waiting for you when you get back.
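The arithmetic behind that second approach is just special-relativistic time dilation. A quick sketch (the journey length and speed below are illustrative numbers, not from the comment):

```python
import math

def proper_time(journey_years: float, v_fraction_of_c: float) -> float:
    """Subjective (on-board) time, in years, for a journey lasting
    `journey_years` in the rest frame, travelled at `v_fraction_of_c`
    times the speed of light."""
    gamma = 1.0 / math.sqrt(1.0 - v_fraction_of_c ** 2)  # Lorentz factor
    return journey_years / gamma

# A 10,000-year journey at 99.9% of c is about 447 years on board.
print(round(proper_time(10_000, 0.999)))
```

Both tricks - slowing your clock rate or raising gamma - shrink the subjective span; neither shortens the wait for whoever stayed at home, which is the poster's point about nothing waiting for you.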

  19. John Smith 19 Gold badge
    Meh

    "Historical inevitablility" yadda yadda.

    B***cks.

    A cursory look below the surface of any major change quickly shows there are always places where "The Revolution" could have gone a different way, or just petered out.

    Marx's inevitable "dictatorship of the proletariat" turned out not to be inevitable after all. And really, how much better does most software adapt to its users? It's got lots of options, but how much of it "self tunes" based on its user's identity (and can you override it if it gets it wrong)? Maybe that's because no one trusts it, maybe that's because doing it right is damn hard work.

    Let me suggest all successful large scale changes require a) Funding (could be peanuts, could be billions), b) Organisation (the right people with a good plan and peanuts can beat the wrong people with a fortune) and c) Security, which may be simply that no one believes they can do it in the first place.

    AFAIK the only things certain are that 90% of the human race will definitely pay taxes and 100% of us will die barring some really major medical advances.

    So let me suggest that Dr Hauser's is one possible future. Whether or not it's one you want to be a part of and want to help make real is another question.

    1. Anonymous Coward
      Anonymous Coward

      Re: "Historical inevitablility" yadda yadda.

      Sorry but that's B***cks John.

      Some things are inevitable (barring unexpected destruction of the planet etc.). When I started in this business in the 1980s all this stuff like smartphones, tablets, flat screens was regarded as part of the future among many of us, the tricky part being making it happen. Mobile is nothing more than performance per watt and small-scale fabrication, all of which was known to be doable in a profitable way. Likewise networking. It's about real science, maths and engineering, not daft pseudo-scientific babble like Marx was proclaiming.

      Certainly the shape of businesses built around technology is not inevitable, nor the exact nature of popular devices, or their role in society. Yet we'd have windowed operating systems today if Microsoft had never happened, search engines without Google, tablets without Apple. Probably not all that different to anything we use today, as the seeds were sown long ago. Sure, some applications are unpredictable, such as the role of advertising, the cultural response to privacy issues, political tyranny. One of the few surprises to me from my vision of tech 25 years ago has been the apparent willingness of people to surrender privacy and human rights to some corporations and governments. But technology turned out to be very predictable, and I expect that to continue for quite some time, always excepting any step changes such as a breakthrough in quantum computing.

      1. John Smith 19 Gold badge
        Happy

        Re: "Historical inevitablility" yadda yadda.

        Oh yes, and the first reply will be an AC.

        "When I started in this business in the 1980s all this stuff like smartphones, tablets, flat screens was regarded as part of the future among many of us, the tricky part being making it happen."

        I think you're revising your memories to tell a story. Phones getting smaller, yes. Phones becoming computers, no.

        Getting things smaller (essentially the practical application of Moore's Law), certainly. What to do with it is another matter.

        Another "story" has mobile phone companies driving the evolution, storing the stuff on your phone on their servers (mainframes, server farm, "cloud" or whatever you want to call it) and transitioning into companies that offer personal information management services, like all the non-search stuff that Google offers. Their goal? Maximise battery life so you run up bigger bills talking to your friends, of course. A nudge here, a nudge there, and the world you live in changes entirely.

        "Certainly the shape of businesses built around technology is not inevitable, the exact nature of popular devices, or their role in society."

        That is exactly the point. The future always comes. What it looks like is never that fixed. The interactions (between business, social behaviour, technology) are complex. I'm told 80% of people cannot touch type. Therefore cursive handwriting recognition is a guaranteed win, right? 30 years on it still hasn't happened.

        Look up "Active Book Co" for an example.

        1. heyrick Silver badge
          Thumb Up

          Re: "Historical inevitablility" yadda yadda.

          I think you're revising your memories to tell a story. Phones getting smaller, yes. Phones becoming computers, no.

          Exactly. Get your mind back to the eighties and concentrate really hard. A modern SoC with half a gigabyte on top is smaller than an 8K EPROM. I can look inside the EPROM's window and see the memory array inside. It is larger than the 16GB Flash chip inside a microSD card.

          But it isn't just sizes. On my (VoIP) land line I can call pretty much anywhere, anytime, for free. If you lived in the UK in the '80s you might remember those little orange books with arcane dialling codes so you could call nearby places in different code areas without being hit for a national rate call (IIRC that was anything over 35 miles, but they must have counted miles by telephone wiring).

          Do you remember Prestel? You could read a blocky teletexty 40x25 page of information and it would say "5p" in the corner of the screen? That didn't mean five pages, that meant you just paid 5 pence to look at that in addition to connection charges and time-on-the-line charges and frankly GPO telecom was horribly expensive. But at least you could pick up the phone and dial "00" to place an international call. Some places (hello Baltimore!) still needed operator assistance to call overseas.

          On the other hand, these days we can get more information than we know what to do with in seconds. It is either "free" and mostly unlimited as part of a broadband package, or you'd get 200MB-1GB per month as part of a mobile subscription. Do you have any idea how much coin you'd drop on a 20MB SASI harddisc in the late eighties? This was an era when many kids loaded their games from cassette tape! Now we can fill one of those harddiscs in... about five minutes... with data pulled from all sorts of places on the planet.

          You could buy a pocket television. Something like the Sony Watchman (itty bitty flat CRT but good enough resolution to read teletext on BBC2 on a screen just over an inch or two across!). If you were lucky you might be able to hook it to a video player, though often that meant a piece of wire wrapped around the antenna and tuning the television in to the signal (varying degrees of success). We probably would have mocked the hell out of somebody that said in 2012 it would be "normal" to dump dozens of full length feature films, a pile of animé, the contents of every tape and LP you own, and hundreds of documents on to a gadget with a full colour display on the front, a gadget that can be a movie player, a book, a tape deck, a camera, and a telephone. Oh, and it will power itself by a little battery inside that will give several hours of continual use, you can interact with it by prodding it with a finger, and the thing itself will fit inside a cigarette packet.

          Once upon a time you kind of tended to give a wide berth to the crazies that walked around talking to themselves. They were either geniuses on the brink of a meltdown, or just plain crazy. Now you see grannies in the supermarket yakking to the air and you realise she's probably involved in a long discussion about beetroot with her husband, who couldn't care less but is in no way brave enough to say that. That's not a hearing aid, it's a Bluetooth earpiece!

          But even better, nowadays I can walk into the middle of a muddy field in rural France in a place full of wheat and cows and bugger-all else worth mentioning for fifty miles in any direction, and watch NHK World live broadcast. Not so long ago (and much more recent than the eighties), it was harder to receive Channel 5!

          A friend and I used to CB to each other. It was complicated and for getting good reception it involved tuning the antenna and caring about what sort of power supply the transceiver was connected to, and so on and so on. We used CEPT (PR27GB) sets as it was often a lot quieter than the 27/81 frequencies; except when the weather was such that we were swamped by excitable Europeans shouting at each other with equipment reaching a heck of a lot further than my 4W could ever manage. Who needs CB now? The cheap version is to buy a PMR radio - almost as capable and doesn't need a licence. Or, if you want to talk to people in other countries, you could use Skype or (once upon a time before they buggered it up) Google Talk, both of which permit not only voice discussion and sending files, but also slow and jerky but usable streaming video. You can see the person you are talking to. They could see me, standing in the muddy field, in the middle of nowhere.

          So to wrap up, I think you'll find that much of the technology we take for granted was predicted, but it was predicted in science fiction books, alongside future tech for an age when we would live in space in giant orbiting wheels (centrifugal force for gravity). We haven't done so well on that front, perhaps mostly because our way of getting into space still involves sitting on a controlled (and sometimes not so controlled) explosion. But, yes, a tiny handheld gadget that provides access to the world, affordable to the masses. We are so damn close to a real life Tricorder...

  20. Christian Berger Silver badge

    We'd need better education first

    In order to make sufficient progress in computing, we'd need to have actual computer literacy in our society. That doesn't mean that everybody needs to be able to program large software packages, but that people can understand what a computer actually is. People know what a book is, and they could write a little text if they had to, yet few write whole books. However, this knowledge is essential for widespread computer use. People need to know which limitations are inherent in the technology and which ones are arbitrary, chosen by the designer of the system. Only then can they really choose which systems they want, or what changes they want to existing systems. Some of them may even be able to make those changes themselves.

    The next point is that now that we have lots of data, we can make more interesting interfaces. Completely natural interfaces are next to impossible; however, you can meet somewhere in the middle between natural language and computing language. You end up with something like SQL, which, once you put a bit of effort into learning it, allows you to efficiently put complex questions to a computer which it will answer.
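The SQL point can be made concrete with Python's built-in sqlite3 module; the table and data here are invented purely for illustration:

```python
import sqlite3

# An in-memory database with a hypothetical table of library loans.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE loans (borrower TEXT, title TEXT, days_overdue INTEGER)")
conn.executemany("INSERT INTO loans VALUES (?, ?, ?)", [
    ("alice", "Excession", 0),
    ("bob", "The Algebraist", 12),
    ("carol", "I, Robot", 3),
])

# A question stated halfway between natural and machine language:
# "Who has books more than a week overdue, and how late are they?"
rows = conn.execute(
    "SELECT borrower, days_overdue FROM loans "
    "WHERE days_overdue > 7 ORDER BY days_overdue DESC"
).fetchall()
print(rows)  # [('bob', 12)]
```

The query reads almost like the English question, yet is precise enough for the machine to answer unambiguously - which is exactly the middle ground the comment describes.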

  21. Jason Hindle

    I'm not so sure....

    Intel will make some of the components that drive the sixth wave. Apple invented ubiquitous computing with the iPad, though they lack the breadth of Google's vision, which is quite frankly frightening. However, Google's Glass presents more questions than answers. I don't think this kind of clunky technology will be truly useful until we have implants to do the job. Then we will all be assimilated.

    1. Anonymous Coward
      Anonymous Coward

      Re: I'm not so sure....

      No. Apple popularized tablet computing, catching those who felt the technology was not quite ready on the hop.

    2. This post has been deleted by its author

    3. Roo
      Headmaster

      Re: I'm not so sure....

      "Apple invented ubiquitous computing with the iPad" earned a downvote on two counts.

      1) Apple didn't even invent the tablet form factor; there is a substantial body of prior art, including Kubrick's 2001.

      2) The term "ubiquitous computing" appears to be something that Mark Weiser came up with in 1988.

      I sincerely hope that you treat Apple's IP as respectfully as you treat IP belonging to others.

      1. Jason Hindle

        Re: I'm not so sure....

        "1) Apple didn't even invent the tablet form factor, there is a substantial body of prior art, including Kubrick's 2001."

        That certainly didn't make computing ubiquitous.

        "2) The term "ubiquitous computing" appears to be something that Mark Weiser came up with in 1988."

        Sorry, I didn't realise anyone had to come up with the idea. There are two words: ubiquitous and computing. You can look them up. When combined, surely you can make your own mind up as to what this means? To me, it means lots of people using computing devices constantly throughout their lives and doing computing-type things (i.e. using useful apps and services). The desktop computer didn't achieve this since you can walk away from a desk. Ditto for the laptop since its lack of immediacy detracts from the portability in this respect. The early smart phones didn't make computing ubiquitous because users largely ignored the smart part.

        The modern phone and tablet form factor (i.e. the combination of hardware, software and immediacy) OTOH do make computing far more ubiquitous. Who would you like the credit for that to go to? Kubrick? Kay? One of the doers who actually made it happen?

        "I sincerely hope that you treat Apples IP as respectfully as you treat IP belonging to others."

        What the hell are you going on about? I didn't infringe anyone's IP.

        1. Roo
          WTF?

          Re: I'm not so sure....

          "There are two words: ubiquitous and computing. You can look them up."

          I did look them up, and that's how I came across a reference to Mark Weiser's work in 1988.

          "When combined, surely you can make your own mind up as to what this means?"

          I did my research and I made up my own mind, so mission accomplished.

          Ah, and then we get to another nebulous phrase:

          "the laptop since its lack of immediacy detracts from the portability in this respect"

          Granted I've never seen 'immediacy' on any laptop spec sheet, but then again that hasn't popped up on any phone brochures I've seen either. Please give us a definition of what "lack of immediacy" actually means in this context. The only clue we have to your interpretation of that phrase is:

          "The early smart phones didn't make computing ubiquitous because users largely ignored the smart part."

          I beg to differ on this point, because I happen to believe that stuff will still exist even if I was to ignore it. Despite my best ignoring efforts Microsoft, Oracle, Celebritards and Politicians still continue to exist.

          You have been very lucky to survive crossing the road while computing ubiquitously with your phone all this time since the iPad was released.

        2. heyrick Silver badge

          Re: I'm not so sure....

          Who would you like the credit for that to go to? Kubrick? Kay? One of the doers who actually made it happen?

          Why does there have to be any specific person to give credit to? You know, all around the developed world we have internet access fast enough to deal with streaming radio and some streaming TV (I can watch a fair few channels on my mere 2Mbit). But who takes the credit? Think before you answer as no matter how cool the networking kit, it would mean little without the millions of miles of phone lines, and ADSL probably developed or learned from ISDN which developed or learned from analogue modems which... you see? There are people/companies responsible for the jumps but it also needs history and it needs evolution. The credits are many. I imagine it is a similar story for the smart phone and tablet. Sure, Apple saw an opening in the market and they went for it, but this was aided greatly by the right technology being in place at the right time and somebody clever enough to join the dots. There have been earlier attempts, that have failed for a variety of reasons (poor resolution, poor UI, poor battery life, can't do anything other than the built-in apps, etc etc).

          (i.e. the combination of hardware, software and immediacy)

          Immediacy? What, you mean like it doesn't take forever to start up? I grew up in the '80s, my first computer took about half a second to go "burrrr" and after a brief pause, it went "beep!". The boot took less time than it took for me to sit down from reaching around the back to the power switch.

          The early smart phones didn't make computing ubiquitous because users largely ignored the smart part.

          Don't mix up smart phones with feature phones. I think you'll find that, aside from a collection of crappy J2ME interpreters on devices with amazingly poor screens (102x102, 256 colours on one of my old phones), many of the feature phones basically didn't do much more than what they could do out of the box. I could savage my Nokia - I think a 6210i, but don't quote me on it. The J2ME wasn't bad, I could install OperaMini; however it used shared memory, so the more Java applets I installed, the more crashy the phone became, as memory it used for other stuff (like SMS) was no longer there. Then there is the email software that would only download one email before crashing due to a lack of memory (the same even with no applets on the phone). Some feature phones were good, some were awful. Most, you couldn't do much with beyond the feature set built in. Ever tried reading a mere text file on one? Did the software crash if the file was >64K?

          Perhaps the single biggest advance of the smartphone (of any flavour) is that an app is not an afterthought hidden in some menu option, but right there on the front line. PDF readers, email, Amazon, MXPlayer, manga reader, map, browser, phone dialler and address book - the phone doesn't make any specific distinction. Everything can be given an icon and tapping on it makes it start. The built-in stuff, the stuff you download, the stuff you might write. In this way, while you are running Apple or Android or what-have-you, you are able to customise the thing to your own personal tastes. Then throw in the notifications and the widgets. Who just emailed you? What time is it? What song am I listening to? Will it rain? All this stuff can be presented directly on your home screen(s) so you don't even need to start an app to look at it. The ultimate personalisation.

    4. oolor
      Pirate

      Re: I'm not so sure....

      I have a feeling that Google Glass is more to do with input to a device (be it smartphone/tablet or watch) through a gesture-based virtual keyboard or the like. I also wonder how much of Google's crazy development ideas are to befuddle the competition while using it as a test platform for some processes related to their long term info collection/distribution. Almost as if they don't expect the main idea to make money, but rather they are focusing on what they learn while developing said ideas while competitors kill themselves trying to copy/follow/outdo.

      < I'd say that's pretty pirate, eh matey?

  22. This post has been deleted by its author

  23. BrentRBrian

    so that leaves ....

    AMD, ARM, Microsoft and Linux ?

    1. Dave 126 Silver badge

      Re: so that leaves ....

      AMD spun off their manufacturing to Global Foundries, so they are more like ARM now. MS have dabbled in hardware (mice and keyboards, later the XBOX and the Surface devices, but I imagine the physical production line belongs to someone else) but remain primarily software and services.

  24. Chris 69
    WTF?

    What's all this crap about looms and music boxes being digital?

    Well I looked at my musical box and, yes, there is either a pin or not a pin... BUT the distance between the pins is part of the stored data and that distance is ANALOG!

    I'm taking a guess that a loom might be similar but I don't happen to have one lying around.

    1. jonathanb Silver badge

      Re: What's all this crap about looms and music boxes being digital?

      Cloth has a certain number of threads per inch depending on the thickness of the thread used, so the punched cards would need to reflect that. Essentially you have an uncompressed bitmap image stored on the card.
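The "uncompressed bitmap" view of a Jacquard-style card is easy to sketch; the card geometry below is a toy model invented for illustration, not real loom dimensions:

```python
# Toy model of a Jacquard-style punched card: each row is one weft pass,
# each column one warp thread; 1 = hole (warp lifted), 0 = no hole.
card = [
    [1, 0, 1, 0, 1, 0, 1, 0],
    [0, 1, 0, 1, 0, 1, 0, 1],
    [1, 0, 1, 0, 1, 0, 1, 0],
]

def render(card):
    """Render the card as cloth would show it: '#' where the warp is lifted."""
    return "\n".join("".join("#" if hole else "." for hole in row) for row in card)

print(render(card))
# #.#.#.#.
# .#.#.#.#
# #.#.#.#.
```

The hole positions are quantised to the thread grid, which is why the card really is a digital bitmap rather than the analogue pin spacing of a music box.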

  25. apjanes
    Terminator

    Affirmative...

    "The humans are dead... we used poisonous gasses... and poisoned their asses."

    "We no longer say 'Yes' instead we say: 'Affirmative', unless it's a more colloquial situation with a few robo-friends"

    http://www.youtube.com/watch?v=B1BdQcJ2ZYY

  26. Anonymous Coward
    Anonymous Coward

    @Article

    "...the celebrated computer whiz..."

    Is he a computer whiz or a business whiz? At Acorn, during the start phase the computing side of things was handled by Chris Curry, Steve Furber, Andy Hopper, Sophie Wilson and Jim Mitchell. Hermann Hauser was the business brain behind the company.

    1. Anonymous Coward
      Anonymous Coward

      I don't know but

      this line suggests he has more whiz than business...

      " a Fellow of the Royal Society, the Institute of Physics and of the Royal Academy of Engineering"

  27. Will Godfrey Silver badge
    Happy

    So...

    until the 1990s the Earth was mostly ARMless

  28. Charles Manning

    This is bollocks

    This is nonsense on at least two fronts:

    First, the waves don't necessarily wipe each other out. The new waves just go into new market areas. We still have mainframes, PCs and phones coexisting. Sure, minicomputers got wiped out. The "sixth wave" is not going to replace any of the other stuff.

    Even if you have a driverless car, you'll still want an iPhone to tweet about it and a mainframe to run your banking services.

    Secondly, we've really had this "sixth wave" for a long time already. It is embedded computing that puts computers into cars, digital thermometers, washing machines,... For years now an Intel-inside PC has had more ARMs than Intel cores. Heck, even the typical hard drive has two or three ARM cores.

  29. wbw357

    Machine Learning a Panacea? HA!

    Have your machine learn this:

    "Time flies like an arrow."

    Now exactly what does that mean?

    1. "Flies of Time" admire arrows?

    2. "Flies of Time" think an arrow is a tasty thing to eat?

    3. The best way to measure the speed of a fly is to time it like you would the speed of an arrow?

    4. The best way to time how fast a fly completes some type of task (e.g., eating an ant) is to time it like you would time an arrow completing the same task?

    5. Time passes as quickly as an arrow flies?

    6. Time flies through the air in the same way that an arrow flies through the air?

    Machine learning does have a lot of useful applications, but its applicability is a lot less general than anyone really likes to admit.
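The combinatorics behind that ambiguity is easy to sketch: even a toy lexicon (the per-word tag options below are illustrative, not from any real tagger) produces a pile of candidate readings that a machine must somehow rank:

```python
from itertools import product

# Hypothetical part-of-speech options for the ambiguous words in
# "Time flies like an arrow" (a toy lexicon, invented for illustration).
lexicon = {
    "time":  ["noun", "verb", "adjective"],   # time passes / to time / time-flies
    "flies": ["verb", "noun"],                # it flies / the flies
    "like":  ["preposition", "verb"],         # like an arrow / flies that like arrows
}

# Every combination of tags is a candidate reading the machine must weigh up.
readings = list(product(*lexicon.values()))
print(len(readings))  # 12 candidate tag sequences for just three ambiguous words
```

Most of those twelve tag sequences correspond to the daft readings in the list above; picking the sensible one requires world knowledge, which is the poster's point.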

    1. Charles Manning

      Not human-level machine learning

      While the AI followers might be keen to see human-level machine intelligence, the truth is that much more basic "intelligence" still evades us.

      We still have not yet replicated the intelligence shown by an insect - let alone a lizard or a mammal.

  30. ben_myers
    Happy

    Wavy Dr Hauser

    And just what is Dr Hauser doing to prepare to ride the 6th wave?

  31. croc

    It's all about the Genes,,,

    I mean, those selfish genome type of Genes, not the other things that happen to get called Gene, like Gene Kelly.... But I digress. The Selfish Genes will win, no matter the wave. And the humans had better, by Gene, let them win, help them win, cheat for them to win even, if those poor lowly humans hope to have any chance of surviving. Because if the Selfish Genes DON'T win, they'll just take their balls and go home.

    Where's the Icon with the slobbery tongue-thingy when you need it?

    1. TeeCee Gold badge
      Coat

      Re: It's all about the Genes,,,

      Icon with the slobbery tongue-thingy

      Gene Simmons?

  32. Anonymous Coward
    Anonymous Coward

    Here come Blackberry

    Ride that wave, Alicia Keys!

  33. Ashley Stevens

    400,000 miles of random driving

    He discussed the Google Car as a manifestation of sixth-wave computing, observing that it "has had 400,000 miles of random driving without any accidents".

    Sounds like my mum!

    1. Mitoo Bobsworth

      Re: 400,000 miles of random driving

      Not the car for me - if I'm going to be driven at random, I'll be late for - everything!

  34. kosh
    Go

    I think it's great that he's found something to fall back on after professional cycling.

This topic is closed for new posts.

Biting the hand that feeds IT © 1998–2019