Intel: New x86 AI instructions

Intel is adding new x86 instructions ideal for machine-learning applications to its upcoming Intel Xeon and Xeon Phi processors. The instructions are part of the AVX-512 family: they are described by Chipzilla as "vector instructions for deep learning" with enhanced word-variable precision and single-precision floating-point …
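The snippet doesn't name the exact instructions, but the kernel such "deep learning" vector instructions accelerate is the fused multiply-accumulate, applied across many register lanes at once. A minimal pure-Python sketch of the semantics (illustration only; the real AVX-512 extensions operate on 512-bit registers, e.g. 16 single-precision lanes per instruction):

```python
# Sketch of the vector fused multiply-accumulate (FMA) pattern:
# acc[i] += a[i] * b[i] for every lane, in one hardware instruction.
# Lane count of 4 is arbitrary here; AVX-512 would use 16 FP32 lanes.

def fma_lanes(acc, a, b):
    """One vector FMA step: per-lane acc[i] + a[i] * b[i]."""
    return [x + y * z for x, y, z in zip(acc, a, b)]

def dot(a, b, lanes=4):
    """Dot product (the core of matrix multiply and hence neural-net
    inference) as repeated vector FMAs over chunks of `lanes`."""
    acc = [0.0] * lanes
    for i in range(0, len(a), lanes):
        acc = fma_lanes(acc, a[i:i + lanes], b[i:i + lanes])
    return sum(acc)  # final horizontal reduction across lanes

print(dot([1, 2, 3, 4, 5, 6, 7, 8],
          [1, 1, 1, 1, 2, 2, 2, 2]))  # 62.0
```

The "variable precision" angle in the article is about doing these accumulations in narrower formats than FP32 to trade accuracy for throughput.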

  1. JeffyPoooh Silver badge
    Pint

    A.I. is still hard.

    Especially in the real world.

      Anonymous Coward

      Re: A.I. is still hard.

      This isn't A.I.

      This is AI, 2016 buzzword of the year. It means that we can finally sort through that "big data" (buzzword 2014), generated by the InternetOfThings (buzzword 2015).

      It's also an excuse for churnalists to write stories about the robot revolution. A flawed analysis even by mid 20th century tech-optimism standards.

      FINALLY adverts will be relevant. Blocked but relevant.

      1. Charles 9 Silver badge

        Re: A.I. is still hard.

        Bet you they'll use this to develop ways to make the ads unblockable.

        1. Dwarf Silver badge

          Re: A.I. is still hard.

          @Charles 9

          And of course, ad blockers can use the same to become more intelligent.

          It will just be another cycle on the endless treadmill.

      2. Rafael 1
        Trollface

        Re: A.I. is still hard.

        Big Data? InternetOfThings?

        Don't forget that you'll need DevOPS to create faster, better, cleaner code! Please attend one of our seminars, or just send some money in a box.

      3. Sailfish

        Re: A.I. is still hard.

        #churnalists - snicker, snicker

      Anonymous Coward

      Re: A.I. is still hard.

      The only difference between what these instructions do and what I was doing back in 1998 is that it's assembler. I had to use rather larger amounts of code to accomplish the same results, right down to the training of the beast. That this is found 'hard' has far more to do with the methods of teaching this subject area, which haven't changed much at all over the last couple of decades. Just quicker to the same results.

      What will be interesting is to see if they allow non-linearity in the model. That's what I was doing back then. Linear is child's play.
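The linear/non-linear distinction the commenter is making can be shown in a few lines: stacking purely linear layers collapses to a single linear map, so depth buys nothing until a non-linearity sits between the layers. A hedged one-dimensional sketch (the layer names and values here are illustrative, not from the article):

```python
import math

def linear(w, b, x):
    """A 1-D 'layer': w*x + b."""
    return w * x + b

# Two stacked linear layers collapse to one linear map:
#   w2*(w1*x + b1) + b2  ==  (w2*w1)*x + (w2*b1 + b2)
w1, b1, w2, b2 = 2.0, 1.0, 3.0, -1.0
x = 5.0
stacked = linear(w2, b2, linear(w1, b1, x))
collapsed = linear(w2 * w1, w2 * b1 + b2, x)
print(stacked == collapsed)  # True: depth alone adds no power

# Insert a non-linearity (tanh here) between the layers and the
# composition is no longer expressible as any single w*x + b:
nonlinear = linear(w2, b2, math.tanh(linear(w1, b1, x)))
```

The same collapse argument holds in any number of dimensions with matrices in place of the scalars `w1`, `w2`.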

        Anonymous Coward

        It's like that Spielberg "AI" movie: You hope for the real thing but in the end it's just schlock.

        Always remember "AI" is just whatever people are interested in at the moment to get more out of the machine or the sales department (i.e. "Advanced Informatics"). Object-Orientation, Functional Programming, Meta-Programming and Homoiconicity were all "AI" once. I am sure that in time logic programming will be back and then we will get special instructions for that etc. etc.

        That being said, wouldn't it make much more sense to have fit-for-purpose specialized hardware off-CPU, which could even do analog processing much more quickly? Sounds like someone has found a hammer...

        1. thx1138v2

          Optional AI? Silly you.

          "That being said, wouldn't it make much more sense to have fit-for-purpose specialized hardware off-CPU..."

          Off CPU? Surely, you jest. That removes the possibility of them cramming it down your throat whether or not you want their AI spying on you. Or Alphabet's or Microsoft's or Yahoo's AI.

            Anonymous Coward

            Re: Optional AI? Silly you.

            This is just some ASIC built in, everyone is right that this is just some marketing crap. Odds are this tech will be hijacked for co-processing multi-angle V.R. porn, even there it won't be considered a money shot.

    3. PaklNet
      Meh

      Re: A.I. is still hard.

      Yes, A.I. is very hard in the real world. Far easier in games like chess and Go.

    Anonymous Coward

    What?! Still no DWIM EAX, EAX?!

    The mysteriously missing body here.

  3. Mage Silver badge

    Marketing

    See title.

      Anonymous Coward

      Re: Marketing

      Diplodocus trying to compete with nVidia?

    Anonymous Coward

    Interesting

    We all know that the software running in/on the CPU phones home with slurp data, and web applications track our every move, but it seems odd that Intel hardware doesn't slurp and phone home ... or does it? I mean, why wouldn't they? Everyone else does ....


Biting the hand that feeds IT © 1998–2019