Inside Nvidia's Pascal-powered Tesla P100: What's the big deal?

So there it is: the long-awaited Nvidia Pascal architecture GPU. It's the GP100, and it will debut in the Tesla P100, which is aimed at high-performance computing (think supercomputers simulating the weather and nuke fuel) and deep-learning artificial intelligence systems. The P100, revealed today at Nvidia's GPU Tech …

  1. Anonymous Coward

    I think the most likely consumer devices that chip would end up in would be a space heater.

    1. Brian Miller

      Done that...

      When I worked as a technician, I used a Celerity computer as a space heater, literally. The workshop was located in a room that used to house many mainframes, and the air conditioning was still turned up. Chilly wind tunnel, that spot. So I used one of the mini supercomputers to give me some warmth at the bench.

      However, a couple of these, and you'll be on the bottom of the Top500 list!

      1. This post has been deleted by its author

  2. Mikel

    Drool

    It's... Beautiful....

  3. Charles 9

    OK, all fine and dandy, but what about some stats? How well will this Pascal handle something like Fallout 4 compared to Maxwell?

    PS. Sorry, but Crysis is getting a little old to keep up the joke.

    1. diodesign (Written by Reg staff) Silver badge

      Re: Charles 9

      We'll try to find out - GTC lasts a week. Thing is, Nvidia is only talking about the P100. They refuse to speak (publicly) about future Pascal products.

      C.

      1. WonkoTheSane

        Re: Charles 9

        IF there is an actual Pascal-chipped consumer GPU coming, I think the reveal won't be until E3, in June.

        1. Sir Runcible Spoon

          Re: Charles 9

          It seems to me that whilst Pascal is quite grunty, it isn't aimed at graphics processing as its primary function.

          So it will probably compete with the 980 Ti out of the door, and at a lower price point - but I expect the more consumer-oriented Polaris GPUs might well steal the show (for this generation at least).

  4. Anonymous Coward

    Great if you've got a suitable problem

    As usual, these things are great if you have a problem which is a good fit to what it can do. In the science world, that would be either something which spends all of its time inside BLAS or FFT, or at least has a compact, highly-vectorizable AND highly-concurrent kernel at its heart.

    If, on the other hand, you have a problem with a flat execution profile (so that you'd need to rework all the logic and rewrite 1e5 LOC to take advantage of a GPU), and, god forbid, long dependence chains everywhere (so that you gain no advantage no matter how much you rearrange things), the whole thing becomes a very expensive equivalent of a vintage Celeron.

    I know Cray was ridiculed for his "If you were plowing a field, which would you rather use: Two strong oxen or 1024 chickens?", but I think a balanced diet is essential for long-term health - and lately, all we've been having is chicken.

    1. HmmmYes

      Re: Great if you've got a suitable problem

      Yeah but ... what happens when you need 1024 strong oxen?

    2. Yet Another Anonymous coward Silver badge

      Re: Great if you've got a suitable problem

      But what if you were looking for a single bit of chicken feed in 1024 fields ?

  5. Code For Broke

    Deep Learning?

    I'm sorry, but I don't believe a word of this Deep Learning b.s.

    Any concept whose name immediately begs questions about the inverse of its name just rings painfully in my ears as the output of publish-or-perish academics with absolutely no experience in value creation. It's jargon backed up by explanations of jargon with jargon to help you understand the jargon in the jargon.

    Having spent some time with Professor Wikipedia just now, I'm only more confident in my beliefs.

    Has anyone a glass of Deep Learning Kool-Aid for me to drink? Am I wrong? Please enlighten me?

    1. Matthew Taylor

      Re: Deep Learning?

      "Deep" refers to the new algorithms' abilities to learn deep architectures - that is, neural networks with serveral (3 or 4) hidden layers. The multiple hidden layers are what give the NN it's power, but historically, machine learning algorithms have performed poorly on such neural networks.

      The difference in the first deep learning methods was to train the neural networks layer by layer, in an unsupervised manner. This did a lot to "sort out" the data, and made the later back propagation phase more tractable.

      Since then, a variety of different architectures and methods have been discovered, and the phrase "Deep Learning" has broadened somewhat to represent the new resurgence in neural network methods.

      Deep learning is absolutely not B.S. - though calling it A.I. is a little misleading. What it is, is a much more powerful data modelling / prediction paradigm, which allows a new class of applications (speech recognition, language translation, etc).
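      To make "several hidden layers" concrete, here's a minimal sketch of the forward pass through a deep network (the layer sizes and NumPy code are purely illustrative assumptions, not anything from the article): each hidden layer re-transforms the previous layer's output, and training - classically back propagation - would adjust the weights.

```python
import numpy as np

def relu(x):
    # Simple nonlinearity; without one, stacked layers collapse to a single linear map
    return np.maximum(0.0, x)

rng = np.random.default_rng(0)

# A "deep" architecture: an input layer, three hidden layers, and an output layer
layer_sizes = [8, 16, 16, 16, 4]
weights = [rng.standard_normal((m, n)) * 0.1
           for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]

def forward(x):
    # Each hidden layer transforms the previous layer's activations
    for w in weights[:-1]:
        x = relu(x @ w)
    return x @ weights[-1]

out = forward(rng.standard_normal(8))
print(out.shape)  # (4,)
```

      GPUs like the P100 matter here because every layer is essentially one big matrix multiply, exactly the kind of dense, concurrent arithmetic they are built for.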

      1. Yet Another Anonymous coward Silver badge

        Re: Deep Learning?

        Deep learning is simple.

        Here is an equation describing the answer you should get for a bunch of input parameters.

        Tell me the parameters that give me the best answer - you don't need to show your working.
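          That quip can be made literal with a toy example - a hypothetical two-parameter "equation" fitted by gradient descent, where the machine finds "the parameters that give the best answer" without ever showing its working (the data and learning rate below are made up for illustration):

```python
# Toy "best answer" search: fit a and b in y = a*x + b
# by gradient descent on the mean squared error.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]  # generated by a=2, b=1

a, b, lr = 0.0, 0.0, 0.01
for _ in range(5000):
    # Gradients of the mean squared error with respect to a and b
    grad_a = sum(2 * (a * x + b - y) * x for x, y in zip(xs, ys)) / len(xs)
    grad_b = sum(2 * (a * x + b - y) for x, y in zip(xs, ys)) / len(xs)
    a -= lr * grad_a
    b -= lr * grad_b

print(round(a, 2), round(b, 2))  # should land close to 2.0 and 1.0
```

          Deep learning does the same thing, except the "equation" has millions of parameters - hence nobody can show the working.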

        1. Code For Broke

          Re: Deep Learning?

          Oh, it's all about finding an undefined "best answer." Perfect. I totally get it now.

          Not.

          1. hattivat

            Re: Deep Learning?

            It's a way of making the computer figure out through trial and error how to do something that you wouldn't know how to hand-code. For example, how to tell the difference between a pedestrian and a cardboard cutout in pouring rain. Hence "undefined best answer" - because you don't need to know (and for many of these problems, couldn't possibly know, because humans have limitations) what the solution is beforehand.

            It's like training a dog to jump through hoops - you don't take a microscope and a scalpel and manually reconnect its neurons, you don't even need to know which parts of its brain control which muscle, you just make it jump until it learns how to do it right by itself. In this example, deep learning would be the method of designing an artificial dog so that it can be taught to jump through hoops.

            1. Code For Broke

              Re: Deep Learning?

              @hattivat: Now that was something I could sink my braincells into.

              I'm not agnostic to the concept of AI. And I appreciate that some really amazing programming is happening right now that allows for astonishing pattern detection. I clearly don't understand it, and I am frustrated that I'm probably just not bright enough to be able to; but I don't deny it.

              But I balk at giving these and other concepts names that attempt to anthropomorphize what the process is doing. That is just the arrogant, self-deification of, again, mostly the academic set who come up with pure theory, no practice and sure as hell no meaningful value.

              1. Matthew Taylor

                Re: Deep Learning?

                "I balk at giving these and other concepts names that attempt to anthropomorphize what the process is doing."

                Given you don't have any knowledge of the subject at all, it's not clear what business you have "balking" at what people in that subject area choose to name things. The anthropomorphic terms have come from the inspiration NN researchers have taken from the brain over the years. Not the other way round, and there's nothing particularly high-falutin' about them once you get to grips with it.

                If you refer to a "block of code", might I say "A block is what you build a tower with, stop anthropomorphizing"? We use words metaphorically all the time. You just happen not to know these particular machine learning terms, and for some reason have taken exception to them.

                You also mischaracterize the "academic set". Their mindset is that of highly motivated programmers, armed with the specialist knowledge that anyone would have after studying something for a long time. To be honest, it just sounds like you have a chip on your shoulder.

      2. Code For Broke

        Re: Deep Learning?

        Sorry Matthew, but you've delivered a response that is, again, jammed up with jargon.

        Architecture is how the bits of wood or steel are fitted up to make a building.

        Neural concerns neurons, which are nerve cells in a living organism.

        Hidden layers are found in a lasagna, which is a fine meal to take a mourning friend.

        Propagation is what males and females do when bored/anxious/angry to pass the time... oh, and make babies.

        None of these concepts pertain to computing. They are the entropic banter of academics, bent on turning a stream of consciousness into a way to pay the mortgage.

        Can someone please explain what Deep Learning is without the shadow puppets and microtonal soundtrack?

    2. Pascal Monett Silver badge

      Re: Deep Learning?

      I don't know if you're wrong, per se, but I viewed a marketing spiel from IBM about its Watson not long ago and I have to admit it scared me somewhat. Was it the capabilities narrated by the casually attractive female voice? Was it the impact of the vision of the future that it generated in my mind? Or was it the idea of a computer capable of "discerning intent"? Probably a bit of everything.

      See if you get scared here.

  6. Skizz

    When the title mentioned Pascal I immediately thought of the programming language and a step backwards in performance. I need to get out more.

    1. To Mars in Man Bras!
      Angel

      @Skizz

      When the title mentioned Tesla I immediately thought of the self-driving cars and a step forwards in performance. I need to stay in more.

    2. jeffdyer

      Why a step backwards in performance?

      Pascal produces native code and is still very fast.

  7. Warm Braw

    Future products...

    Will the newer, faster version be called Turbo Pascal?

  8. HmmmYes

    I remember when graphics cards were just dual-port RAM framebuffers with a blitter shoved on top...

    Stunning, isn't it.

    Who knows, give it 20 years and there'll be one in your toaster.

    1. Anonymous Coward

      Perhaps. It already chucks out more heat than a toaster.

  9. jms222

    Sorry but I've worked in automotive and can see no conceivable reason why you'd need something like that in a car.

    I still maintain a system written in Turbo Pascal that runs in a DOS box under Windows Me, and yes, it's connected to the outside world not via a firewall, and no, this has never been a problem.

    1. Anonymous Coward

      If the above comments are anything to go by, the point may be having things like voice recognition on-chip. That is, having the "learnt" part of the code actually residing in the chip for really fast response (and no need to send data to Google's servers!).

    2. hattivat

      Simple - autopilot. Lightning-quick recognition of pedestrians, other cars, road boundaries, etc. with stellar reliability regardless of conditions (rain, dust, etc.) is absolutely essential for the development of autonomous cars and a decidedly non-trivial computing task. You need advanced neural networks for that, and nowadays these run on GPUs (CPUs are ~10x slower for this application).

  10. schlechtj

    Voice recognition

    Microsoft operating systems have been doing the kind of voice recognition you're talking about since Vista - and if you had Dragon or something similar, before that. I don't know why Google still makes you send data to its servers, because phones today are perfectly capable of handling that task. It's probably a program to get voiceprints of everyone on Earth so they can sell them or leverage you into buying something. Or perhaps the government may want it to identify people in disguise. Humm... I think I'm going back to typing......

    1. Sir Runcible Spoon
      Black Helicopters

      Re: Voice recognition

      "I don't know why google still makes you send data to its servers"

      Go on, have a guess.

    2. Anonymous Coward

      Re: Voice recognition

      A bit of both. Possible patent side stepping and also for data collection.

  11. NanoMeter

    I want one

    to help me win the lottery. But to be able to afford one, I need to win the lottery...

  12. Dick Pountain

    "The Tesla P100 supports up to four 40GB/s bidirectional links which can be used to read and write to memory."

    I've always wondered how long it would take for the Americans to reinvent the Transputer. Answer: 30 years

  13. Anonymous Coward

    I'm willing to bet..

    .. that if you let Microsoft write their next version of Windows for it, they would waste so many CPU cycles it would still take 2 minutes to boot. They just can't help themselves.

  14. phil dude
    Thumb Up

    fft's and molecules....

    Well, the half-precision can be used for FFTs, though it appears they have homogeneously clocked all the units a la Intel/AMD (5 DP / 10 SP / 20 HP)? Probably will not harm our MD....

    I am predominantly interested in the latency characteristics, and how many of these you can get in a box before it melts... ;-)

    P.

  15. Cynic_999

    SDR

    This sounds like the perfect solution for a really high-spec software defined radio. Feed it with a few high bandwidth IQ streams and it would be able to find, decode and monitor lots of radio transmissions. I expect the NSA has ordered a few dozen of them for trials.

  16. DiViDeD

    At last! Far Cry 4 with everything turned up!

    I know, limited vision, but with NVLink making SLi look like yesterday's jam, I look forward to the first 3DMark benchmark result from some toad who lists his gaming rig as '25 NVLink'd P100s'. And of course, then we'll ALL have to have the same just to get back on the top of the table.

    I'm saving up and hollowing out my GPU cave as we speak. Will the council mind too much if I divert the Tank Stream to cool the bugger?

  17. MJI Silver badge

    Electric cars and small fastish motorcycles

    All this talk of Tesla and GP100 made me think of these.

    Elon Musk's cars, and a small Suzuki which was the fastest of its capacity. 80 mph from 100 cc was not bad in the late 70s.

  18. E 2

    All that page faulting...

    Is not faster. It is just more convenient.
