I think the most likely consumer device that chip would end up in would be a space heater.
Inside Nvidia's Pascal-powered Tesla P100: What's the big deal?
So there it is: the long-awaited Nvidia Pascal architecture GPU. It's the GP100, and it will debut in the Tesla P100, which is aimed at high-performance computing (think supercomputers simulating the weather and nuke fuel) and deep-learning artificial intelligence systems. The P100, revealed today at Nvidia's GPU Tech …
COMMENTS
Wednesday 6th April 2016 05:14 GMT Brian Miller
Done that...
When I worked as a technician, I used a Celerity computer as a space heater, literally. The workshop was located in a room that used to house many mainframes, and the air conditioning was still turned up. Chilly wind tunnel, that spot. So I used one of the mini supercomputers to give me some warmth at the bench.
However, a couple of these, and you'll be on the bottom of the Top500 list!
This post has been deleted by its author
Wednesday 6th April 2016 14:32 GMT Sir Runcible Spoon
Re: Charles 9
It seems to me that whilst Pascal is quite grunty, it isn't aimed at graphics processing as its primary function.
So it will probably compete with the 980 Ti out of the door, and at a lower price point - but I expect the more consumer-oriented Polaris GPUs might well steal the show (for this generation at least).
Wednesday 6th April 2016 03:37 GMT Anonymous Coward
Great if you've got a suitable problem
As usual, these things are great if you have a problem which is a good fit to what it can do. In the science world, that would be either something which spends all of its time inside BLAS or FFT, or at least has a compact, highly vectorizable AND highly concurrent kernel at its heart.
If, on the other hand, you have a problem with a flat execution profile (so that you'd need to rework all the logic and rewrite 1e5 LOC to take advantage of a GPU), and, god forbid, long dependency chains everywhere (so that you gain no advantage no matter how much you rearrange things), the whole thing becomes a very expensive equivalent of a vintage Celeron.
I know Cray was ridiculed for his "If you were plowing a field, which would you rather use: Two strong oxen or 1024 chickens?", but I think a balanced diet is essential for long-term health - and lately, all we've been having is chicken.
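To make the ox/chicken distinction concrete, here is a toy sketch (my own illustration, using NumPy) of the two problem shapes: an element-wise kernel that a GPU would chew through, versus a serial recurrence where no amount of parallel hardware helps.

```python
import numpy as np

x = np.linspace(0.0, 1.0, 1_000_000)

# Vectorizable: every element is independent -> maps well to a GPU.
y = np.sqrt(x) * 2.0 + 1.0

# Long dependency chain: each step needs the previous result,
# so extra parallel units sit idle.
acc = 0.0
for v in x[:1000]:          # truncated for speed; the full loop is just slower
    acc = acc * 0.999 + v   # serial recurrence
```
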
Wednesday 6th April 2016 05:51 GMT Code For Broke
Deep Learning?
I'm sorry, but I don't believe a word of this Deep Learning b.s.
Any concept whose name immediately begs questions about the inverse of its name just rings painfully in my ears as the output of publish-or-perish academics with absolutely no experience in value creation. It's jargon backed up by explanations of jargon with jargon to help you understand the jargon in the jargon.
Having spent some time with Professor Wikipedia just now, I'm only more confident in my beliefs.
Has anyone a glass of Deep Learning kool-aid for me to drink? Am I wrong? Please enlighten me?
Wednesday 6th April 2016 08:43 GMT Matthew Taylor
Re: Deep Learning?
"Deep" refers to the new algorithms' abilities to learn deep architectures - that is, neural networks with serveral (3 or 4) hidden layers. The multiple hidden layers are what give the NN it's power, but historically, machine learning algorithms have performed poorly on such neural networks.
The key difference in the first deep learning methods was to train the neural networks layer by layer, in an unsupervised manner. This did a lot to "sort out" the data, and made the later back-propagation phase more tractable.
Since then, a variety of different architectures and methods have been discovered, and the phrase "Deep Learning" has broadened somewhat to represent the new resurgence in neural network methods.
Deep learning is absolutely not B.S. - though calling it A.I. is a little misleading. What it is, is a much more powerful data modelling / prediction paradigm, which allows a new class of applications (speech recognition, language translation, etc.).
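For the sceptics: "several hidden layers" is nothing mystical. A toy sketch (my own illustration in NumPy, untrained random weights - training via back-propagation would adjust W and b):

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(z):
    return np.maximum(z, 0.0)

# 8 inputs, four hidden layers of 16 units, 1 output.
sizes = [8, 16, 16, 16, 16, 1]
weights = [rng.standard_normal((m, n)) * 0.1 for m, n in zip(sizes, sizes[1:])]
biases = [np.zeros(n) for n in sizes[1:]]

def forward(x):
    for W, b in zip(weights[:-1], biases[:-1]):
        x = relu(x @ W + b)                  # hidden layer: affine map + nonlinearity
    return x @ weights[-1] + biases[-1]      # linear output layer

out = forward(rng.standard_normal(8))
```

Stacking those affine-plus-nonlinearity steps is the whole "architecture"; depth just means more of them.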
Thursday 7th April 2016 09:33 GMT hattivat
Re: Deep Learning?
It's a way of making the computer figure out through trial and error how to do something that you wouldn't know how to hand-code. For example, how to tell the difference between a pedestrian and a cardboard cutout in pouring rain. Hence "undefined best answer" - because you don't need to know (and for many of these problems, couldn't possibly know, because humans have limitations) what the solution is beforehand.
It's like training a dog to jump through hoops - you don't take a microscope and a scalpel and manually reconnect its neurons, you don't even need to know which parts of its brain control which muscle, you just make it jump until it learns how to do it right by itself. In this example, deep learning would be the method of designing an artificial dog so that it can be taught to jump through hoops.
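The dog-training loop, in miniature (a hypothetical illustration, not any real library's API): a perceptron learns AND from examples - nobody hand-codes the rule, the weights just get nudged after each mistake until the behaviour is right.

```python
import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 0, 0, 1], dtype=float)   # AND truth table: the "trick" to learn

w = np.zeros(2)
b = 0.0
for _ in range(20):                        # a few passes over the examples
    for xi, target in zip(X, y):
        pred = 1.0 if xi @ w + b > 0 else 0.0
        err = target - pred                # "did the dog jump right?"
        w += 0.1 * err * xi                # nudge the weights toward better behaviour
        b += 0.1 * err

preds = [1.0 if xi @ w + b > 0 else 0.0 for xi in X]
```
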
Sunday 10th April 2016 02:12 GMT Code For Broke
Re: Deep Learning?
@hattivat: Now that was something I could sink my braincells into.
I'm not agnostic to the concept of AI. And I appreciate that some really amazing programming is happening right now that allows for astonishing pattern detection. I clearly don't understand it, and I am frustrated that I'm probably just not bright enough to; but I don't deny it.
But I balk at giving these and other concepts names that attempt to anthropomorphize what the process is doing. That is just the arrogant, self-deification of, again, mostly the academic set who come up with pure theory, no practice and sure as hell no meaningful value.
Sunday 10th April 2016 08:48 GMT Matthew Taylor
Re: Deep Learning?
"I balk at giving these and other concepts names that attempt to anthropomorphize what the process is doing."
Given you don't have any knowledge of the subject at all, it's not clear what business you have "balking" at what people in that subject area choose to name things. The anthropomorphic terms have come from the inspiration NN researchers have taken from the brain over the years. Not the other way round, and there's nothing particularly high-falutin' about them once you get to grips with it.
If you refer to a "block of code", might I say "A block is what you build a tower with; stop anthropomorphizing"? We use words metaphorically all the time. You just happen not to know these particular machine learning terms, and for some reason have taken exception to them.
You also mischaracterize the "academic set". Their mindset is that of highly motivated programmers, armed with the specialist knowledge anyone would have after studying something for a long time. To be honest, it just sounds like you have a chip on your shoulder.
Thursday 7th April 2016 02:05 GMT Code For Broke
Re: Deep Learning?
Sorry Matthew, but you've delivered a response that is, again, jammed up with jargon.
Architecture is how the bits of wood or steel are fitted up to make a building.
Neural concerns neurons, which are nerve cells in a living organism.
Hidden layers are found in a lasagna, which is a fine meal to take a mourning friend.
Propagation is what males and females do when bored/anxious/angry to pass the time... oh, and make babies.
None of these concepts pertain to computing. They are the entropic banter of academics, bent on turning a stream of consciousness into a way to pay the mortgage.
Can someone please explain what Deep Learning is without the shadow puppets and microtonal soundtrack?
Wednesday 6th April 2016 15:17 GMT Pascal Monett
Re: Deep Learning?
I don't know if you're wrong, per se, but I viewed a marketing spiel from IBM about its Watson not long ago and I have to admit it scared me somewhat. Was it the capabilities narrated by the casually attractive female voice? Was it the impact of the vision of the future that it generated in my mind? Or was it the idea of a computer capable of "discerning intent"? Probably a bit of everything.
See if you get scared here.
Wednesday 6th April 2016 10:44 GMT jms222
Sorry, but I've worked in automotive and can see no conceivable reason why you'd need something like that in a car.
I still maintain a system that is written in Turbo Pascal and runs in a DOS box under Windows Me - and yes, it's connected to the outside world, not via a firewall, and no, this has never been a problem.
Wednesday 6th April 2016 14:48 GMT hattivat
Simple - autopilot. Lightning-quick recognition of pedestrians, other cars, road boundaries, etc. with stellar reliability regardless of conditions (rain, dust, etc.) is absolutely essential for the development of autonomous cars and a decidedly non-trivial computing task. You need advanced neural networks for that, and nowadays these run on GPUs (CPUs are ~10x slower for this application).
Wednesday 6th April 2016 11:33 GMT schlechtj
Voice recognition
Microsoft operating systems have been doing the kind of voice recognition you're talking about since Vista - and if you had Dragon or something similar, before that. I don't know why Google still makes you send data to its servers, because phones today are perfectly capable of handling that task. It's probably a program to get voiceprints of everyone on earth so they can sell them or leverage you into buying something. Or perhaps the government may want it to identify people in disguise. Hmm... I think I'm going back to typing......
Wednesday 6th April 2016 16:34 GMT phil dude
fft's and molecules....
Well, the half precision can be used for FFTs, though since it appears they have homogeneously clocked all the units a la Intel/AMD? (5 DP / 10 SP / 20 HP)? Probably will not harm our MD....
I am predominantly interested in the latency characteristics, and how many of these you can get in a box before it melts... ;-)
P.
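Rough back-of-envelope on what half precision costs an FFT (my own NumPy sketch - round-tripping the input through fp16 before a double-precision transform, which only models the input quantisation, not a true fp16 pipeline):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(4096)

full = np.fft.fft(x)                                        # double precision
half = np.fft.fft(x.astype(np.float16).astype(np.float64))  # fp16 round-trip

# fp16 carries ~3 decimal digits, so expect a relative error around 1e-4..1e-3
rel_err = np.linalg.norm(full - half) / np.linalg.norm(full)
```

Whether that's tolerable depends entirely on the problem - fine for many signal/ML workloads, fatal for others.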
Wednesday 6th April 2016 21:56 GMT DiViDeD
At last! Far Cry 4 with everything turned up!
I know, limited vision, but with NVLink making SLI look like yesterday's jam, I look forward to the first 3DMark benchmark result from some toad who lists his gaming rig as '25 NVLink'd P100s'. And of course, then we'll ALL have to have the same just to get back to the top of the table.
I'm saving up and hollowing out my GPU cave as we speak. Will the council mind too much if I divert the Tank Stream to cool the bugger?