Doesn't make much sense now.
Developers have spent the last several years creating game engines that work on the Cell. Sony should just supply the same PPE, with more SPEs.
Sony's PlayStation 4 will not use the PS3's Cell processor, it has been claimed, because the current chipset and rumoured next-gen variants are too complex for developers to work with quickly. Industry sources indicated that the Cell, which was spearheaded by the now retired Ken Kutaragi, Sony's one-time PlayStation chief, …
Developers have spent the last several years creating game engines that work on the Cell. Sony should just supply the same PPE, with more SPEs.
That's what they ARE doing. Ignore what some uncited fanboy drivel from Kotaku (the worst offender for reporting rumor as fact) says.
Whilst the PS4 early access developer preview kits do have an AMD GPU at the end of the rendering pipeline (currently), the Cell remains, but with more cores, more local cache, and beefier SPUs, and more of them.
Yes, ignore Kotaku rumours and listen to El Reg AC rumourtards *rolls eyes*
The PS4 early access developer kit, if it exists, is going to be under heavy duty NDA. No one with the kit is going to be posting its details on El Reg.
Using Cell in PS4 would be insane. That doesn't mean it's not possible, since Sony's hardware guys are insane. Still, highly doubtful, especially when they were looking at Larrabee and other non-Cell solutions a few years ago.
Didn't IBM already kill the Cell?
What do the PS3 and 360 have in common?
They aren't using crap archs like x86. Why ruin their superior hardware with it? The Cell is a good thing to have in a gaming rig; by now devs should be already familiar with it, and at the very least they are used to PPC arch by now. Switching to x86 would be more painful than it seems. So yeah, it's possible they're adding AMD Fusion for their graphical stuff, but I doubt it'll take over the main processor.
IBM stopped development of some variants of Cell, and they haven't announced anything new for over 2 years. They did make some that were beefier than what's in the PS3, the PowerXCell 8i I think it was called, but I'm not sure they really went anywhere except a couple of custom supercomputers.
The Sony prediction that Cell would be in all sorts of consumer tech within a few years seems to have gone nowhere.
It's an interesting architecture, but a strange one.
Look at all the tech sites making themselves look like fools off the back of some uncited rumour.
Yes, Sony may indeed take up AMD for the GPU, but as the PS3's (and likely PS4's) heavy lifting work will still be done by the CPU, it's meaningless. The RSX GPU in the PS3 is little more than an end-of-the-chain framebuffer and final image processing filter; the Cell does all the real grunt work.
This is why all the fanboy arguments about GPU fill rates and such are totally meaningless: when you add the RSX and Cell together, it's 10x what the paper specs tell you.
All of the facts in your post are incorrect.
The Cell is often used for final image processing while the RSX is used to draw textured polygons - not vice versa. There's usually a back-and-forth between Cell and RSX, not a simple pipeline. I don't know what you count as "heavy lifting" but the six Cell SPUs combined are only 1.5x as powerful as the RSX shader pipeline on paper, they cannot do texture caching, and they cannot rasterize at anything like RSX speeds. They are great for glow and Naughty Dog even use them for FXAA, but the RSX does most of the actual drawing. The SPUs are also used for things like animation skinning, physics, audio, etc.
GPU fill rate on the RSX is dreadful compared to X360, especially when using alpha. A 30FPS framerate on X360 can easily translate to 20FPS on PS3 due to alpha alone. This is why the SPUs are used for graphics; it helps offload the RSX. Ultimately, a lot of games will have superior X360 versions, simply because it's easier not to do all that SPU work.
The key thing for PS4 is to make this not be an issue any more.
That's crap. Not sure where you copy-and-pasted it from, but it's utterly incorrect.
If you want to see how Naughty Dog use the Cell to do all the hard work, watch the Uncharted 2: Mastering the Cell movie on the game disk, and you will see how you got it back to front, and how games of Uncharted calibre simply aren't possible on the Xbox and its ancient and inefficient ways of doing things.
Explain why we haven't seen anything like Wipeout's 1080p at 60FPS on the Xbox?
Because it doesn't render at *1920*x1080. The Xenos GPU doesn't use the same box of tricks the RSX does in being able to resize horizontally. If you study the PS3 library, you'll note that of all the games that actually render at 1080p, almost none of them actually pull off 1920x1080. And even then, compromises are made, such as reduced graphics assets or smaller render areas (think sports games with limited stadium/court sizes). Wipeout HD actually dynamically downscales the horizontal resolution as you play to ensure 60fps.
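That dynamic downscaling trick is straightforward to sketch. Here's a rough illustration: all the widths, bounds and control logic below are guesses for illustration, not Wipeout HD's actual scheme, which hasn't been published.

```python
# Rough sketch of dynamic horizontal resolution scaling: if the last
# frame ran over the 60fps budget, render the next frame narrower and
# let the hardware scaler stretch it back out to 1920 columns.
# All numbers here are illustrative, not Sony's actual values.

FULL_WIDTH = 1920
MIN_WIDTH = 1280          # assumed lower bound on render width
TARGET_MS = 1000.0 / 60   # 60fps frame budget, ~16.7 ms

def next_render_width(current_width: int, last_frame_ms: float) -> int:
    """Scale the horizontal render resolution toward the frame budget."""
    ratio = TARGET_MS / last_frame_ms
    width = int(current_width * ratio)
    # clamp to sane bounds and keep the width even for the scaler
    return max(MIN_WIDTH, min(FULL_WIDTH, width - width % 2))

# A frame that took 20 ms (i.e. 50fps) forces a narrower next frame:
print(next_render_width(1920, 20.0))   # 1600
# A fast frame lets the width recover toward full 1920:
print(next_render_width(1600, 14.0))   # 1904
```

The point is that the viewer barely notices a few hundred missing columns after horizontal upscaling, whereas a dropped frame at 60fps is very visible.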
So I guess if that's the case, it's going to rule out backwards compatibility right off the bat.
Just give us more than two bloody USB sockets
They can't just have an emulator? If the PS4 is an order of magnitude more powerful, it should be able to do it. It's not like my PC doesn't pee all over any PS3 game in terms of gfx.
I'm no expert with processors, I haven't read up on the subject, but as someone who follows and plays games, my perception has always been that, other than the hype around the possibilities of the Cell processor, the PS3 has never actually benefited from having it, due to the complications of developing for it, games being developed for several platforms and so not taking advantage of it, and the lack of memory.
Considering that, this seems like a good thing. Although as I say, I'm happy to be corrected if I'm wrong.
A few companies have managed to get the best out of it.
Others have had a good go
But there are definitely Cell skills out there
When the system was announced, I reckoned the system could be made to fly and do some really cool stuff, but would take the programmers some time to adjust to the mindset. My expectation was that PS3 games would technically lag Xbox for a year or 2 before taking the lead.
The main issue is that if something is difficult, people generally won't do it. When you add in cross-platform games (Xbox/PS3/PC), you want to generate as little custom coding as possible and keep the app as similar as possible across all 3 platforms. Coding to take advantage of the power of the Cell CPUs runs counter to that approach.
Microsoft were actually fairly clever in a marketing sense by making the Xbox run DirectX - they had a ready-made pool of developers for the platform who already knew how to use and exploit it.
In fact, every PS3 exclusive says hi....
They totally wipe the floor with other console exclusives and multiplatform titles.
Uncharted 3 for example, no load times, pretty much unparalleled visuals.
Only when you have to code for other, lesser consoles does the PS3 suffer.. Which is why multiplatform titles mostly suck on all consoles compared to platform exclusives. The PS3 of course suffers more, because of its esoteric hardware.
Think of the car engine analogy, making a Ferrari engine is great, and it fits nicely into a Ferrari, but when you have to chop 8 cylinders off and shoehorn it into a Morris Marina, it's not as good anymore...
If the report were true (doubtful it is), then it is really just off-the-shelf components. Sounds like Sony could just sell a VM and allow people to play PS4 games on a rig with the same/similar components. If the Fusion was used, what would make the PS4 different from a gaming rig with the same CPU/GPU? Sony could save a lot of money by not selling any consoles as a loss leader; they of course sacrifice their console sales and having an entertainment hub connected to the TV.
"Sony could save a lot of money by not selling any consoles..."
Their console is just about the only thing making them money.
It is not the console, but the licensing they make money on. Every game, every accessory they make money on, even if they didn't actually make it. They were losing hundreds on every console sold when it was first released. Even today, if they are making money on the console, it is very little. If you went with the console only, the R&D hasn't been recouped yet.
Well of course I meant the whole playstation market. There is no market without a console, is there? I was going by the Reg article: http://www.theregister.co.uk/2012/02/02/sony_slashes_full_year_forecast/
It said that the LCD business & Sony-Ericsson were costing the company more than anything, right? Of course it also doesn't make clear how much profit/loss the whole PlayStation market is making.
If they were to use off-the-shelf components found in a typical PC, then all you have is a PC that says Sony PlayStation on it. You might as well just sell a VM that they can run and call it a day; nothing gets subsidized anymore. If they used the top-of-the-line Fusion processor today, in a year or two it would be near the bottom. The lifespan of a console is around a decade. In the beginning PS fans would need to have a good gaming rig; in a few years the average computer would have more than enough power to run it. The point you keep missing is that if they did go with standard components, why do you need a console and a subsidy? Sony has sold around 62 million PS3s, a good number of them heavily subsidized. If they sold a VM for $100, all they'd have is the development cost of creating the VM, which would be about the same as doing their own OS anyway. They'd make substantial savings on hardware R&D and subsidies, though; they'd spend nothing except to pick what hardware is supported.
It'll be a shame if Cell goes. A bloody interesting architecture, and one with far more potential - not yet realised IMHO - than x86. If only Sony had done something more interesting with it than sticking it in a games machine.
Will Cell fans become the Amiga fans de nos jours? I hope not.
>A bloody interesting architecture
Maybe on paper, but I assume you never had to code for it. It's worse than CUDA with a fraction of the performance.
"worse than CUDA with a fraction of the performance"
How do you make that out? Care to mention a bit of hardware that uses CUDA (in a more or less similar price range) and that outperforms a PPU/SPU app? I'm not saying it doesn't exist, but your comment is meaningless without looking at hardware--the programming architectures simply aren't directly comparable. On top of that, you also need to consider what type of application you're running. You can't make a blanket statement like that without considering what problem is being thrown at the system.
While I admit I'm not an expert on CUDA, I would consider myself pretty good at programming on the Cell. It seems to me that it would be a lot more difficult to get anything approaching full utilisation of the GPU in a CUDA architecture in comparison. The main reason is that CUDA seems to be designed around throwing many less powerful compute cores at a problem, and due to Amdahl's law, there's a limit to how effective this can be, depending on the type of problem. CUDA does seem to be an advance over most previous GPGPU attempts, mainly in supporting a richer set of programming primitives. Bitwise operations are the main one I see, but perhaps it also supports branching? I'm fairly ignorant on this, but I know previous GPU shaders didn't support branching, something that seriously restricted their power from a programmer's point of view. Another CUDA downside is the lack of recursive function calls. There are probably more restrictions I'm not aware of.
On the Cell, on the other hand, the SPUs are much more capable computers in their own right, with a full and pretty rich instruction set. It still has a performance penalty for (unhinted) conditional branches, and it's really designed as a vector processor, so there's a performance (under-utilisation) hit there too if you're writing scalar code for it. But the point is that it's a much more advanced core. Add to that that you've got great inter-core communication possibilities (hardware mailboxes and interrupts and very high bandwidth DMA, though not directly to the GPU--you have to use a flip buffer in main memory) and it's quite possible to think of dedicating some SPUs to specific tasks the way you would threads or concurrent processes in other systems. Or of using a hybrid model, with some SPUs loading code dynamically as needed for small, compute-intensive kernels, while others are statically allocated to certain tasks/threads.
The Cell is also much better at keeping the main CPU free from having to act as a master to slave cores, with all the attendant housekeeping that can entail. In a properly designed application, SPUs can basically act asynchronously and can coordinate work amongst themselves, with main CPU overhead kept to a minimum. My guess is that a CUDA system needs to dedicate a fair amount of main CPU grunt to keep the GPU cores singing. And that's power that could be spent implementing other parts of your app that the GPU can't help you with at all.
The upshot of all this is that while a CUDA system could very well beat a PS3 at a narrowly-defined task (eg, password cracking, though not things that involve rainbow tables, since memory/disk bandwidth is the bottleneck there), such applications generally have to be embarrassingly parallel to begin with. So maybe you can write part of your render pipeline by throwing more cores at it in a CUDA system (constrained by your memory/DMA buses), but because of Amdahl's law, there's a limit to how far that will take you. At a certain point, you need to start thinking of apps as distributed programs with complex data interdependencies instead of purely parallel ones with only simple pipelines, and that's where the Cell's architecture really shines, in my opinion.
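The Amdahl's law argument above is easy to put numbers on. If a fraction p of a program parallelises over n cores, the overall speedup is 1/((1-p) + p/n), so the serial fraction caps the gain no matter how many cores you add (the example fractions and core counts below are illustrative only):

```python
# Amdahl's law: speedup from running the parallel fraction p of a
# program on n cores. The serial fraction (1 - p) caps the gain no
# matter how many cores you throw at it.

def amdahl_speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

# A 95%-parallel workload on 6 SPU-like cores vs 128 GPU-like cores:
print(round(amdahl_speedup(0.95, 6), 2))    # 4.8
print(round(amdahl_speedup(0.95, 128), 2))  # 17.41
# Even with infinite cores, the limit is 1 / 0.05 = 20x.
```

Which is the poster's point: past a certain core count, shaving the serial fraction (the "complex data interdependencies") matters far more than adding cores.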
The Cell architecture harkens back to IBM's Power II and early Power III. Its memory model and instruction set were not optimized for today's large-memory-is-cheap environment. IBM totally abandoned this architecture in mainline-Power about 2001.
I don't know if Sony will change to Power 7 or Intel x64, but it will go one of those two directions. IBM doesn't push AMD graphics on Power, but it is there. It is mostly an issue of building a low-cost high-performance, easy-to-develop-for platform. There can be arguments for Power or x64 in all of those situations.
Er, the Cell is very different to anything else, including the entire Power range. Sure, Cell borrowed bits and pieces from the Power ecosystem (a PowerPC core here, 8 Altivecs there), but they were glued together in a totally unique way.
It is almost inconceivable that Sony will change to Power7, at least not in the form it exists in when built in to an IBM mainframe. The physical size alone (we're talking something as big as your hand) would preclude that. Plus the Power7 architecture has some even weirder components; I very much doubt that a games designer will be able to find a use for a decimal maths co-processor.
But you are right to point out the dilemma that Sony are in. I see several pitfalls with going x64; even today you have to have a fairly mighty x64 before you've got as much floating point grunt as the Cell has. That won't come cheap, plus it's tricky to not appear as a fancily dressed PC that isn't running Windows.
They could rely on GPU for the floating point grunt that's needed, and ATI / Nvidia would have you believe that they're the ones for the job. But whilst GPUs undoubtedly have a lot of grunt, again it's tricky not to appear as just a PC.
Developing Cell (16 SPEs? Hooking up to a beefier GPU?) would be a brave step, but it would allow them to preserve the investments that have already been made in software whilst bringing about demonstrable improvements, and they would retain complete control. But if they do, I wish they would let Linux back on it. When the PS3 was launched there was much talk about its computing grunt. Including the GPU doing single-precision floating point, it was apparently topping out at about 2.1 TFLOPS, the Cell accounting for about 200 GFLOPS. Even now you have to try reasonably hard to beat that for the money.
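For what it's worth, the oft-quoted ~200 GFLOPS figure is just the usual theoretical-peak arithmetic, assuming 8 SPEs at 3.2GHz, each counting a 4-wide single-precision fused multiply-add as 8 flops per cycle (real code, as the next post argues, gets nowhere near peak):

```python
# Where the ~200 GFLOPS figure for Cell comes from: theoretical peak
# single-precision throughput, counting a 4-wide SIMD fused
# multiply-add as 8 flops per cycle per SPE.

CLOCK_GHZ = 3.2
SPES = 8                 # full Cell; the PS3 exposes fewer to games
FLOPS_PER_CYCLE = 8      # 4 SIMD lanes x (multiply + add)

peak_gflops = CLOCK_GHZ * SPES * FLOPS_PER_CYCLE
print(peak_gflops)       # 204.8
```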
200 gigaflops is hard to beat for the money? Try a GeForce GT 220 for $70. As for 2.1 TFLOPS, that is a theoretical BS number never achieved. Even in 2008, on the Folding@home science app, a $200 Nvidia 8800GT could do 4x the work of a PS3 (had both going at the time).
And Folding@home was hand-optimized by the Sony folks, so don't say it wasn't optimized for the PS3.
Oops, you got it a bit wrong too :-) The SPEs are not Altivecs - there is one with the PPE core, but it's a different thing. The IBM mainframe is z/Architecture based, and while it shares some components with Power it's essentially CISC, whereas Power is RISC. Decimal maths is one small bit of weirdness on a z chip, where there are nearly 1,000 custom instructions in microcode/hardware. Physical size is incorrect too - the entire MCM complex is big, but an individual core at 45nm would be 25mm or so square. Improve the fab and the size is obviously less... IBM would not drop a whole System p Power7 core in, but like ARM they do mix and match, so a Power7 less the decimal co-processor, I/O channels and other exotic server stuff could do nicely. Just look at the architecture of the PowerPC A2 and BlueGene/Q multicore processors - nothing stopping them producing a custom multicore affair like these for the PS4. Unlike AMD, IBM know how to stuff lots of cores on a chip and make it work (cf. Bulldozer).
This is how big the physical processor is:
Not even close to the size of your hand.
There were test modules that were big:
If the actual chip was the size of your hand, how do they get two of them on a blade? The blade is not all that big and has other components, like RAM, HDD, Ethernet, etc. Here is how big the blade is:
PS703 Express blade: 9.65 in. (245 mm) H x 1.14 in. (29 mm) W x 17.55 in. (445 mm) D; weight: 9.6 lbs (4.35 kg)2
12.75 x 3.86 x 10.8 in.
11.42 x 2.56 x 11.42 in.
The SPEs mostly are Altivecs, with just enough extra to make them independent. Most of the maths instructions are exactly the same. That was the whole point of them. So long as you know what you're doing Altivec/SPEs are pretty good for image/signal processing, and I've had very good mileage out of them for ten+ years now.
As for physical size, the Power7 MCM is large. Sure, an individual core is smaller but then that would not be a "power7", would it. Based on previous form I doubt that IBM will be doing anything for the PS4. They're not really interested in the games or pc market, it's just not worth their while. Freescale might, but they've got their own product releases coming along without worrying about Sony too. Sony have rights to Cell, so they can go it alone. But it really wouldn't be worth their while doing something Power-ish that wasn't based on Cell because they'd lose all the existing software.
As for performance, I've seen Cell get very close to 250GFlops (though not in a PS3). You have to try quite hard to get anything with Intel written on the top near that figure. It's hard to program, but get it right on Cell and your sums are done very quickly indeed. GPUs have a lot of grunt, provided AMD or Nvidia have written you a library function...
Now can we have the Cell architecture for Linux please ?
Meanwhile, AMD has announced that it's offering to produce modified Bobcats including the customer's own IP blocks. http://semiaccurate.com/2012/02/02/amd-opens-up-bobcat-to-3rd-party-ip/ *strokes Anonymous Goatee*
If CELL BE was so 1337, how come nobody but Sony has used it for anything? IBM sold so few of them outside the PS3 that they ceased new development. It was a great idea on paper, but alas, just as it was becoming a reality, the rest of the industry had figured out how to produce quad-plus full cores economically (which were much easier to code for), and GPUs became viable for computation on the other end much faster than the CELL did. The CELL was such a dog for general computing (thus the XMB sluggishness in any game) that it finally moved even Apple to Intel. In the end the marketplace spoke and it was a mercy killing.
Hmm, IBM sold 13,000 of them for Roadrunner, at the time the fastest computer on the planet, and also the first to break the petaflop barrier. Hardly sluggish if used properly - which was and is the problem: SPEs are not easy to code for.
WTF has Cell got to do with Apple moving to Intel? That was due to a lack of Power CPUs with the right power envelope according to Apple - no one was going to put Cell in a Mac.
Errr... Toshiba used 'em for one of their Smart TVs.
Take anything from Intel or AMD and see if it can decode 48 SD streams and display all of them on a 1080P TV.
So if the PowerPC is such a dog, then why does IBM still sell it and continue to produce new models?
Shall we look at a few benchmarks? 64 cores vs 32 cores, all real physical cores.
8 x Intel Xeon X7560.
IBM Power 780 with 32 cores:
IBM Power 780 with 64 cores:
Even half the cores on a Power7 machine can beat the best Intel can offer.
As for only Sony using it.
1) Toshiba has a TV with it.
2) IBM has used it in various products.
3) Leadtek uses it:
4) Fixstars has also used it:
5) Mercury Computer systems:
As a million posters above note CELL is not PowerPC. Power7 is actually a very viable architecture for general computing.
Developers typically don't have to "deal with" architectures at the hardware level unless they are coding in assembler, and I doubt very much games development takes place at that low level. They work with higher level languages and even higher-level frameworks.
The fact that the CELL processor is PowerPC based is the bigger issue I think. Not because it creates any issue for developers but simply because PowerPC is deader than a dead thing that died, was buried, dug up again, shot, drowned, burned and then buried again.
The PS3 was launched in 2006, and so was presumably developed over a period of a number of years before that - a time when PowerPC was on life support courtesy of Apple. Then Apple ditched PowerPC and jumped to the x86 side of the fence.
Does the PowerPC architecture still have any other customers or development channels besides Cell and a handful of other, very limited verticals?
I didn't think so.
PowerPC is very much alive, IBM makes a fortune off of it in the Power line; currently Power7 series. Motorola still sells PowerPC processors and you can find them in many things, like your car. NASA also uses PowerPC on various things.
The Cell is NOT PowerPC based; it does have a PowerPC processor, but that is it. The SPEs are not PowerPC based. The PowerPC portion is essentially a controller for the SPEs, which are the ones doing the heavy lifting, not the PowerPC portion.
You do realize that the Power7 outperforms what Intel offers, right?
Ah, I see now... Cell is not PowerPC based, it just has a PowerPC in it. My mistake.
And thank you for giving me a water-tight defense against tree huggers - now I can tell them that my car isn't petroleum based, it just has a petrol engine IN it, which isn't the same thing at all, apparently.
PowerPC may be "alive", but then so is vinyl. But being "very much alive" doesn't necessarily make any technology the right or obvious choice. See where you said that PowerPCs can be found in my car? Last time I checked, cars were not being cited as a showcase for cutting edge, high performance gaming platforms and technology.
It makes me laugh. The Cell processor has NEVER been fully utilised, we have developers who want to make a game using new methodologies and techniques and, apparently, the Cell can't cope with this. Sony RAVE about the Cell, how it's TEN (yes ten) years ahead of its competitors, then 5/6 years down the line, whip it out in favour of an integrated AMD chipset, allegedly!!
Let me guess, AMD is going to become 2nd source for PowerPC/Cell chips? (yes I know about Motorola and AMCC-sourced PPCs, but they don't produce high-performance PPCs anymore). But probably Sony doesn't plan to retain backward compatibility with PS3 games...
Why would AMD be a second source? AMD doesn't even have their own foundries; a foundry is who would actually produce it. Why would Sony take the design there when they could work with IBM, who helped invent it? Sure, IBM might charge more, but given the history that IBM has in HPC, wouldn't you rather deal with them? The Power7 kills anything that Intel or AMD has to offer, so IBM must be doing something right. How much experience in PowerPC does AMD have?
Sony could have a Fusion processor replace the PowerPC portion of the Cell, and that would make a lot of sense. Since the PS3 pairs the Cell with a GPU anyway, they could utilize the GPU on the Fusion processor and have its CPU act as the controller, like the PowerPC currently does. That sounds more feasible and likely than using Fusion to replace the Cell. There should be substantial power savings as well, with the CPU, GPU and memory controller all on one die. Fewer chips, less power, and it should be lower cost as well.