Didn't sound too bad until: "There's also an AMD-designed Radeon-class GPU that's been tweaked by Microsoft to within an inch of its life."
Microsoft aren't exactly known for their prowess in hardware design; they should've left it to AMD.
Microsoft has revealed details of the chip powering its soon-to-be-released Xbox One – and it's one big ol' mofo. How big? Does a 363mm² footprint – using a 28-nanometer process, no less – filled with five billion transistors impress you? Perhaps Microsoft is planning to use this big boy for Halo: OverReach. By comparison, …
"This is of vital importance."
This is of Zero importance.
Microsoft are well known for very secure hardware compared to the competition. If they are relying on drivers not being available to prevent people downgrading to Linux, then they've already failed...
"Microsoft aren't exactly known for their prowess in hardware design."
The CPU in the Xbox 360 was pretty heavily customised at Microsoft's request. Whether they actually employed the hardware engineers themselves or whether they merely submitted design requests, I don't know. Either way, I'm guessing they did the same with this CPU.
What I do know is that the additions introduced to the Xbox 360 CPU made for a pretty powerful processor. It wasn't perfect, but for the cost and power constraints imposed it was pretty damn good.
>I hear the PS4 won't play PC games either. It's almost like they're completely different devices which happen to include "xbox" in the name.
You've got to give JDX credit for one thing: he is consistently loyal to his employer. A finer Microsoft shill can't be found (except for the numerous ACs who only show up for Microsoft articles).
Nah, there's the German dude and the other one who insists on comparing current market shares with the monopoly era. Though it could be that they're just trolling vs. being actual shills ;)
On the claim by the other AC that MS does the most secure hardware: the 360 was cracked years before the PS3 was. And even then, it was because someone did a boo-boo and stuck a constant in a place where an RNG was required. So it wasn't even the hardware they cracked on the PS3, it was the crypto key itself...
I'm pretty sure they have said the Xbox One OSes reserve 3GB of memory, not 4. I also think the PS4 reserves 3/3.5GB, so pretty much the same.
Personally I don't care about clock speed, transistor count etc and who has the biggest/fastest numbers. I only care about my games playing and looking great, if either of the systems can do that, then I'll be happy.
I doubt it's 50% more powerful, though I do think it has a performance edge. I doubt there'll be any visible difference in cross-platform games for a long time; developers will go with the lowest common denominator. Exclusives will be the ones to look at, maybe?
As it stands, though, the Xbox One has 3GB of RAM reserved for the OS and the PS4 3.5GB.
I don't think it's fair to say boring old DDR3 :P It still has much lower latency than GDDR5, so it still has its place. There's a reason why PCs don't use GDDR5 for their CPUs.
When it comes to the CPU and GPU sharing the memory I have no idea which is better over all for gaming. I'd guess the GPU ought to have higher priority though.
I don't think we can really say what system is better till there's been some real world testing done.
I doubt I'll buy either for a while anyway, not at these prices, but I will get an Xbox one pad and use it on my PC.
I wonder if either chip will be harder to fabricate than the other.
Seems like some idiots are cherry-picking what to believe from Digital Foundry, so that it suits their preferred console.
You can't go off ranting about how Digital Foundry said this and said that when it suits you, and then say they are full of crap when it doesn't...
That's nonsense. It's all about the bandwidth, baby... The PS4's memory has a bandwidth of 170 GB/s with GDDR5, while the X1's DDR3 will have only ~70 GB/s.
And the PS4 OS reserves 3GB only in the debug configuration (which is why the debug units have more memory); in retail, only 1GB of RAM is reserved for the OS, vs the 3GB used by the X1.
The PS4 also has 30% more GPU pipelines. All this together makes the Digital Foundry claims VERY plausible.
>That's nonsense. It's all about the bandwidth, baby... The PS4's memory has a bandwidth of 170 GB/s with GDDR5, while the X1's DDR3 will have only ~70 GB/s.
Nope, the OP is correct: low latency and a 64-bit memory controller (GDDR5 uses 32-bit channels) mean the system/CPU side will be markedly faster - and I mean markedly. Though it is also true that the PS4 will have plenty of spare graphics bandwidth.
It's an academic debate anyway; Sony went with an AMD APU that requires GDDR5.
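For what it's worth, the headline numbers in the exchange above can be sanity-checked with back-of-envelope arithmetic. The memory transfer rates and the 256-bit bus widths below are the widely reported figures for the two consoles, not official spec-sheet values:

```python
def peak_bandwidth_gbs(transfers_per_sec, bus_width_bits):
    """Theoretical peak bandwidth in GB/s: transfer rate x bus width in bytes."""
    return transfers_per_sec * (bus_width_bits / 8) / 1e9

# Xbox One: DDR3-2133 (2133 MT/s) on a 256-bit bus
xb1 = peak_bandwidth_gbs(2133e6, 256)   # ~68.3 GB/s
# PS4: GDDR5 at 5500 MT/s on a 256-bit bus
ps4 = peak_bandwidth_gbs(5500e6, 256)   # ~176 GB/s
print(round(xb1, 1), round(ps4, 1))
```

Which lands close to the "~70 GB/s vs 170+ GB/s" figures being quoted (the Xbox One partly compensates with its on-chip ESRAM).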
"Two sinking ships."
Seems unlikely to be a fail - the Xbox 360 just overtook the Wii in the UK to become the top-selling console of all time, and it has been outselling the PS3 in the USA for years.
Plus this time the Xbox has better games and exclusives than the PS4 - and Kinect 2 - and an HDMI input and the ability to overlay your sat / cable box and act as a DVR. Microsoft are going into this round with several major advantages over Sony.
>the Xbox 360 just over took the Wii in the UK to become top selling console of all time
>The best-selling console of all time in the UK remains the PlayStation 2 with 10 million consoles. The PlayStation 2 is also still the best-selling console worldwide, with 155 million units sold – a figure slightly ahead of the Nintendo DS, which was on 153.87 million as of this March. - http://metro.co.uk/2013/06/27/xbox-360-beats-wii-as-the-uks-best-selling-console-3858990/
That sucking sound was your credibility. Sony sucks and all, and the PS3 after the PS2 was a major embarrassment, but so is forcing me to pay $100 more for that crappy Kinect I will never use. No thanks.
Um... the whole DRM thingy scared away a lot of people, and a good chunk of 'em might never return even if MS did a 180 on that. The price point is another downside, especially if you take into account that Kinect is mostly a gimmicky me-too Wii. They did get Dead Rising 3 to be an X-bone exclusive... too bad for DR3. It's probably the only one I was actually interested in playing.
The XBone might not crash and burn as spectacularly as we expected while they kept their DRM stance, but it will probably be a dud. And it should be, because MS's attitude during E3 was basically "screw the consumer, we're going to abuse you even if you don't like it".
I'm intrigued by the whole 'shared memory' thing because it's nothing new at all. I'm not talking about the setup that PCs have had in recent times where the video memory was carved out of the main system memory, but every time I've seen it mentioned, I've just remembered the Amiga.
For those not familiar with the Amiga's innards (and this is a simplification; the real picture is more complex, but I've forgotten most of the detail), there were essentially two kinds of memory hived off from the total system memory. The first was 'chip' memory, which could be read by all the main chips, and which is where graphics and sound had to be stored. The second was 'fast' memory, which only the CPU could access, meaning that you stuffed application code there where possible, because the CPU could access it faster than it could chip memory. It was also possible to switch some from one to the other (e.g. the later Amigas had a ton of chip memory, but a lot of programs expected that if they saw that much memory, some of it had to be fast memory, and promptly went splut).
So yeah, sharing memory between subsystems at a more unified level is not a new concept, especially when you're talking about memory that both the CPU and the graphics setup can share, essentially allowing the graphics hardware to grab from memory without the CPU being involved... it just reminds me of 1986 or thereabouts...
That was common in the early computing era.
Heck, the ZX Spectrum does something similar. If you put your code in the upper 32K it will run at full speed, but in the lower 16K, where the video memory is, it will get interrupted by the ULA on a regular basis, which, among other things, will totally screw up critical timing loops.
As I found out when I tried to write a Speccy speedloader back in 1988.
All the cheap and cheerful 1980s computers that I've met socially (Tandy, Commodore, etc.) had up to 64k of address space with the video being mapped into part of it. One could PEEK and POKE right onto the display.
There later came some capabilities to swap banks of memory, to switch in RAM in place of 32k of ROM, or to switch in another 64k of RAM (total 128k). Obviously the code had to copy itself over before making the switch.
"Shared memory"? It was the original default assumption.
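That default assumption can be sketched as a toy model: a memory-mapped framebuffer where the "video chip" scans pixels straight out of the same byte array the "CPU" pokes into. The addresses and screen layout here are invented for illustration, not those of any real machine:

```python
RAM = bytearray(64 * 1024)          # a 64K address space, like those 8-bit micros
SCREEN_BASE = 0x4000                # hypothetical framebuffer start address
WIDTH, HEIGHT = 32, 24              # one byte per character cell

def poke(addr, value):
    """CPU write: just an ordinary store into shared RAM."""
    RAM[addr] = value & 0xFF

def video_scanout():
    """The 'video chip' reads the same RAM: no copy, no synchronisation."""
    return [bytes(RAM[SCREEN_BASE + y * WIDTH : SCREEN_BASE + (y + 1) * WIDTH])
            for y in range(HEIGHT)]

poke(SCREEN_BASE, 0xFF)             # CPU sets the first cell...
assert video_scanout()[0][0] == 0xFF  # ...and the next scanout sees it immediately
```

The whole point is that there is nothing between the POKE and the display: the video hardware and the CPU simply address the same bytes.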
Video access to large address ranges of main memory has been around since long before the Amiga. For instance, the Atari 800 and the Commodore 64 - both those had memory-mapped frame buffers which could be set to read from most parts of the RAM.
The Atari 800 custom audio/video chips were IIRC designed by Jay Miner, who went on to design the custom chips in the Amiga. The Amiga had much more CPU memory also addressable by graphics hardware, and added a nifty DMA coprocessor that could do bit-oriented graphics operations over data stored in the 'chip' memory, as well as moving data around to feed the PCM audio channels and floppy controller... but at the core, it was the same kind of architecture, just scaled up.
Things got much more interesting when CPUs got write-back caches; now explicit measures were required to ensure that data written by the CPU was actually in memory instead of just sitting in a dirty cache line at the time the GPU or other bus mastering peripheral went to fetch it. It's all the same cache coherency issues that multiprocessor system architects have been dealing with for years, and in a system like the XBOne, most of the peripherals are more or less peers with the various system CPUs in terms of how they access cached data; in fact, most peripherals look like specialised CPUs, hence the "heterogeneous" part of the HSA. You don't need to explicitly flush CPU caches, or set up areas of memory that aren't write-back cached, in order for the GPU to successfully read data that the CPU just wrote, or vice versa. That's the nifty part.
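The stale-data hazard described above can be modelled in a few lines. This is a toy write-back cache, not any real hardware's coherency protocol; without an explicit flush (or the hardware coherency that HSA provides), the "DMA" read sees old data:

```python
memory = {0x100: 0}                      # "main memory"
cache = {}                               # CPU write-back cache: addr -> (value, dirty)

def cpu_write(addr, value):
    """A CPU store lands in the cache, marked dirty; memory is untouched."""
    cache[addr] = (value, True)

def cache_flush(addr):
    """Write a dirty line back to main memory."""
    if addr in cache and cache[addr][1]:
        memory[addr] = cache[addr][0]
        cache[addr] = (cache[addr][0], False)

def dma_read(addr):
    """A non-coherent peripheral reads main memory, bypassing the CPU cache."""
    return memory[addr]

cpu_write(0x100, 42)
stale = dma_read(0x100)                  # 0: the new value is still in the cache
cache_flush(0x100)
fresh = dma_read(0x100)                  # 42: visible only after the flush
```

In a coherent HSA-style system the `cache_flush` step happens implicitly in hardware, which is exactly the "nifty part" the comment above describes.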
I'm guessing that the XBOne, like the Xbox 360, will have its frame buffers and Z-buffers integrated on the enormous CPU/GPU chip. That will reduce the bandwidth requirements on main memory by a great deal, as GPU rendering and video output will be served by the on-chip RAM. There are other ways to get some of the same effects - the PowerVR mobile device GPUs render the whole scene one small region ('tile') at a time, only keeping a couple of tiles plus the same size of Z-buffer in on-chip RAM, then squirt the finished tile out to main memory in a very efficient way - but it does create other limitations in how the graphics drivers process a 3D scene; any extra CPU work to feed the GPU takes away from power savings given by the simpler, smaller GPU. Tradeoffs abound.
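The tile-at-a-time approach described above can be sketched briefly: render one small region entirely in fast "on-chip" memory, then write the finished tile out to the main-memory framebuffer in one burst. The framebuffer size, tile size, and scene format here are invented for illustration:

```python
FB_W, FB_H, TILE = 8, 8, 4                     # tiny made-up dimensions

def render_tile(x0, y0, scene):
    """Render one TILE x TILE region entirely in 'on-chip' memory."""
    tile = [[0] * TILE for _ in range(TILE)]
    for (px, py, colour) in scene:             # only touch pixels in this tile
        if x0 <= px < x0 + TILE and y0 <= py < y0 + TILE:
            tile[py - y0][px - x0] = colour
    return tile

def render_frame(scene):
    fb = [[0] * FB_W for _ in range(FB_H)]     # main-memory framebuffer
    for y0 in range(0, FB_H, TILE):
        for x0 in range(0, FB_W, TILE):
            tile = render_tile(x0, y0, scene)  # all the rendering traffic stays on-chip
            for dy in range(TILE):             # one efficient write-out per tile row
                fb[y0 + dy][x0:x0 + TILE] = tile[dy]
    return fb

fb = render_frame([(1, 1, 7), (6, 5, 3)])      # scene: (x, y, colour) points
```

The trade-off mentioned above shows up in `render_frame`: the driver (or here, the loop) has to bin the scene per tile, which is extra CPU work in exchange for keeping the bandwidth-heavy rendering in on-chip RAM.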
AMD claimed it would, but irrespective of hardware you've got the OS issue. Apple Macs run on x86 CPUs, yet it's only been in the past couple of years that we've seen any volume of games released for them - and even that can be attributed to Valve wanting to move away from Windows as opposed to wanting to move to OS X.
"They may have similar hardware, but they are running completely different O/S"
Sony run a Linux-like OS. Microsoft run a modified Windows 8 kernel and hypervisor.
If we go by benchmarks of Windows 8 versus the latest Ubuntu, Microsoft will have a performance advantage in the OS side both for large file transfers and for graphics.
>Sony run a Linux-like OS. Microsoft run a modified Windows 8 kernel and hypervisor.
>If we go by benchmarks of Windows 8 versus the latest Ubuntu, Microsoft will have a performance advantage in the OS side both for large file transfers and for graphics.
More demented propaganda from RICHTO! Must we?
It sounds exactly the same. My contacts at Flextronics Doumen tell me that end-of-line yields for the XBone have only just crept into double digits.
In other words don't expect one this year, and if you do, expect it to be DOA or soon afterwards.
Sounds all too familiar...
re AC 11:45...
As you posted AC, I feel it is quite likely that you have never been within 100 miles of either facility. And, since you saw fit to hide your identity, even if you have been to one or both, you can't prove it.
Now, as to the yields... I have no idea. _I_ haven't been to either facility. However, I suspect, based on people I know who have been to TSMC, that you are exaggerating somewhat. Yes, the yields have not been stellar. No, they're not as bad as stated. There _will_ be dead Xboxes. Lots of them. I doubt that there will be nearly the number that you suggest, though.
Now, if you would provide some actual support for your position, something a little better than "I know 'cause I know", perhaps there might be a re-evaluation. As is, though...
Personally I think that the chip companies should leverage the extremely useful heat-producing capabilities of their billions of switching transistors.
I mean, who wouldn't want a house centrally heated by their computer? Picture it - SWMBO puts the thermostat up, AGAIN, and you get the option to model the microclimate in your back garden and sell the data to the Met Office, or perform a simulated nuclear test on the neighbour's cat. I might actually consider spending £2500 on a boiler if it came with an Intel Inside sticker and an HDMI port and could run Crysis at 42fps.
If most of it is cache memory, probably not that much more.
The transistors in microchips only use significant power when they are switching, so if they switch only rarely (in digital terms), e.g. as in memory, they don't generate much heat.
That's why memory sticks, which easily pack billions of transistors, rarely require extra cooling.
>>2 billion transistors are set aside to power the NSA compressed audio, video and data streams - direct from your house. Nowhere to hide, gamers
Or how to get the public to pay for a massive planetary wide distributed computer for the NSA. No wonder MS wanted it connected to the internet at all times.
You guys do realize the NSA has been doing this for years right?
Every thread about anything has had a mention of the NSA since PRISM was leaked. You realize the Army shoots people, and that what a politician says is not always true, right?
If the NSA/CIA/Mutant Lizard people want a distributed computer system they will just buy one. When the EFF were fighting to show that DES (the US government-approved cipher) was insecure, it was the intelligence community saying it was fine.
The EFF then built custom ASICs for $250k, and the DESCHALL project did it with a distributed net of home machines. Do you think that was news to the spooks? They probably had whole DCs full of kit to break DES; similar things were seen with the Clipper chip.
Why would they need to risk being found out by hijacking machines that they do not control (would you want Bunnie Huang to find your NSA back door)? They can either get them built themselves (ASICs would be far more useful than standard CPUs) or create a botnet on the millions of US-government-owned PCs.
You tell them the facts and they'll just downvote you.
It's highly entertaining if you're in the know about the intelligence community and/or cryptographic research. Same with the armchair CEOs, armchair engineers, and armchair warriors running around here.
Meanwhile, the 28nm GTX Titan apparently contains about 7 billion transistors, and Nvidia's GTX 680 about 3.5 billion (the GTX 680 on a chip slightly under 300 square mm).
The Xbox One GPU is far less powerful, but it will still eat up a significant amount of space. Then there is the 8-core CPU, not to mention the cache that takes up half the transistor budget... For me, it looks like the NSA had to settle for a few hundred million transistors at the most.
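Out of curiosity, the transistor counts quoted above work out to broadly similar densities on 28nm. A quick sketch (the 363mm² figure is from the article; the ~294mm² die size for the GTX 680's GK104 is the commonly reported figure, assumed here):

```python
def density_m_per_mm2(transistors, die_mm2):
    """Millions of transistors per square millimetre of die."""
    return transistors / die_mm2 / 1e6

xbox_one = density_m_per_mm2(5.0e9, 363)   # ~13.8 M transistors/mm^2
gtx_680  = density_m_per_mm2(3.5e9, 294)   # ~11.9 M transistors/mm^2
```

The slightly higher Xbox One density is consistent with so much of its budget being dense SRAM rather than logic.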
I would be impressed if this were for some sort of life-saving system that will change humanity for the better - but nope, it is so that young kids can play role-playing games, shooting the crap out of each other in a virtual world while in the real world they do nothing that could be classed as useful... Thank you Microsoft for providing the virtual opium to destroy the digital youth of today :(
The Xbox One is meant to drive a Full HD monitor, right? Nvidia's behemoth gfx cards, pushing 3 billion transistors, are meant to push larger/combo displays past 1080p. Which means the Xbox will have the power to turn on all the eye-candy on such a small display, by comparison. Or can it drive a DisplayPort screen beyond 1080p? Specs? Of course the games will look great and 60fps smooth, since I bet it won't be designed to play at more than 1080p, which is the norm these days - it's the resolution most people will have a monitor for anyway.
Judging by the photo of the chip it looks like it may have some LEDs on it. Well I hope so anyway, because I've just realised that despite silicon chips doing so much for us... they look really boring. So more flashing LEDs on my chips, please.
(Please don't burst my bubble and tell me they aren't LEDs.)
This seems to me like an effort to foil the mod-chippers.
Putting it all on a SoC with custom silicon could make it pretty much unhackable.
I'm sure a lot of managers at Microsoft would love it to be a black box filled with epoxy and only ethernet in one end and HDMI out the other, with a couple of antennas inside for controllers etc.
If it weren't for the small issues of cooling, and those pesky soldiers in their disconnected army bases kicking up a fuss about always-on connectivity, they'd probably have done that already!
It does make me chuckle to see what looks like fanboy downvoting in action. If anyone dares to post (obvious trolls and shills excepted) anything positive or supportive of the XBone or the technology it uses, it gets downvoted.
Are the Sony supporters really that insecure?
For the record I don't really care either way as both consoles are likely to serve their purpose and neither company can be trusted.
Yeah, there are some interesting differences that have sparked debate and discussion, and it will be interesting to see how it all works out in the end once we can see stuff actually running on the released hardware.
Biting the hand that feeds IT © 1998–2019