Microsoft Xbox One to be powered by ginormous system-on-chip

Microsoft has revealed details of the chip powering its soon-to-be-released Xbox One – and it's one big ol' mofo. How big? Does a 363mm2 footprint – using a 28-nanometer process, no less – filled with five billion transistors impress you? Perhaps Microsoft is planning to use this big boy for Halo: OverReach. By comparison, …

COMMENTS

This topic is closed for new posts.
  1. Fibbles

Didn't sound too bad until: "There's also an AMD-designed Radeon-class GPU that's been tweaked by Microsoft to within an inch of its life."

Microsoft aren't exactly known for their prowess in hardware design; they should've left it to AMD.

    1. Anonymous Coward
      Anonymous Coward

      My experiences* are contrary, with MS being, very ironically, a lot more solid on the hardware front than their software.

*My experiences are limited to mice, keyboards and such PC peripherals - never even seen an X-box in real life.

    2. Flocke Kroes Silver badge

      All hardware tweaked until open source drivers cannot use it

      This is of vital importance. Just think of the number of users who would instantly drop dead from viral infection if they had the option to install software not selected by Microsoft.

      1. Anonymous Coward
        Anonymous Coward

        Re: All hardware tweaked until open source drivers cannot use it

        "This is of vital importance."

        This is of Zero importance.

Microsoft are well known for very secure hardware compared to the competition. If they are relying on drivers not being available to prevent people downgrading to Linux, then they have already failed...

        1. Anonymous Coward
          Anonymous Coward

          Re: All hardware tweaked until open source drivers cannot use it

"Downgrade" from an M$ games console OS to Linux? Really, RICHTO? Have you mistaken brain-ajax for coke again?

    3. Def Silver badge

      "Microsoft aren't exactly known for their prowess in hardware design."

      The CPU in the Xbox 360 was pretty heavily customised at Microsoft's request. Whether they actually employed the hardware engineers themselves or whether they merely submitted design requests, I don't know. Either way, I'm guessing they did the same with this CPU.

What I do know is that the additions introduced to the Xbox 360 CPU made for a pretty powerful processor. It wasn't perfect, but for the cost and power constraints imposed it was pretty damn good.

    4. g e
      Holmes

      " tweaked by Microsoft to within an inch of its life"

      Hope the heatsink stays on this time.

      1. Paul Crawford Silver badge
        Coat

        Re: " tweaked by Microsoft to within an inch of its life"

        Why did I read that as "twerked by Microsoft to within an inch of her life"?

        Mine is the dirty mac(OS) ->

    5. Anonymous Coward
      Anonymous Coward

      Locks out all your old games.........

      1. JDX Gold badge

        I hear the PS4 won't play PC games either. It's almost like they're completely different devices which happen to include "xbox" in the name.

        1. Pascal Monett Silver badge
          Coat

          PS4 has "xbox" in its name ?

          I hadn't noticed.

        2. Anonymous Coward
          Megaphone

          "I hear the PS4 won't play PC games either"

          Don't be so sure...

          http://en.wikipedia.org/wiki/Gaikai

        3. asdf Silver badge
          Trollface

          >I hear the PS4 won't play PC games either. It's almost like they're completely different devices which happen to include "xbox" in the name.

You've got to give JDX credit for one thing: he is consistently loyal to his employer. A finer Microsoft shill can't be found (except for the numerous ACs who only show up for Microsoft articles).

          1. Daniel B.
            Boffin

            @asdf

            Nah, there's the German dude and the other one who insists on comparing current market shares with the monopoly era. Though it could be that they're just trolling vs. being actual shills ;)

On the claim by the other AC that MS does the most secure hardware: the 360 was cracked years before the PS3 was. And even then, the PS3 crack happened because someone did a boo-boo and stuck a constant where an RNG was required. So it wasn't even the hardware that was cracked on the PS3, it was the crypto key itself...

  2. AndricD

    pretty interesting. wonder how different the ps4 chip is?

    1. Anonymous Coward
      Anonymous Coward

      50 percent more powerful

      According to digital foundry.

PS4: 30 percent more GPU pipelines, much faster memory (GDDR5 rather than boring old DDR3 in the XBone), and the Xbox OS uses 4GB of the memory thanks to its 3-OS hypervisor.

      1. IRBonReg

        Re: 50 percent more powerful

I'm pretty sure they have said the Xbox One OSes reserve 3GB of memory and not 4. I also think the PS4 reserves 3/3.5GB, so pretty much the same.

Personally I don't care about clock speed, transistor count etc. and who has the biggest/fastest numbers. I only care about my games playing and looking great; if either of the systems can do that, then I'll be happy.

      2. Alistair MacRae

        Re: 50 percent more powerful

I doubt it's 50% more powerful, though I do think it has a performance edge. I doubt there'll be any visible difference in cross-platform games for a long time; developers will go with the lowest common denominator. Exclusives will be the thing to look at, maybe?

        As it stands though Xbox One has 3GB of RAM reserved for OS and the PS4 3.5GB

I don't think it's fair to say boring old DDR3 :P It still has much lower latency than GDDR5, so it still has its place. There's a reason why PCs don't use GDDR5 for CPU memory.

        When it comes to the CPU and GPU sharing the memory I have no idea which is better over all for gaming. I'd guess the GPU ought to have higher priority though.

        I don't think we can really say what system is better till there's been some real world testing done.

        I doubt I'll buy either for a while anyway, not at these prices, but I will get an Xbox one pad and use it on my PC.

        I wonder if either chip will be harder to fabricate than the other.

        1. Anonymous Coward
          Anonymous Coward

          Re: 50 percent more powerful

Seems like some idiots are cherrypicking what to believe from Digital Foundry, so that it suits their preferred console.

You can't go off ranting about how Digital Foundry said this and said that when it suits you, and then say they are full of crap when it doesn't...

          http://www.digitalspy.co.uk/gaming/news/a483743/ps4-has-50-percent-more-raw-power-in-graphics-than-xbox-one-says-report.html

      3. Anonymous Coward
        Anonymous Coward

        Re: 50 percent more powerful

        "much faster memory (gddr5 rather than boring old ddr3 in the xbone)"

        GDDR5 is SLOWER memory (much higher latency) - and it is actually modified DDR3. It supports high bandwidth transfers for graphics, but is poor for CPUs....

        1. Anonymous Coward
          Anonymous Coward

          Re: 50 percent more powerful

That's nonsense. It's all about the bandwidth, baby. The PS4's memory has a bandwidth of 170 GB/s with GDDR5, while the X1's DDR3 will have only ~70 GB/s.

And the PS4 OS uses 3GB in the debug configuration (which is why the debug units have more memory); in retail, only 1GB of RAM is reserved for the OS, vs the 3GB used by the X1.

          The PS4 also has 30% more GPU pipelines. All this together makes the Digital Foundry claims VERY plausible.
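The bandwidth figures being traded back and forth are easy to sanity-check: theoretical peak bandwidth is just bus width times transfer rate. A quick sketch in Python, using the commonly reported pre-launch specs (256-bit buses, GDDR5 at 5.5 GT/s, DDR3-2133) — these are press-reported numbers, not confirmed retail silicon:

```python
# Theoretical peak memory bandwidth = (bus width in bytes) x (transfers per second).
# The bus widths and transfer rates below are the commonly reported pre-launch
# figures for both consoles; treat them as assumptions, not confirmed specs.

def peak_bandwidth_gb_s(bus_width_bits, transfer_rate_gt_s):
    """Peak bandwidth in GB/s, given bus width (bits) and transfer rate (GT/s)."""
    return (bus_width_bits / 8) * transfer_rate_gt_s

ps4_gddr5 = peak_bandwidth_gb_s(256, 5.5)     # 256-bit bus, GDDR5 at 5.5 GT/s
xb1_ddr3 = peak_bandwidth_gb_s(256, 2.133)    # 256-bit bus, DDR3-2133

print(f"PS4 GDDR5: {ps4_gddr5:.1f} GB/s")     # -> 176.0 GB/s
print(f"XB1 DDR3:  {xb1_ddr3:.1f} GB/s")      # -> 68.3 GB/s
```

Which puts the thread's "170 vs ~70 GB/s" claim in the right ballpark — though note this says nothing about latency, which is the other side of the argument.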

          1. Anonymous Coward
            Anonymous Coward

            Re: 50 percent more powerful

            >That's nonsense. it's all about the bandwidth baby.. PS4's memory has a bandwith of 170 GB/s with GDDR5, while X1 DDR3 will have only ~70 GB/s

            Nope OP is correct, low latency, 64bit memory controller (GDDR5 uses 32bit) means system/cpu will be markedly faster - and I mean markedly. Though it is also true that the PS4 will have plenty of spare graphics bandwidth.

It's an academic debate anyway; Sony went with an AMD APU which requires GDDR5.

          2. MattEvansC3

            Re: 50 percent more powerful

Not when you see Digital Foundry's test conditions; it equated to two top-spec PCs running different GPUs, and that's it.

  3. Rukario
    Pirate

    XBone and Itanic. What a comparison. Two sinking ships.

    1. RetroTom

      and mentions of Jaguar...

didn't Atari once name a system after those big cats?

      good omen that one!

      1. Eddie Knopfler

        Re: and mentions of Jaguar...

Yes, it was a massive 64-bit black box with release titles that looked no more impressive than games from the previous generation of consoles.

        Good job Sony and MS won't be doing that... Oh, wait...

    2. Anonymous Coward
      Anonymous Coward

      "Two sinking ships."

Seems unlikely to be a fail - the Xbox 360 just overtook the Wii in the UK to become the top-selling console of all time, and it has been outselling the PS3 in the USA for years.

      Plus this time the Xbox has better games and exclusives than the PS4 - and Kinect 2 - and an HDMI input and the ability to overlay your sat / cable box and act as a DVR. Microsoft are going into this round with several major advantages over Sony.

      1. MJI Silver badge

        Ermm

        Why do I want to overlay my Freesat PVR output?

        No it doesn't record TV.

        And which games are better? Do they have Naughty Dog?

        And what use is the camera when I want to slob out on the settee?

      2. asdf Silver badge
        FAIL

        >the Xbox 360 just over took the Wii in the UK to become top selling console of all time

        >The best-selling console of all time in the UK remains the PlayStation 2 with 10 million consoles. The PlayStation 2 is also still the best-selling console worldwide, with 155 million units sold – a figure slightly ahead of the Nintendo DS, which was on 153.87 million as of this March. - http://metro.co.uk/2013/06/27/xbox-360-beats-wii-as-the-uks-best-selling-console-3858990/

That sucking sound was your credibility. Sony sucks and all, and the PS3 after the PS2 was a major embarrassment, but so is forcing me to pay $100 more for that crappy Kinect I will never use. No thanks.

      3. Daniel B.

        @AC

Um... the whole DRM thingy scared away a lot of people, and a good chunk of 'em might never return even if MS did a 180 on that. The price point is another downside, especially if you take into account that Kinect is mostly a gimmicky me-too Wii. They did get Dead Rising 3 as an X-bone exclusive... too bad for DR3. It's probably the only one I was actually interested in playing.

The XBone might not crash and burn as spectacularly as we expected while they kept their DRM stance, but it will probably be a dud. And it should be, because MS's attitude during E3 was basically "screw the consumer, we're going to abuse you even if you don't like it".

  4. Pete Spicer
    Boffin

    I'm intrigued by the whole 'shared memory' thing because it's nothing new at all. I'm not talking about the setup that PCs have had in recent times where the video memory was carved out of the main system memory, but every time I've seen it mentioned, I've just remembered the Amiga.

For those not familiar with the Amiga's innards (and this is a simplification; the real picture is more complex, but I've forgotten most of the detail), there were essentially two kinds of memory hived out of the total system memory. The first was 'chip' memory, which could be read by all the main chips, and which is where graphics and sound had to be stored. The second was 'fast' memory, which only the CPU could access, meaning that you stuffed application code there where possible, because the CPU could access it faster than chip memory. It was also possible to switch some from one to the other (e.g. the later Amigas had a ton of chip memory, but a lot of programs expected that if they saw that much memory, some of it had to be fast memory, and promptly went splut).

So yeah, sharing memory between subsystems on a more unified level is not a new concept, especially when you're talking about memory that both the CPU and the graphics setup can share, essentially allowing the graphics hardware to grab from memory without the CPU being involved... it just reminds me of 1986 or thereabouts...

    1. LaeMing Silver badge
      Boffin

Reminds me of the 8-bit era, when the CPU and video would often access the same memory on high/low memory clocks respectively (RAM could actually be faster than processors in them days, by a factor of two).

    2. spodula

      That was common in the early computing era.

Heck, the ZX Spectrum does something similar. If you put your code in the upper 32K it will run at full speed, but in the lower 16K, where the video memory is, it will get interrupted by the ULA on a regular basis, which, among other things, will totally screw up critical timing loops.

As I found out when I tried to write a Speccy speedloader back in 1988.

    3. JeffyPoooh Silver badge
      Pint

      "shared memory"

      All the cheap and cheerful 1980s computers that I've met socially (Tandy, Commodore, etc.) had up to 64k of address space with the video being mapped into part of it. One could PEEK and POKE right onto the display.

      There later came some capabilities to swap banks of memory, to switch in RAM in place of 32k of ROM, or to switch in another 64k of RAM (total 128k). Obviously the code had to copy itself over before making the switch.

      "Shared memory"? It was the original default assumption.
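The PEEK/POKE model above is simple enough to mock up: one flat address space, with a region the video hardware scans directly, so a CPU write is instantly "on screen". A toy sketch in Python — the base address and column count are illustrative, not any particular machine's memory map:

```python
# Toy model of the 8-bit "shared memory" scheme: one 64K address space,
# with a video region that the display hardware reads directly.
# VIDEO_BASE and COLS are illustrative values, not a real machine's map.

RAM = bytearray(64 * 1024)     # the whole 64K address space
VIDEO_BASE = 0x0400            # pretend the text frame buffer starts here
COLS = 40                      # a 40-column text display

def poke(addr, value):
    RAM[addr] = value & 0xFF   # CPU writes a byte into the shared RAM

def peek(addr):
    return RAM[addr]           # CPU reads a byte back

def video_scanline(row):
    """What the 'video chip' emits for one text row: it just reads RAM."""
    start = VIDEO_BASE + row * COLS
    return bytes(RAM[start:start + COLS])

# POKE a character code straight "onto the display":
poke(VIDEO_BASE, 65)           # 65 = 'A'
assert video_scanline(0)[0] == 65
```

No copying, no driver: the display sees the write because CPU and video share the same bytes — which is exactly the default these commenters remember.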

      1. Anonymous Coward
        Anonymous Coward

        Re: "shared memory"

PCs with MS-DOS too. 0xA000 was the base segment if I recall correctly.

    4. Jason Ozolins
      Windows

      Memory-mapped frame buffers, old as the hills

      Video access to large address ranges of main memory has been around since long before the Amiga. For instance, the Atari 800 and the Commodore 64 - both those had memory-mapped frame buffers which could be set to read from most parts of the RAM.

      The Atari 800 custom audio/video chips were IIRC designed by Jay Miner, who went on to design the custom chips in the Amiga. The Amiga had much more CPU memory also addressable by graphics hardware, and added a nifty DMA coprocessor that could do bit-oriented graphics operations over data stored in the 'chip' memory, as well as moving data around to feed the PCM audio channels and floppy controller... but at the core, it was the same kind of architecture, just scaled up.

      Things got much more interesting when CPUs got write-back caches; now explicit measures were required to ensure that data written by the CPU was actually in memory instead of just sitting in a dirty cache line at the time the GPU or other bus mastering peripheral went to fetch it. It's all the same cache coherency issues that multiprocessor system architects have been dealing with for years, and in a system like the XBOne, most of the peripherals are more or less peers with the various system CPUs in terms of how they access cached data; in fact, most peripherals look like specialised CPUs, hence the "heterogeneous" part of the HSA. You don't need to explicitly flush CPU caches, or set up areas of memory that aren't write-back cached, in order for the GPU to successfully read data that the CPU just wrote, or vice versa. That's the nifty part.

      I'm guessing that the XBOne, like the Xbox 360, will have its frame buffers and Z-buffers integrated on the enormous CPU/GPU chip. That will reduce the bandwidth requirements on main memory by a great deal, as GPU rendering and video output will be served by the on-chip RAM. There are other ways to get some of the same effects - the PowerVR mobile device GPUs render the whole scene one small region ('tile') at a time, only keeping a couple of tiles plus the same size of Z-buffer in on-chip RAM, then squirt the finished tile out to main memory in a very efficient way - but it does create other limitations in how the graphics drivers process a 3D scene; any extra CPU work to feed the GPU takes away from power savings given by the simpler, smaller GPU. Tradeoffs abound.
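The write-back cache hazard described above can be mocked up in a few lines: the CPU writes through a cache, but main memory only sees the data after an explicit flush, so a DMA device reading memory directly gets stale bytes. A deliberately simplified sketch (real caches work on lines and have eviction; this toy just tracks dirty writes):

```python
# Toy illustration of the write-back cache hazard: without hardware
# coherency (the "nifty part" of HSA), a bus-mastering device reading
# main memory directly can fetch stale data until the CPU flushes.

class WriteBackCache:
    def __init__(self, memory):
        self.memory = memory
        self.dirty = {}              # addr -> value not yet written back

    def write(self, addr, value):
        self.dirty[addr] = value     # sits in the cache, memory unchanged

    def flush(self):
        for addr, value in self.dirty.items():
            self.memory[addr] = value
        self.dirty.clear()

memory = {0x100: 0}                  # "main memory"
cache = WriteBackCache(memory)

cache.write(0x100, 42)               # CPU writes via its write-back cache
stale = memory[0x100]                # DMA device reads memory now -> stale 0
cache.flush()                        # explicit flush pushes the dirty line out
fresh = memory[0x100]                # now the device would see 42

assert (stale, fresh) == (0, 42)
```

A coherent system effectively does that flush (or a cache snoop) for you, which is why the GPU can consume CPU-written data without explicit driver gymnastics.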

    5. asdf Silver badge
      Trollface

      the good ole days

      Ah the Amiga the greatest product ever except for that whole failure in the market place thing.

  5. RAMChYLD

    Chip made in Malaysia...

    Console will not be available in Malaysia until 5 years after Singapore gets it, and even then there will be no Zune, no XBox Live Gold and 99% of the games won't make it over.

    Seriously, Microsoft! What the hell!?!

  6. Anonymous Coward
    Windows

    Microsoft 363mm2 28-nanometer chip ..

How did TSMC steal Microsoft's innovation in LSA chip fabrication?

Did anyone tell Intel that TSMC is Microsoft's new best buddy?

  7. kain preacher Silver badge

Wait, if both systems are using AMD CPUs, will that mean there will not be differences in game play between the two? Also, will this mean that PC games will not suffer either?

    1. LaeMing Silver badge
      Happy

      Yes,

      But only if you are a perfect sphere moving in a complete vacuum.

    2. MattEvansC3

AMD claimed it would, but irrespective of hardware you've got the OS issue. Apple Macs run on x86 CPUs, yet it's only been in the past couple of years that we've seen any volume of games released for them, and even that can be attributed to Valve wanting to move away from Windows as opposed to wanting to move to OSX.

    3. Oneman2Many

      They may have similar hardware, but they are running completely different O/S

      1. Anonymous Coward
        Anonymous Coward

        "They may have similar hardware, but they are running completely different O/S"

        Sony run a Linux like OS. Microsoft run a modified Windows 8 kernel and hypervisor.

        If we go by benchmarks of Windows 8 versus the latest Ubuntu, Microsoft will have a performance advantage in the OS side both for large file transfers and for graphics.

        1. Anonymous Coward
          Anonymous Coward

          >Sony run a Linux like OS. Microsoft run a modified Windows 8 kernel and hypervisor.

          >If we go by benchmarks of Windows 8 versus the latest Ubuntu, Microsoft will have a performance advantage in the OS side both for large file transfers and for graphics.

          More demented propaganda from RICHTO! Must we?

  8. Anonymous Coward
    Anonymous Coward

    Q: "Why does a game console need billions more transistors than an Itanium?"

    A: Because a game console is vastly more useful than an Itanic.

  9. Mikel

    Unsat

    I wonder how much Microsoft paid for this.

  10. Anonymous Coward
    Anonymous Coward

    tweaked by Microsoft to within an inch of its life

    Red Ring Of Death springs to mind.

    1. Anonymous Coward
      Anonymous Coward

      Re: tweaked by Microsoft to within an inch of its life

It sounds exactly the same. My contacts at Flex Doumen tell me that end-of-line yields for the XBone have only just crept into double digits.

In other words, don't expect one this year, and if you do, expect it to be DOA or to die soon afterwards.

Sounds all too familiar...

      http://venturebeat.com/2008/09/05/xbox-360-defects-an-inside-history-of-microsofts-video-game-console-woes/2/

      1. lansalot
        Thumb Up

Re: tweaked by Microsoft to within an inch of its life

ah yes... "your contacts"... Please, feel free to share more fantastically "genuine" nuggets like this. We're all desperate for information, and your unsubstantiated claptrap fits the bill nicely!

        1. Anonymous Coward
          Anonymous Coward

Re: tweaked by Microsoft to within an inch of its life

Believe what you want, I don't care. However, people have been to both the TSMC and Flex facilities; can you really be sure that I haven't? Thought not...

          1. Atonnis
            FAIL

Re: tweaked by Microsoft to within an inch of its life

            I'll take the odds on that one, bozo.

          2. James O'Shea Silver badge

Re: tweaked by Microsoft to within an inch of its life

            re AC 11:45...

            As you posted AC, I feel that it is quite likely that you have never been within 100 miles of either facility. And, due to the fact that you saw fit to hide your identity, that even if you have been to one or both, you can't prove it.

            Now, as to the yields... I have no idea. _I_ haven't been to either facility. However, I suspect, based on people I know who have been to TSMC, that you are exaggerating somewhat. Yes, the yields have not been stellar. No, they're not as bad as stated. There _will_ be dead Xboxes. Lots of them. I doubt that there will be nearly the number that you suggest, though.

            Now, if you would provide some actual support for your position, something a little better than "I know 'cause I know", perhaps there might be a re-evaluation. As is, though...

  11. mark l 2 Silver badge

I wonder how much cooling it's going to require with 5 billion transistors?

Is it going to need an air conditioning unit installed under your TV, and double as an extra heater in winter?

    1. Brenda McViking

      Burn baby burn

Personally I think that the chip companies should leverage the extremely useful heat-producing capabilities of all those billions of switching transistors.

I mean, who wouldn't want a house centrally heated by their computer? Picture it - SWMBO puts the thermostat up, AGAIN, and you get the option to model the microclimate in your back garden and sell the data to the Met Office, or perform a simulated nuclear test on the neighbour's cat. I might actually consider spending £2500 on a boiler if it came with an Intel Inside sticker and an HDMI port and could run Crysis at 42fps.

      1. Colin Ritchie
        Windows

        Re: Burn baby burn

In 2005 I was playing WoW on a G5 iMac; the IBM CPU would hit 90 degrees and keep the room lovely and warm on a long winter's night in Molten Core. The twin fans sounded like a remote control aircraft flying round the room, and my nickname on comms was Squadron Leader...

    2. spodula

      cooling

      If most of it is cache memory, probably not that much more.

The transistors in microchips only use significant power when they are switching, so if they switch only rarely (in digital terms), e.g. as in memory, they don't generate much heat.

That's why memory sticks, which are easily pushing 1 million transistors, rarely require extra cooling.

      1. spodula

        Re: cooling

"That's why memory sticks, which are easily pushing 1 million transistors, rarely require extra cooling."

I mean billion, of course. Sorry.
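The switching-power point above is the standard CMOS dynamic-power relation, P = α · C · V² · f, where α is the activity factor (the fraction of the switched capacitance toggling per cycle). A small sketch with made-up, order-of-magnitude inputs, purely to show why mostly-idle cache heats far less than busy logic at the same clock and voltage:

```python
# Dynamic (switching) power of CMOS logic: P = alpha * C * V^2 * f.
# All numbers below are invented order-of-magnitude inputs, not Xbox One
# figures; the point is only the ratio between busy logic and idle cache.

def dynamic_power_watts(alpha, capacitance_farads, volts, freq_hz):
    """Dynamic power: activity factor x switched capacitance x V^2 x clock."""
    return alpha * capacitance_farads * volts ** 2 * freq_hz

# Same capacitance, voltage and clock; only the activity factor differs:
busy_logic = dynamic_power_watts(alpha=0.10, capacitance_farads=1e-7,
                                 volts=1.0, freq_hz=1.6e9)
idle_cache = dynamic_power_watts(alpha=0.01, capacitance_farads=1e-7,
                                 volts=1.0, freq_hz=1.6e9)

print(f"busy logic: {busy_logic:.1f} W, mostly-idle cache: {idle_cache:.1f} W")
```

With a tenth of the activity you get a tenth of the dynamic power, which is why a transistor budget that is half cache is far kinder on the heatsink than the raw five-billion count suggests (leakage current is the caveat this toy model ignores).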

  12. Anonymous Coward
    Anonymous Coward

    2 billion transistors are set aside to power the NSA compressed audio, video and data streams - direct from your house. Nowhere to hide, gamers.

    1. xyz
      Black Helicopters

      I was just about to don my tinfoil hat when I saw your post

      >>2 billion transistors are set aside to power the NSA compressed audio, video and data streams - direct from your house. Nowhere to hide, gamers

Or: how to get the public to pay for a massive planet-wide distributed computer for the NSA. No wonder MS wanted it connected to the internet at all times.

      1. Aldous
        Facepalm

        Re: I was just about to don my tinfoil hat when I saw your post

        You guys do realize the NSA has been doing this for years right?

Every thread about anything has had a mention of the NSA since PRISM was leaked. You realize the Army shoots people, and that what a politician says is not always true, right?

If the NSA/CIA/Mutant Lizard people want a distributed computer system they will just buy one. When the EFF were fighting to show that DES (the U.S. government-approved crypto cipher) was insecure, it was the intelligence community saying it was fine.

The EFF then made custom ASICs for $250k, and DESCHALL did it with a distributed net of home machines. Do you think that was news to the spooks? They probably had whole DCs full of stuff to break DES; similar things were seen with the Clipper chip.

Why would they need to risk being found out by hijacking machines that they do not control (would you want Bunnie Huang to find your NSA back door)? They can either get them built themselves (ASICs would be far more useful than standard CPUs) or create a botnet on the millions of US Gov-owned PCs.

        1. Anonymous Coward
          Anonymous Coward

          Re: I was just about to don my tinfoil hat when I saw your post

          You tell them the facts and they'll just downvote you.

It's highly entertaining if you're in the know about the Intelligence Community and/or cryptographic research. Same with the armchair CEOs, armchair engineers, and armchair warriors running around here.

        2. Salts

          Re: I was just about to don my tinfoil hat when I saw your post

          Have an up vote

  13. Joefish

    One billion transistors to play the latest games...

    And another four billion to ensure you don't have any fun whilst doing so...

  14. Mikko

Meanwhile, the 28 nm GTX Titan apparently contains about 7 billion transistors, and Nvidia's GTX 680 about 3.5 billion (the GTX 680 on a slightly-under-300-square-mm chip).

The Xbox One GPU is far less powerful, but it will still eat up a significant amount of space. Then there is the 8-core CPU, not to mention the cache that takes up half the transistor budget... To me, it looks like the NSA had to settle for a few hundred million transistors at most.

  15. Anonymous Coward
    Anonymous Coward

    Nobody gives a shit so long as the bloody chip doesn't detach itself from the mobo like the 360.

  16. Anonymous Coward
    Anonymous Coward

    ??WTF??

I would be impressed if this were for some sort of life-saving system that will change humanity for the better - but nope, it is so that young kids can play role-playing games, shooting the crap out of each other in a virtual world while in the real world they do nothing that could be classed as useful... Thank you, Microsoft, for providing the virtual opium to destroy the digital youth of today :(

    1. Atonnis
      FAIL

      Re: ??WTF??

      ??WTF?? You blame a console for bad parenting?

And your assumption about youth is quite faulty as well. Given the price range of the devices plus games, these things need someone old enough to afford the bills.

  17. Robert Grant

    Old news?

    Didn't the Xbox 360 have memory shared between GPU and CPU? That (development ease aside) was why lots of cross-platform games looked better on that than on the PS3.

  18. BigAndos

Reminds me a bit of the logic chip in the Acorn Electron mentioned in the retrospective article last week. Funny how things go round again in tech - a bit like how virtualisation and the cloud have brought "dumb client"/server models back into fashion.

  19. Anonymous Coward
    Anonymous Coward

    1920 x 1080

The Xbox One is meant to drive a Full HD monitor, right? Nvidia's behemoth gfx cards, pushing 3 billion transistors, are meant to push larger / combo displays past 1080p. Which means the Xbox will have the power to turn on all the eye-candy on such a small display, by comparison. Or can it drive a DisplayPort screen beyond 1080p? Specs? Of course the games will look great and 60fps smooth, since I bet it won't be designed to play in more than 1080p, which is the norm these days, and the resolution most people will have a monitor for anyway.

  20. South

    Better than my new PC?

Q: I just built my first gaming rig after becoming disillusioned with consoles.

    i7 950k (o/c 3.8ghz)

    14gb ram

    asus GTX680 4gb

My question is: are the new consoles going to blow this out of the water performance-wise?

    1. Ramiro

      Re: Better than my new PC?

Certainly not. By most people's reckoning, consoles nowadays are much more about convenience than raw performance.

    2. soaklord

      Re: Better than my new PC?

      Question: Did you pay less than $500 USD to build your machine?

      Apple != Oranges

  21. Anonymous Coward
    Anonymous Coward

    ... tweaked by Microsoft within an inch of its life

Then MS saw the specs for the PS4 and went two more inches.

  22. 080

Xbox One or PlayStation, how boring. WGAF.

  23. Anonymous Coward
    Anonymous Coward

    more flashing LEDs, that's what we need

    Judging by the photo of the chip it looks like it may have some LEDs on it. Well I hope so anyway, because I've just realised that despite silicon chips doing so much for us... they look really boring. So more flashing LEDs on my chips, please.

    (Please don't burst my bubble and tell me they aren't LEDs.)

  24. cyberdemon
    Devil

    Maybe I'm just cynical..

    This seems to me like an effort to foil the mod-chippers.

    Putting it all on a SoC with custom silicon could make it pretty much unhackable..

I'm sure a lot of managers at Microsoft would love it to be a black box filled with epoxy, with only Ethernet in one end and HDMI out the other, and a couple of antennas inside for controllers etc.

    If it weren't for the small issues of cooling, and those pesky soldiers in their disconnected army bases kicking up a fuss about always-on connectivity, they'd probably have done that already!

  25. Lord Zedd

    Shared memory

Another reason I won't be buying one. Plus, having all that crammed onto one chip means it will have cooling issues even worse than the 360's.

    1. Mikel

      Re: Shared memory

I'm sure it isn't the size of a suitcase because they thought that was a sexy look for under your flatscreen.

  26. Down not across Silver badge

    Fanboy voting

It does make me chuckle to see what looks like fanboy downvoting in action. If anyone dares to post (obvious trolls and shills excepted) anything positive or supportive of the XBone or the technology it uses, it gets downvoted.

    Are the Sony supporters really that insecure?

    For the record I don't really care either way as both consoles are likely to serve their purpose and neither company can be trusted.

Yeah, there are some interesting differences that have sparked debate and discussion, and it will be interesting to see how it all works out in the end, once we can see stuff actually running on the released hardware.


Biting the hand that feeds IT © 1998–2019