Big Blue demos 100GHz chip

IBM researchers have made a breakthrough in the development of ultra-high-speed transistor design, creating a 100GHz graphene-based wafer-scale device. And that's just for starters. The transistor that the researchers have developed is a relatively large one, with a gate length of 240 nanometers - speeds should increase as the …

COMMENTS

This topic is closed for new posts.
  1. Anonymous Coward
    Anonymous Coward

    System Frequency ?

If I understand this correctly, the -3 dB frequency is meant here. This means that a sine input at 100 GHz is amplified at -3 dB.

As a useful digital signal of n Hz needs harmonic frequencies of about 7n to 10n Hz, this means that we can expect digital circuits of 10 to 14 GHz operating frequency. That certainly is an improvement over the 5 GHz we already have, but not really dramatic compared to current CMOS technology.
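    The rule of thumb above can be sketched in a couple of lines (the 7x-10x harmonic factor is this comment's assumption, not a hard spec):

    ```python
    # Rule of thumb: a square-ish digital signal of n Hz needs harmonics
    # up to roughly 7n..10n Hz, so the usable digital clock is the analog
    # bandwidth divided by 7 (optimistic) to 10 (pessimistic).
    analog_bandwidth_hz = 100e9  # the quoted -3 dB point

    pessimistic = analog_bandwidth_hz / 10
    optimistic = analog_bandwidth_hz / 7
    print(f"{pessimistic/1e9:.0f} to {optimistic/1e9:.1f} GHz")  # 10 to 14.3 GHz
    ```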

    1. Christian Berger

@system frequency

Actually, for amplifiers you typically give the gain-bandwidth product, which is roughly the frequency at which you reach an amplification of 1. For "boxed" amplifiers you do have the 3 dB frequency: the point 3 dB below the normal signal amplification. This would still be OK for digital circuits.

    2. Hungry Sean
      Pint

      oi

Further than that, they are talking about the switching speed of a single transistor, presumably driving no load (they don't say whether this is NMOS or PMOS either). A very heavily pipelined processor might have logic about 8 fanout-of-four inverters deep; adding pulse-latched registers would bring you to about 10 gates deep. So if the 100 GHz were the digital range of the transistor, the system clock would be maybe 10 GHz, probably less once the effect of load is included. I'm not good with the analog side of things, so I'll bow to joeuro above and conclude that it should be cut by 10 again, so maybe a 1 GHz system clock with the current generation. This jibes with the statement that these are 4x as fast as conventional transistors with a similar gate length (40nm refers to the gate length after some depletion, so this is about 5-6 times as big as the bleeding edge and, by rough calculation, about half the speed).
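    The two derating steps above can be written out as a quick sketch (treating the headline 100 GHz, generously, as a per-device digital switching rate, which is this comment's assumption):

    ```python
    transistor_switch_ghz = 100.0  # headline figure, read as a digital switching rate
    gates_per_stage = 10           # ~8 FO4 inverters of logic plus ~2 for pulse latches

    # First derate: a pipeline stage is ~10 gate delays deep.
    system_clock_ghz = transistor_switch_ghz / gates_per_stage

    # Second derate: joeuro's harmonic argument cuts it by ~10x again.
    analog_derate = 10
    print(system_clock_ghz, system_clock_ghz / analog_derate)  # 10.0 1.0
    ```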

      All that said, this seems like some very cool technology-- IBM's hardware is always very impressive. I hope they succeed in bringing it down in size.

Beers to joeuro for quick thinking, and beers to the brainy boffins.

  2. Anonymous Coward
    Thumb Up

    1 THz

    EETIMES.COM:

    "By optimizing its process to increase mobility and shortening the gate length, IBM will next aim to increase the speed of its graphene transistor up to 1 THz, which is the goal for the CERA program. "

Now that would be the speed that would create a 100 GHz clock CPU!

  3. Chris Tierney
    Thumb Up

    Holy Smoke

    If this is true then I'll invest all my money in IBM. :)

  4. Anonymous Coward
    Anonymous Coward

    THz Optical Receivers

    It seems Carbon is bound to replace Silicon:

    http://www.eetimes.com/showArticle.jhtml;jsessionid=HYQKL4HWJVYD5QE1GHPSKHWATMY32JVN?articleID=220600274

  5. Captain Thyratron
    Thumb Up

    Suddenly, gigahertz--a hundred of them!

    Holy damn it Christmas.

  6. Sly
    Coat

    zoom zoom y'all

mine's the one with the redneck Mazda on the back

  7. K. Adams
    Joke

    Still won't be fast enough...

    ... to run Crysis...

    1. Tarthen
      Thumb Down

      Actually....

      Crysis' bottleneck is the GPU.

      My 2.4GHz quad has no problem with PhysX assisted playing :D .

    2. SynnerCal
      Joke

      Re: still won't be fast enough...

      to run Crysis? Erm, yes it will be - my pensionable 2.6GHz (single core!) Athlon64 manages to run it quite nicely when coupled to a (less elderly) 8800GTX gfx card. Now if your benchmark was CoD:WaW then that's a different story. <sigh>

      Like most I'm wondering if a 10GHz processor will be fast enough to run the version of MS-Office that's out at the same time... (cheap shot I know, but it is a Monday morning).

      Seriously though, IBM tech still manages to impress. Good on them.

  8. Neil Paterson

    Temperature?

    "Also notable is the fact that the graphene transistor's 100GHz performance was at room temperature. When chilled, even higher performance is to be expected"...

    ...and in the real world, when it's at higher than room temperature, it'll be slower...

    Bah, humbug!

  9. Geoff Smith
    Troll

    Moore's Law is now back in effect?

    Does this mean the software development industry can go back to the heady days of yore, producing bloatware as usual, secure in the knowledge that Moore's law will cover most, if not all, of their sins?

    After all, cleaning up sloppy coding, ridding the application architecture of unnecessary middleware, or excessive levels of abstraction caused by the choice of language and development frameworks is real work. Boring, nasty business.

    Without hardware resource constraints, we can allow developers to explore their "creative horizons" in interesting and sometimes exotic ways, without worry that the thing will hit the wall once you put a real-world peak user load on it.

    I can almost hear the cheers already.

  10. Anonymous Coward
    Jobs Horns

    You'll still need a Dual Core Version

    To run Win7 & Office 2010 :-D

    1. jonathan keith
      Boffin

      And a similar GPU

      To run Crysis 2...

  11. Anonymous Coward
    Thumb Up

    Encryption? My arse

So when these new processors actually get made, all these super duper encryption technologies won't be worth diddly squat, as they will get broken in hours if not minutes. Proper security will be needed, rather than hoping it'll take a snoop years to find out the information.

    1. prathlev
      WTF?

      No problems for encryption

You would need much more than an order of magnitude difference in computing power to change the feasibility of brute-forcing good encryption.

      Discover some flaws in the protocols instead. Or get a quantum computer.

    2. Anonymous Coward
      FAIL

      "Encryption? My arse "

No, they won't. You simply don't understand the issues and orders of magnitude involved here. Brute forcing, or even reduced-key-space brute forcing exploiting various loopholes, would still require a geometric increase in throughput.

Obviously, with an NSA-sized budget, you could use an underground city-sized array, but all that money spent on researching how to bring some sort of reliability to the output of multi-qubit quantum processors is the real danger to crypto.

    3. Steven Knox
      FAIL

      Encrypt your arse.

Increased CPU speed is only a linear improvement in breaking ciphers. So a 20x faster CPU will only take about 1/20th of the time of a current CPU. Since the better encryption technologies can resist a brute-force attack for thousands of years, we're still talking decades or centuries.

      However, increased CPU speed can result in an exponential increase in cipher security, by allowing an increased key length. So assuming these CPUs aren't sold solely to hackers, encryption is hardly at risk.
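    The asymmetry above is easy to quantify (the 20x speedup figure is the commenter's example):

    ```python
    import math

    speedup = 20  # hypothetical: the new CPU is 20x faster

    # Attacker's gain is linear: brute-force time divides by 20.
    # Defender's gain is exponential: each extra key bit doubles the
    # attacker's work, so restoring the old security margin costs only
    # log2(20) extra key bits.
    extra_bits = math.log2(speedup)
    print(f"extra key bits needed: {extra_bits:.2f}")  # ~4.32
    ```

    In other words, a hardware speedup that helps the attacker linearly is cancelled by fewer than five additional key bits.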

If you want to break encryption, find the magic algorithm that can easily reverse what we believe to be one-way mathematical operations. Then don't patent it or share it with anyone.

    4. Degenerate Scumbag

      Your arse indeed.

      As processing power increases, more becomes available for encryption as well as cryptanalysis. Longer, stronger keys become practical, so snoopers are no closer to easily breaking crypto. The only caveat is that any older data that is still sensitive needs to be re-encrypted with the newer schemes to remain safe.

    5. Paul 4

      Proper security?

What do you call proper security, exactly? I thought it was good strong encryption, properly applied, although I fear you think there is something other than encryption to most security. Breaches happen when the encryption is either not strong enough or not properly done (i.e. workarounds or errors).

    6. Anonymous Coward
      Boffin

      That's simple...

      ... we just increase the key sizes. We will have a hell of a time, though, upgrading all the old stuff.

SHA-65536, anybody? :D

  12. Phillip Webster
    Thumb Up

    Good stuff

    But it's useless unless it's applicable to RAM (cheaply) and/or another breakthrough is made for RAM.

    No use having a 100GHz CPU if your RAM is stuck at <3GHz with the current computing architecture.

    Still, interesting development.

    1. Anonymous Coward
      Thumb Up

      and what about HDD/SSD...

CPUs that fast will definitely require a somewhat reorganized motherboard and disk I/O.

I agree that, regardless of speeds, programming needs to become a priority again.

      Thumbs up as this is very interesting.

  13. J 3
    Thumb Up

    Graphene...

    ...is awesome.

    And to think the thing was right there in our pencils all along.

    1. Keith Oldham
      Joke

      Re : Graphene...

      Nothing wrong with pencils -just that we've not been clocking them fast enough!

  14. sT0rNG b4R3 duRiD

    Ok...

    Now be a good IBM now and please go wtfpwn intel with some actual products...

    kthxbai

  15. Christian Berger

    Big Misunderstanding

The main problem most people don't get is that those 100 GHz are probably meant for small signals. Essentially you amplify some DC signal and add a bit of high-frequency signal on top. This is how it's done in analogue electronics, and in fact 12 GHz is already _cheap_ there, as you can see from 3-euro LNBs for direct-broadcast satellite reception.

    In digital applications you usually want to switch a transistor all the way. If you don't do that in CMOS you will essentially short circuit your power supply and your chip will blow up. This did happen in some early CMOS designs.

There is actually one digital technology that aims to use the speed benefits of analogue circuits: ECL, emitter-coupled logic. Those circuits can easily reach the multi-gigahertz range; however, a simple gate can take several milliwatts of power. They are usually used as dividers in frequency counters.

  16. ShaggyDoggy

    @ Phillip Webster

Real RAM, not the consumer stuff you get in toy computers, runs somewhat faster.

    1. Ned Leprosy Silver badge

      "Real" RAM vs. "toy" RAM

      That's all very nice, but since my budget stretches to toy computers, toy RAM is what I end up with. Lots of reasons not to like PCs, but lots of currency-shaped reasons not to buy something better.

  17. Anonymous Coward
    FAIL

    Crypto....

    "So when these new processors actually get made, all these super duper encryption technologies won't be worth diddly squat as they will get broken in hours if not minutes."

That is plainly WRONG. If you just brute-forced a 128-bit symmetric cipher, you would have to test 2^128 keys. Even if you had this 15 GHz clock, you could test at most 2^33 keys per second. So you would still need 2^95 seconds to do that. That would be 1256154276291608599593 years...

Yeah, say you are Uncle Sam and have 1 billion processors working in parallel: that would still only reduce it to 1256154276291 years. That is 1.2 trillion...
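    The arithmetic above can be reproduced directly (the 2^33 keys/second rate is this comment's assumption about a hypothetical 15 GHz machine):

    ```python
    SECONDS_PER_YEAR = 365 * 24 * 3600

    keys = 2**128          # keyspace of a 128-bit symmetric cipher
    rate = 2**33           # keys tested per second (~8.6 billion/s, assumed)

    seconds = keys // rate               # 2**95 seconds of work
    years = seconds // SECONDS_PER_YEAR  # ~1.26e21 years
    years_parallel = years // 10**9      # with a billion CPUs: ~1.26e12 years

    print(years, years_parallel)
    ```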

  18. Puck
    Thumb Up

    Competition for Intel...

    ...would be good. Might lower too-high-in-recent-years prices on the latter's products too.

Although, to digress, I do resent it when we hear ministers saying we all need to make better rather than cheaper products - inspired as they clearly are by examples such as this, and oblivious to the effects of super-inflated capital-goods prices on anyone wishing to start up in business. We're not all fucking geniuses.

  19. Ylydxi Reclo
    Thumb Up

    WOOO!

    Pray this research revives the processor speed development. In the 90s, there was a report like this every two months.

  20. Mectron

    Forget Crysis

    Will it play Doom?

  21. Bronek Kozicki

    RE: Encryption? My arse

A number-crunching reduction from one billion years to two million does not count as a failure, does it? Unless you meant weak algorithms, in which case it does not really matter.

  22. Mark Eaton-Park

    At last the solution to global warming

If IBM extracts its carbon from atmospheric CO2, then it could kill two birds with one stone.

    1. Keith Oldham
      Happy

      Re : At last the solution to global warming

      There's a joke alert for facetious comments!

  23. SIGTERMer
    Go

    it's the 90s again + 20 yrs

    looks like blue is back.

  24. Anonymous Coward
    Anonymous Coward

    @joeuro

No, forget 3 dB points. Most likely the transistor is being used in a switch configuration, and 100 GHz is the frequency at which it switches.

    1. Mike Bishop
      Thumb Down

      RotaCyclic Nonsense

      The abstract of the IBM paper (linked to in the article) clearly states that 100 GHz is a cutoff frequency. It isn't a switching rate, and you cannot compare it directly to a CPU clock.

  25. Anonymous Coward
    Anonymous Coward

    CPUs, RAM

People seem to think this development is only for CPUs. At the moment this has nothing to do with CPUs; so far all they have done is create a transistor. Does anyone not know what RAM is made from?

    1. Chris Harden
      Boffin

      um

      is the answer cheese?

    2. The Equestrian
      Joke

      Rams?

      Mutton?

    3. Captain Thyratron

      I, for one,

      think it'd be neat to make a radio out of these things.

      Transistors are used for more than just digital circuitry.

  26. E 2
    Happy

    First approx use case question?

    Is this a fair first approximation?

240 nm feature size versus 32 nm feature size => 56.25 = (240/32)^2. So 56.25 times fewer graphene transistors per unit area than silicon.

100 GHz / 3.33 GHz => 30.03 times faster operation of a graphene transistor vs silicon.

At first blush this loses against silicon (30/56.25 < 1.0).
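    The back-of-the-envelope comparison above checks out as written (the 3.33 GHz silicon clock is this comment's assumption):

    ```python
    graphene_nm, silicon_nm = 240, 32

    # 56.25x fewer devices per unit area (area scales with feature size squared).
    density_penalty = (graphene_nm / silicon_nm) ** 2

    # ~30x per-device speed advantage.
    speed_gain = 100 / 3.33

    print(density_penalty, round(speed_gain, 2), speed_gain / density_penalty < 1)
    ```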

But if an IBM graphene CPU were made with 1/30 the number of transistors but a simpler design needing shallower pipelines and less branch prediction, the graphene CPU could still be faster at, say, floating point or integer/logic (but maybe not both), despite having many times fewer transistors.

    I'm just a sysadmin and programmer, no kind of hardware guru. So I know I could be quite out to lunch...

    1. Hungry Sean
      Boffin

      not quite, but good thinking

First off, you're comparing individual transistor switching speed against the switching speed of a full pipe stage (probably 10-20 transistors deep, plus some pain from capacitive load and wire delay). The key bit is in the article where they say it is 250% the speed (2.5x) of a similarly sized silicon gate, so presumably the prototype is slower than a 40 or 32nm silicon gate. The 100 GHz number is attention-grabbing, but not relevant to architectural speed.

Now, your idea about a simpler architecture is not a bad one if the actual issue were a big fast gate vs a small slower gate, but there are two major problems. First, if you could make a significantly smaller core through the right simplifying architectural decisions, then with silicon gates you could have more of those cores, or they could have wider execution paths (this was the driving principle behind the Cell's architecture). The other problem is that one of the major limitations on core speed is propagation delay along wires, particularly global interconnect. So even if you have a very fast transistor, it still takes the same time for a signal to get from one side of the die to the other. This will be a killer if you can only fit a single core on your die. The more likely use case for big fast gates (assuming the processes were compatible) would be to mix them in sparingly along critical paths; indeed, the circuit guys already play all sorts of games with gate sizing (mainly for driving big capacitive loads quickly).

  27. E 2
    Jobs Horns

    @Mectron

    Yeah, I suspect it would run Doom :-) But it would cost about 100 times the price of a bottom end current x86 CPU! Almost certainly it would become a top dollar IBM blade server product CPU, just like the Cell. Grrr.

I think we've got Doom on cell phones. Except maybe not the iPhone. You know, not wholesome enough.

  28. ForthIsNotDead
    Joke

    You know what this means?

    If this technology finds its way into the Windows world, Microsoft will have to bloat their software by a factor of 10 just to maintain the same performance as today.

    We're all doooooomed!

  29. Andy Watt
    Flame

    dialectric?

    I take it if we get up to 100GHz E.T. will be able to use this stuff to phone home then?

    Woopsie!

  30. Daniel 1

    Competition for intel?

Are ye missing something? They'll patent this and lease the rights to use it to everyone else, including Intel - same as they have been doing with all their inventions for three decades. Around 60-80% of the profits on a current Intel chip go straight to Armonk anyway, with none of the production overhead to worry about.

It's the same with high-density disc drives, laser optics, RAM - a whole raft of the hard stuff that goes into anything these days, from a mainframe to an e-book. There's not been a lot of computer you could build since the late 80s that would even function without making use of some patented IBM technologies. Sooner or later, the money gets back to Armonk. That's how you get to 'lose' control of the PC industry and yet still somehow manage to remain the biggest IT company in the world.

We see a tentacle here, we see a tentacle over there - but somehow no one ever seems to wonder if all the tentacles might actually join together into something really big that's right underneath them.

  31. E 2
    Happy

    @Hungry Sean

    Wow, thanks!

This topic is closed for new posts.
