I do wish people would stop calling these things transistors. They are not. Sure, they are interesting and possibly very good switching devices, but a real transistor can do a whole lot of other things.
The team that earlier this year characterized a four-atom wire that obeys Ohm’s Law has now demonstrated a repeatable single-atom transistor. While single atoms have been observed acting like transistors in the past, the ‘device’ demonstrated by the UNSW, University of Melbourne and Purdue team is exceptional in that it has been …
Including operating at something vaguely approaching room temperature.
Right! And what about the quantum effects?
And with so few atoms, uncertainty (Heisenberg, Schrödinger) comes into play. Are we being told that the most reliable and dependable branch of physics is wrong? Surely not?
Ohm's law may be theoretically applicable here too, but again the charges involved would be so small that randomness and uncertainty would be a large, if not dominating, factor. While the manipulation of individual atoms is possible and has been done, random quantum tunnelling and band-gap limitations with so few electrons would seem very problematic in a transistor, making it unreliable.
I'd guess a lot of these 'effects/observations' arise in part from calculated (derived) units such as Planck voltage, Planck current, Planck impedance, etc. With these derived units, Coulomb's constant, the permittivity and permeability of free space, and so on, almost anything in the electrical-engineering sense can be calculated at the quantum level, but with so few atoms involved, what does it amount to in practice?
Would the experts in this area tell us whether there's anything to this in the practical electrical-engineering sense, or is it essentially just an extrapolation of observations that might be useful in quantum computing, etc.?
Perhaps also there's been some screw-up in the reporting.
Maybe they've managed to build the much-anticipated Heisenberg Compensator?
Re: @Graham Wilson - - Hope so, it's time. (But seems it's already been invented.)
We need a Heisenberg compensator real soon now but it seems Sun's already beaten them to it:
(Seriously, this is what happens when PR deliberately spins research info (to help funding, etc.) and produces videos without sufficient facts. Here, the video's headlines could have Mr. Average imagining these transistors running inside his near-future, now-implanted, super-miniature iPhone (a la a transistor radio), but the -196°C gets a bit lost in the translation.)
I'm also a bit annoyed, but the other way around. You are right, this isn't a transistor. Comparing this to a transistor is like comparing a transistor to a tube.
Yes, the tube still has its (limited) uses today, but it works in a vastly different way.
What this is, basically, is a qubit. Being able to tell where the atom is, and to do it repeatedly in a structure, is the foundation of the next generation of computers. Just like the transistor completely changed computers when people started using it instead of tubes, this has the potential to completely change the game again.
Keep it Cold
"It also needs to be kept at -196°C to operate"
So keep it cold, or the smoking hairy golf ball is back!
All very well
But to make them practical is going to be a huge, and possibly insurmountable, challenge.
Of course there is the challenge of keeping the damn things cool.
Next up, pack them in tightly and mutual field effects are going to be interesting (where interesting is a euphemism for "a bitch").
Then of course good old cosmic rays will likewise be "interesting". A cell state might survive losing a few electrons, but with a single atom there really isn't enough charge to survive anything.
These little buggers are going to be really challenging to manage. Keeping billions of 'em consistent (as you need for a microprocessor or such) is likely not something that can be achieved in the next 20 years.
Senior researchers have to get *funding* to perform research. Try this.
SR "I'd like some funding to develop improved qubits."
Funder "What's that?"
SR "It's the potential building block for a conceptual computer architecture that people might build in the future."
Funder [thinks] "Who the hell is going to license this off us? A might-be building block for a maybe computer."
SR "Our studies show that by 2020 semiconductor manufacturing will need transistors one atom across. By skipping the intermediate levels and starting to develop devices at this scale now, we will be in pole position to offer processes, devices and consultancy they can use for this purpose."
Funder [thinks] "Sweet. We'll be able to flog this to Intel and I'm off to a permanent spot at Bondi."
Funder "Application approved."
Watch the video and listen to the SR's description. It's not a lie, more a creative re-purposing.
BTW, building a 2-gigatransistor processor (as of 2008)
( http://en.wikipedia.org/wiki/File:Transistor_Count_and_Moore%27s_Law_-_2008.svg )
one transistor at a time is pretty stupid. OTOH, *current* quantum computers using sodium-vapour atoms held by lasers have done significant work with tens of atoms.
That is *much* more viable than using an STM as a manufacturing tool.
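To put the "one transistor at a time" point in numbers, here's a rough back-of-envelope sketch. The serial placement rate is a pure assumption for illustration (one atom per second), not a figure from the research:

```python
# Back-of-envelope: how long would it take to place a 2-billion-transistor
# chip's worth of dopant atoms one at a time with an STM?
# The placement rate below is an assumed, hypothetical figure.

TRANSISTORS = 2_000_000_000        # roughly a 2008-era high-end processor
ATOMS_PER_SECOND = 1               # assumed serial STM placement rate

seconds = TRANSISTORS / ATOMS_PER_SECOND
years = seconds / (60 * 60 * 24 * 365)
print(f"~{years:.0f} years at {ATOMS_PER_SECOND} atom/s")  # ~63 years
```

Even granting a placement rate thousands of times faster, a serial process clearly doesn't scale to chip manufacturing, which is the point being made above.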
I've still yet to see a quantum computer that can execute an actual *program*. So far they all seem like hard-wired analog processors to me.
A first rate effort and congratulations to everyone involved.
-196 C is cold, but
it is "merely" liquid-nitrogen (LN2) temperature, about 77 K. Not everyday, but not nearly as impractical as previous devices requiring liquid-helium (LHe) temperatures. Think of the Josephson junction switch, which was once hailed as a breakthrough in high-performance computing; it had LHe cooling requirements.
If we can build qubit devices based on this, the potential gains might outstrip the cost of LN2 cooling.
My suspicion is that all of the chaps working on this project - I don't know about their female supervisor - are rogue overclockers. A glimpse of them adjusting the machine in the video sufficed to confirm this suspicion; if sound had been recorded, we no doubt would have heard one researcher saying to another: if we can lower the temp another kelvin, I'm sure we can get this graphics card over 9.0 GHz....