Researchers think they have opened the door to terabit per second Ethernet links using multiplexed 10Gbit/s data streams and small chalcogenide demux chips to demultiplex the 10 gig streams. An optical network cable can have lasers pumping multiple 10Gbit/s streams onto different colours of the light spectrum and squirting them in …
640Gbit/s ought to be enough for anyone
mine's the one with the list of dubious 'quotes' in the pocket
Laser light operates in the 0.4μm to 10μm wavelength range. I would imagine that they would modulate small bands of these wavelengths: one laser source split into the appropriate wavelengths (by the waveguides?), each modulated individually, then recombined and transmitted to wherever they are going, then decombined again back into their respective wavelengths where the data is read.
But that's just my guess
What's the range?
How far will this be able to transmit, and are repeaters practicable?
I.e. can you transmit across the Pacific, or just across the lab?
1 TB ?
there is something I don't understand here
Everyone talks about the terabit possibility etc,
but for me 640 is closer to 500 than to 1000 (or 1024)...
they're still only halfway there, even though they're getting closer.
Paris 'cos she also likes to pump things up
Not so fast..
Electrons have an intrinsic limitation in their 'reaction speed', a figure in the picosecond region. A single terabit/second stream would therefore be changing faster than electronics as we know it ever possibly could. Present opto-electronic components consequently operate between 10 and 100Gbps. Also, transmitting single channels at 100Gbps+ requires an extremely high signal-to-noise ratio, hence the use of multiple 10Gbps streams instead.
So, to transmit 640Gbps you can't use a single channel because of errors due to non-linear effects, dispersion, quantum noise etc. You also can't use a single electronic detector because of the limited speed of electronics. The solution is to use 64 multiplexed 10Gbps channels, but demultiplexing a 640Gbps data stream is again pushing the envelope as far as modern electronics goes, hence the solution needs to be implemented all-optically using highly nonlinear materials. This is very challenging to do, hence the excitement over the published results.
Paris, because the rush for more bandwidth really is being fuelled by the demand to download hi-def copies of her theatrical work.
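The speed argument above is easy to check on the back of an envelope. A minimal Python sketch, assuming the picosecond-scale electronic 'reaction speed' figure quoted by the commenter (the helper function is mine, purely for illustration):

```python
# Back-of-envelope check of the speed argument above.
# The picosecond 'reaction speed' figure is the commenter's;
# the helper function is illustrative, not from any library.

def bit_period_ps(rate_bps: float) -> float:
    """Duration of one bit, in picoseconds, at the given line rate."""
    return 1e12 / rate_bps

# At 640 Gbit/s a bit lasts about 1.56 ps -- the same order as the
# electronic response time, so an electronic demux can't keep up.
print(bit_period_ps(640e9))
# Each 10 Gbit/s tributary gives a comfortable 100 ps per bit.
print(bit_period_ps(10e9))
```

That factor-of-64 gap between 1.56ps and 100ps is exactly why the demux step has to be done optically.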
All well and good
But when do I get it built into my router and PC?
Now would be good!
Long time to wait...
"..Commercialisation of such technology is, of course, if it takes place at all, many years away..."
So we'll be seeing it in Japan in a month, and Germany the week after, then?
Of course, silly me.
Ah! That's where I've been going wrong!
I wasn't using small chalcogenide demux chips to demultiplex the 10 gig streams!
Here was silly me, desperately trying to reverse the polarity of the neutron flow, while trying to avoid CROSSING the streams when I should really have been DEMUXing them! Bloody amateur mistake that!
Of course I see the reasons for the chalcogenide now: it helps fibrillation of the pulse clamp restrictors feeding the flux capacitor, while smoothing the mid-range response.
I haven't yet seen the phase-oscillation diagrams, but I would expect you should be able to clearly hear the second violinist's right third fingernail in the 2nd Movement of Beethoven's 17th played at the Alaska State theatre in 1937 by Adophus Trompe.
Highly recommended for all true audiophiles.
He might be onto something...
Couldn't we just use prisms at either end? That way we have separate signals still traveling over the fiber. Just use the prism to split / combine the wavelengths at either end.
I'm not some kinda crazy optics boffin, but I do remember high-school physics class, with the whole splitting light into wavelengths / combining different-colored LEDs to create different colors.
I want this for my home PC network.. yea really
I'm so tired of being limited to 1GbE at home.. and 10GbE is insanely expensive.. I wish everything would move to optical.. imagine if we could have 100GbE as your home network.. PCs could natively share SMP and devices, etc..
I wish IBM's group of 32nm super friends would all band together and work out a mass-produced low-cost optical chipset for motherboards, NICs, switches, etc that could go mainstream.. give me 10GbE, 40GbE, 100GbE.. just something more than 1GbE and that is stackable ( http://en.wikipedia.org/wiki/Link_aggregation ).
For years I see this and that "research" is released but nothing ever seems to actually come out to the real customer. I guess the US military sits on it until it has moved on to the next thing???
You people seem to be missing the point.
The problem isn't feeding 100 independent data streams down the same pipe and separating them out at the other end. With today's hardware that is a relatively trivial problem.
This is DEMODULATION.
The problem which has been solved here is taking one single terabit-class datastream, breaking it up into manageable parallel streams, pouring them into the same pipe and then recombining those streams into ONE at the other end so that the bits emerge IN EXACTLY THE SAME ORDER as they were originally fed in.
This is DEMULTIPLEXING.
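The bit-ordering point above can be illustrated with a toy round-robin interleaver. This is a minimal sketch assuming ideal, perfectly synchronised channels; the channel count matches the 64 x 10Gbps scheme in the article, but the bit patterns and function names are made up for the example:

```python
# Toy time-division multiplexing: interleave 64 slow bit streams into
# one fast stream, then demultiplex and check that every bit comes
# back in exactly the order it went in. Purely illustrative.

def multiplex(tributaries):
    """Round-robin interleave equal-length bit streams into one stream."""
    return [bit for slot in zip(*tributaries) for bit in slot]

def demultiplex(stream, n_channels):
    """Split an interleaved stream back into its tributaries."""
    return [stream[i::n_channels] for i in range(n_channels)]

# 64 tributaries of 8 bits each, as in a 64 x 10Gbps -> 640Gbps scheme.
tribs = [[(ch + t) % 2 for t in range(8)] for ch in range(64)]
line = multiplex(tribs)
assert demultiplex(line, 64) == tribs  # bits emerge in the original order
```

In software this round-robin is trivial; the hard part reported in the article is doing the `demultiplex` step optically, at speeds electronics can't reach.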
I STILL don't know
How you set a laser printer to "stun"
Using a DWDM you can get up to about 160 channels, and more still if you were actually designing your own, or used two and some more fancy optics.
Zap a 10Gb/s laser down each channel of the DWDM.
Read the channels at the other end into 160 10Gb/s receivers.
Use software or clever hardware to split the data appropriately, as you would with bonded ISDN channels.
Et voila! 1.6Tb/s of data transfer rate.
Want it to cost less? Use a CWDM. A 16-channel CWDM system costs a few grand. 10Gb/s down each channel gives 160Gb/s, probably enough for your data needs in the near future.
Doesn't seem too hard to me, what're they doing in these research labs?
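For what it's worth, the aggregate numbers in that comment do add up; a one-line sketch (channel counts and per-channel rates are the commenter's figures, not vendor specs):

```python
# Aggregate throughput of a WDM link is just channels x per-channel rate.
# Figures below are the commenter's examples, not real product specs.

def aggregate_gbps(channels: int, per_channel_gbps: float) -> float:
    return channels * per_channel_gbps

print(aggregate_gbps(160, 10))  # DWDM: 1600 Gbit/s, i.e. 1.6 Tbit/s
print(aggregate_gbps(16, 10))   # CWDM: 160 Gbit/s
```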
Why the need?
We've been doing 32 channels sdh @ 10G for YEARS*
We've been doing 80 channels sdh @10G for a few years
Sometime soon, we'll move to 40G SDH (though not sure how many channels we get out of that).
Gazillibit ethernet - why would anyone want this? Surely this would be a datacentre technology, as we already have the long-haul technology. Who's built a server bus technology capable of getting near this speed? So what's the point?
* This is what most of your internets / mobile backhaul etc goes over. I know, coz I built it, innit.
I got the references to Ghostbusters and Back to the Future, but I am worried that there was a third (or even fourth) in there that I missed. Please tell me it's not so!!!
My home PC network...
...has nothing that can generate a 1 terabit datastream. I'll upgrade once I have a CPU with 128 cores all clocking around the 10GHz mark, attached to a storage array capable of delivering data at roughly 1000 times the speed that my hard drive operates at present.
Dunno what I'll connect it to, though. I can't see an ISP offering *that* kind of bandwidth in the UK any time soon.
But, yeah, I can see this might have non-domestic applications. (Oh, and Brett, thanks for emphasising the "demux" point.)
@AC - you've got to get all timey-wimey
Ohhhh dear, you're going to lose cred for not spotting the old reversed-polarity neutron flow - Dr Who (3rd Doctor, Jon Pertwee, to be exact)
Pulse clamp restrictors - I'm sure someone on Star Trek said that sometime or other. It's the kind of twaddle LaForge, Data or (ew) Wesley would come up with.
The rest is audiophile twaddle.
It's all right getting speeds rocking up to 640Gb/s, but it would be sold as 'up to 640Gb/s' and no doubt traffic-shaped after 1GB of data has been downloaded, and run on a 100Mb NIC at half duplex only.
Oh sorry, thought it was Virgin Media that were doing it, like their 50 meg service.