Speed is impressive
but the 8-bit resolution is crap. Useless for many applications. I'm guessing a lot of the ADCs in mobile phones are 10- or 12-bit devices.
This one for example is 14 bit.
We’re nowhere near the limits of optical fibre capacity: the bottlenecks are in the electronics that connect to the fibre. One part of this is the analogue-to-digital converter (ADC), and IBM is touting a prototype that it says could deliver a billion conversions per second. The company says its technology doubles current …
Well, it's a great step up from the 5-bit converters currently used in fibre-optic connections (and even there, only in the really fast ones). And even those converters are already in the price range of several hundred dollars.
If this one turns out cheaper, it could be a revolution.
An 8-bit ADC performing a billion (10^9) conversions a second produces a raw output stream of 8 Gb/s. How does that square with the claim that it would enable 100 Gb/s communications? Also, as mutatedwombat has said, 8-bit resolution is 'crap' for anything involving audio/video.
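The commenter's arithmetic checks out; one hedged way to bridge the gap to 100 Gb/s (the article doesn't spell out the architecture, so the modulation format and interleaving below are assumptions, not IBM's stated design) is that a link's payload rate is symbol rate times bits per symbol, recovered by DSP from many ADC cores, not the raw sample stream of one converter:

```python
# Check the raw-stream arithmetic, then sketch one assumed route to 100 Gb/s.
bits_per_sample = 8
sample_rate_hz = 1e9                      # "a billion conversions per second"
raw_bps = bits_per_sample * sample_rate_hz
print(raw_bps / 1e9)                      # 8.0 Gb/s raw from one converter

# The payload rate of an optical link is symbol_rate * bits_per_symbol,
# not samples * sample_width.  Assuming (hypothetically) a
# dual-polarization 16-QAM format carrying 8 payload bits per symbol,
# the symbol rate needed for 100 Gb/s is:
target_bps = 100e9
bits_per_symbol = 8                       # DP-16-QAM -- an assumption
symbol_rate = target_bps / bits_per_symbol
print(symbol_rate / 1e9)                  # 12.5 GBd

# So you would still need to time-interleave a dozen or so 1 GS/s cores
# (or run multiple wavelengths) to digitize that waveform:
interleaved_cores = symbol_rate / sample_rate_hz
print(interleaved_cores)                  # 12.5
```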
There may be some analogue processing of radio antenna signals that could be done in 8-bit digital, so can anyone comment on that?
Now I'm no expert in the field, but as far as I can tell this isn't being sold for AV use, so whether it's 'crap' for that is immaterial.
It IS being developed for encoding/decoding in optical communications, where 8 bits is probably more than enough accuracy to support various multi-level encoding schemes.
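To illustrate why modest resolution can suffice for multi-level line coding: a PAM-4 slicer only has to place each 8-bit sample into one of four amplitude bands to recover 2 payload bits per symbol. The thresholds and sample values below are illustrative assumptions, not anyone's actual design:

```python
# Minimal PAM-4 slicer: an 8-bit sample carries far more resolution than
# the four decision bands need, leaving headroom for noise and DSP.
def pam4_slice(sample_8bit):
    """Map an unsigned 8-bit ADC sample (0..255) to a 2-bit PAM-4 symbol."""
    thresholds = (64, 128, 192)            # evenly spaced decision levels
    return sum(sample_8bit >= t for t in thresholds)

samples = [10, 80, 150, 240]               # one (noisy-ish) sample per symbol
print([pam4_slice(s) for s in samples])    # [0, 1, 2, 3]
```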
Sample aliasing creates a huge amount of noise: a fast 8-bit ADC can be more accurate than a slower 14-bit one at high frequencies. That 3.1 mW power draw, if correct, is amazing.
RF ADCs don't need many bits because they're usually receiving very noisy signals. If you took a very snowy analog TV image and converted it to a 256 color PNG, it would look pretty much the same.
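Rough numbers behind the snowy-TV analogy: if the incoming signal already carries strong analog noise, the quantization noise of an 8-bit converter adds almost nothing on top. The input range and noise level below are illustrative assumptions, using the standard uniform-quantization noise model:

```python
# Compare 8-bit quantization noise against an already-noisy input.
import math

full_scale = 2.0                    # assumed ADC input range: -1..+1 V
step = full_scale / 2**8            # 8-bit code width
q_noise_power = step**2 / 12        # uniform quantization-noise model
print(q_noise_power)                # ~5.1e-6 V^2

analog_noise_power = 1e-2           # a "snowy" input: -20 dB rel. 1 V^2 (assumed)
total = analog_noise_power + q_noise_power
penalty_db = 10 * math.log10(total / analog_noise_power)
print(round(penalty_db, 4))         # 0.0022 dB -- extra bits buy essentially nothing
```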
Biting the hand that feeds IT © 1998–2019