Intel has developed a truly digital radio chip, a part that replaces the analog elements in today's radio frequency transmission and reception circuitry with digital equivalents. The result of a ten-year research project, the digital transceiver is a 32nm part capable of delivering Wi-Fi. It's still at the experimental stage: it …
They forgot to update Clarke's first law .......
"When a distinguished but elderly scientist [/engineer/manager] states that something is possible, he is almost certainly right. When he states that something is impossible, he is very probably wrong."
Re: They forgot to update Clarke's first law .......
Including engineers, I'd agree. Not managers, though; they are often wrong about what is possible.
Shooting at a target that moves faster than your bullet.
Intel has been working on digital radios for more than the decade mentioned.
From a certain point of view, this makes perfect sense, but not so much for WiFi.
The strength of digital radios is that they can shift between radically different radio protocols simply by changing the SW settings.
This would be a great boon to mobile phones, since they could then reconfigure to whatever network and frequencies are available in a given area, without needing a physical RF circuit for each and every protocol and frequency out there.
So in principle an extremely good idea from Intel.
The problem and the reason they have continuously failed to deliver (and the current implementation seems to be a bit short of a total success) is that they are aiming at a moving target.
The reason RF circuitry is (partially) implemented in analog circuitry is that analog circuits are inherently faster than digital circuits built on the same process technology (i.e., with identical transistors).
Digital chips need a generation or more of process development in order to do what can already be done in analog chips.
Radio frequencies are a scarce commodity, so everybody tries to use them as well as possible and to cram as much data into the available space as possible.
This means that the RF standards are continuously being upgraded for higher speed and bandwidth as soon as there is an available technology that can handle the needed processing.
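As a rough illustration of how tightly data can be packed into scarce spectrum, the Shannon limit sets the ceiling the standards keep chasing. The channel width and SNR below are assumed example numbers, not figures from the article:

```python
import math

def shannon_capacity_mbps(bandwidth_hz: float, snr_db: float) -> float:
    """Shannon limit C = B * log2(1 + SNR), returned in Mbit/s."""
    snr = 10 ** (snr_db / 10)  # convert dB to a linear power ratio
    return bandwidth_hz * math.log2(1 + snr) / 1e6

# Assumed example: a 20 MHz Wi-Fi channel at 25 dB SNR.
print(f"{shannon_capacity_mbps(20e6, 25):.0f} Mbit/s")  # → 166 Mbit/s
```

Squeezing closer to that ceiling is exactly the processing burden that keeps moving ahead of what digital front ends can do on the current process node.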
So digital radios are playing catch-up to standards that are continuously being upgraded to a level where (as yet) only analog radios can follow.
What Intel needs in order to succeed is for RF development to stagnate and reach a plateau, where speeds can no longer be increased by using faster or more powerful chips, or where speeds are simply good enough for most practical purposes.
When that happens, they have a chance of success.
Re: Shooting at a target that moves faster than your bullet.
The main reason you cannot use that for mobile telephony is that handsets use several different frequency ranges, which is already quite hard to handle. To get an energy-efficient transmitter you slightly overdrive your power amplifier and then filter out the harmonics with passive filters. So the analogue front end still needs to be there, tuned to your particular frequency bands.
Claus, your writing would be easier to read and take up less space if you were to use paragraphs properly.
Tip: paragraphs contain more than one sentence in all but very exceptional circumstances.
Some work left to be done...
Low Noise [RF] Amplifiers (LNAs), done right, need special low noise transistors. These specialized transistors are the opposite of inexpensive (in comparison to integrated transistors that are bought by the billion).
Power Amplifiers (PAs), at the higher frequencies, are inherently analog. Digital PWM-type (Class "D") amplifiers can be done at much lower frequencies (low-VHF), but it'll be a few years before they get pushed up to GHz range.
RF filters are typically physical devices at the higher frequencies.
Power supplies may contain SW, but are not built from SW.
Antennas can be wide-band or adjustable. But they're not software.
The all-software software defined radio (SDR) is a long way off.
But applying SDR principles to the IF-strip is almost entirely good. As long as you can keep the power consumption down, and your SW change process isn't 4000 times more expensive than letting "Bob" or "Frank" get on with designing a new all-analog radio.
Re: Some work left to be done...
I'd have to disagree here.
It is possible to make a fairly good LNA in silicon at WiFi frequencies, with a noise figure good enough for the overall system noise figure to allow the sensitivity to exceed the 802.11 specs by a significant margin at many data rates.
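For anyone who wants the arithmetic behind "overall system noise figure", a minimal sketch of the standard Friis cascade formula; the stage noise figures and gains below are made-up illustrative numbers, not anyone's actual line-up:

```python
import math

def db_to_lin(db: float) -> float:
    return 10 ** (db / 10)

def cascade_nf_db(stages) -> float:
    """Friis formula: F = F1 + (F2-1)/G1 + (F3-1)/(G1*G2) + ...
    stages is a list of (noise_figure_dB, gain_dB) tuples, front end first."""
    f_total, gain = 0.0, 1.0
    for i, (nf_db, g_db) in enumerate(stages):
        f = db_to_lin(nf_db)
        f_total += f if i == 0 else (f - 1) / gain
        gain *= db_to_lin(g_db)
    return 10 * math.log10(f_total)

# Assumed example line-up: LNA (2 dB NF, 15 dB gain), mixer (8 dB NF,
# 10 dB gain), baseband amp (15 dB NF).
nf = cascade_nf_db([(2, 15), (8, 10), (15, 0)])
print(f"{nf:.2f} dB")  # → 2.67 dB
```

The point the formula makes: with enough LNA gain up front, the noisy stages behind it barely matter, which is why a merely "fairly good" silicon LNA can still yield a system that beats the spec.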
You can make class D or E amplifiers for PAs too, using a tuned output network to enhance efficiency, but this is only of use if you have a constant-envelope modulation scheme and don't need linearity; in practice they operate well under saturation to maintain the transmitted signal's EVM. There are ways of improving efficiency in OFDM systems, using drain voltage modulation at the symbol rate, but the digital circuits needed to measure the modulation and provide the modulator feedback signals reduce the gain somewhat. Also, the silicon area used by the inductors needed for the output networks is very large in comparison with the active devices, which makes the chips more expensive and the operations people very unhappy.
And digital PLLs? Not new: I saw one of those in the lab three years ago, on silicon, at a similar process geometry to Intel's chip.
Re: Some work left to be done...
You're conflating "integration" with "software defined". Of course virtually anything can be integrated. My point was that these technologies (the ones I've listed) are not yet ready to be replaced with software. In contrast, the transceiver's IF strip can now be pure computer software, to great benefit.
Listening to some of the idiot promises made by the 'Software Defined Radio' folks over the past decade is enough to make me reach for the nearest barf bag. I could tell you some horror stories of certain SDR projects, but I'm not going to.
Free advice to decision makers: SDR technology is applicable to the IF strip (not the entire communications system). This imposes practical limitations on the flexibility that can actually be offered by SDR. The NSA policies may impose additional limitations. End result is that the promises made are not realistic.
Flexibility vs Consumption
A "digital" radio is very flexible but eats more power. Most data radios have been mostly digital for years; what remains analogue is the filters, the RX RF amp (LNA), and the TX RF amp (PA). Other companies such as Analog Devices and Qualcomm already have chips that work by aliased (undersampling) ADC. You CAN sample 2400 MHz RF at only 200 MHz, as long as there are no signals more than +/-50 MHz either side of 2400 MHz. At 200 MHz sampling, Nyquist limits the usable bandwidth to 100 MHz, so a +/-50 MHz band is fine. If you sample two channels at a 90-degree phase shift (I/Q), you double the usable bandwidth.
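A quick sketch of that undersampling trick, assuming an ideal ADC and a carrier at 2.412 GHz (Wi-Fi channel 1 — my example number, not the poster's). Sampled at 200 MHz, the carrier folds down predictably into the first Nyquist zone:

```python
import numpy as np

fs = 200e6          # ADC sample rate (Hz)
f_rf = 2.412e9      # assumed RF carrier: Wi-Fi channel 1 (Hz)
n = np.arange(4096)
x = np.cos(2 * np.pi * f_rf * n / fs)   # ideal undersampled carrier

# Locate the aliased tone in the 0..100 MHz Nyquist zone.
spectrum = np.abs(np.fft.rfft(x * np.hanning(len(x))))
f_alias = np.fft.rfftfreq(len(x), d=1/fs)[np.argmax(spectrum)]
print(f"2.412 GHz aliases to {f_alias/1e6:.1f} MHz")  # 2412 - 12*200 = 12 MHz
```

So long as everything within +/-50 MHz of the carrier is all you let reach the converter, the fold-down is unambiguous; any out-of-band energy aliases right on top of it, which is why the analogue band filter stays.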
This will (a) be way behind the Analog Devices and Qualcomm "digital radios" and (b) take more power; the real significance is the Atom CPU cores.
But Analog Devices and Qualcomm have chips (SoCs) with ARM cores, which are much, much less power hungry.
It was drilled into me as a young engineer in the early '70s that 300 baud (or was it 1200? I forget) was the fastest data rate that could be achieved over telephone lines, due to Nyquist's theories. Till some twat pissed all over the idea by dreaming up trellis modulation....
(Pulls acoustic modem from garage, searches for old Post Office phone, model 701, oils dial and centrifugal regulator....)
Re: Flexibility vs Consumption
Sampling speed is one thing, bit depth is the other.
For maximum SDR flexibility, you need a wide-open front end (no pesky hardware filtering). This is required to meet the promises made by the SDR folks. So how many bits of depth do you need to pull Radio Wingwong out of the noise while parked three blocks away from Radio Luxembourg? Without bothering to do the math, probably about 32 bits. Maybe more. I know that 16 or 24 isn't going to cut the mustard.
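Doing a rough version of that math: an ideal N-bit ADC gives about 6.02N + 1.76 dB of dynamic range, so the required bit depth follows from the spread between the blocker and the wanted signal plus some margin. The dB figures here are my own illustrative guesses, not a real link budget:

```python
import math

def adc_bits_needed(spread_db: float, margin_db: float = 20.0) -> int:
    """Smallest ideal-ADC bit depth whose dynamic range (6.02N + 1.76 dB)
    covers spread_db plus margin_db of headroom/SNR allowance."""
    return math.ceil((spread_db + margin_db - 1.76) / 6.02)

# Assumed example: nearby blocker at -10 dBm, wanted signal at -120 dBm,
# i.e. a 110 dB spread, plus 20 dB of margin.
print(adc_bits_needed(110.0))  # → 22
```

Real converters fall well short of the ideal formula (effective bits, clipping headroom, spurs), which is how the back-of-envelope 22 bits turns into "about 32, maybe more" in practice.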
Humans will get there. Just not until about 2030.
If I'm not mistaken...
If I'm not mistaken (and I could be), the XCeive XC5000 already did everything the Intel guys claimed was impossible about five years ago (it was on the market by early 2007). It's a single-chip multi-standard TV tuner, basically an amp and a set of DSPs. Even the bandpass and notch filters are software defined (i.e. it can handle both 5 MHz and 6 MHz channels). Of course, it is not operating at microwave frequencies; nevertheless, I have the feeling the Intel guys may have spent a tad too much time in the lab as opposed to seeing what people are already doing.
How is this different from the GNU Radio project?
Is it just a speed thing or what?