Smartphones nowadays come with big screens, megapixel-packed cameras and, thanks to apps, many, many more features than anyone could have dreamed of in the early days of mobile telephony. It has even reached the stage where making telephone calls is just one small part of a modern phone. And yet the need to support all the radio …
The RF side of things is more important/difficult than rounded corners on icons?
Yep, much more important, and surely Apple will be an early adopter. It ensures, if the fixed battery doesn't, that one's pride and joy will wear out and need replacing just as the new model comes on line. Brilliant!
Sent from a nearly two-year-old San Francisco. Remember them?
was that traditional radios were known and "good enough". Those of us who aren't yoof will remember GSM with its single band, then dual, then tri, then quad... These were selling points: I could look up the countries I was visiting and work out whether buying a new phone would let me roam there. Now I have no idea how many bands my last few phones supported - the things just 'work' and it's no longer an issue.
With LTE coming, and its massive range of frequencies in use around the world, we're suddenly back to ye olde issue of "my phone's not going to work there". Plus manufacturers are having to churn out different models. Having a way of 'making this problem go away' is definitely something I'll look for when I make the LTE switch.
No, your phone will work there.
Oh, you meant your phone's apps that rely on LTE.
Like being able to download a movie, or watch TV clips that are not stored on your phone.
Surely RF-MEMS is just a stop-gap, then, until software-defined radio (SDR) takes over. SDR is basically just connecting a "good enough" antenna over a range of frequencies to a chip that does high-speed ADC and analysis of the data to receive multiple frequencies simultaneously. It's real, in use, and works now. You can even find websites that let you monitor the whole of the radio spectrum in their area, at the same time as thousands of other people do the same, from a single device.
Not only that, they can transmit too, they are eminently adjustable, and they work well with things like FPGAs to make them configurable. There are already plug-in modules to read things like 802.11 WiFi, Bluetooth, etc., and even DVB-T from the data they produce (i.e. effectively software decoders/encoders for those protocols given raw radio data). And the biggest software project for them is GNU Radio, with open code.
SDR seems to have so many advantages over this technology - for instance, not having to constantly "switch" between frequencies but just "hearing" them all, all the time - that I can't see a real market for this. SDRs will inevitably come down in price and do more, so will only get used more. But this is a niche product that will only survive as long as something better doesn't come along (and you don't get much better than an SDR).
An SDR is more than capable of receiving and transmitting FM radio, DVB-T, Bluetooth, WiFi, NFC, GSM, 3G, 4G, and whatever else comes along, all at the same time - which is basically a phone's job. And when new wireless technology X comes out, it *is* nothing more than a software upgrade to support it (i.e. you could quite literally add support for "5G" with a firmware upgrade even if the standard didn't exist when the phone was built). With SDR, the only limit is CPU time - and that's something that's growing in the mobile market just as fast as anything else. Hell, the CPUs in modern phones would have been only a dream even five years ago.
Ignore this tech. Push your money into SDR. And then you can do whatever you want, whenever you want, and literally your hardware phone becomes software-customisable to become anything from a TV to an FM radio to a CB transmitter.
> Surely RF-MEMS is just a stop-gap, then, until software-defined radio (SDR) takes over. SDR is basically just connecting a "good enough" antenna over a range of frequencies
I don't think they are the same thing; RF-MEMS is a software defined antenna rather than a software defined radio. You would want to hook the two together though....
I bought an 'expensive transceiver' SDR radio from eBay (Hong Kong), which was around £30 including postage; I haven't yet taken the radio apart to look for any microelectromechanical stuff. The specs indicate that it is based on the RDA1846 SDR chipset and is a mainland Chinese Baofeng clone of a Yaesu handset. Except it works in many ways better than the 'traditional' Japanese analogue radio, and costs but a fraction!
The SDR chip inside my radio is last year's technology. The new Wouxun car-mobile HF/VHF transceiver is based on the Freescale MC13260 SoC, a "chipset with integrated 32-bit ARM-9™ processor". These radios are not being built to sell to us in the west but to satisfy the massive Chinese internal market for flexible terminals. SDR is here!
When these new SDR radios are hacked/properly marketed, as they inevitably will be (look at GNU Radio on DVB-T SDR receivers), the traditional radio markets - EU/US/JPN businesses - will be as dead as a Norwegian Blue dodo!
This might happen soon:
You want a new Airwave terminal rollout - best offer will be SDR from TYT.
replacement for Bowman - Quansheng crypto SDR is the winner
Tetrapol repeater - why not use the SDR terminal market leader Puxing
Power consumption? I don't know much about either technology, but wouldn't a software radio consume more power and take up more physical space?
You may as well say, ignore BluRay technology, because CPUs are getting more cores.
SDR *decodes* the signal. Antennae *receive* the signal. SDR won't magically improve the signal-to-noise ratio at your analog stage. SDR and RF-MEMS are complementary technologies, not competing technologies.
As for the rest of your software-defined kool-aid:
"SDR seems to have so many advantages over this technology - for instance, not having to constantly 'switch' between frequencies but just 'hearing' them all, all the time - that I can't see a real market for this."
SDRs heterodyne the frequencies from "very very high" down to "low enough to sample". The ADC bandwidth allowing you to not heterodyne isn't *quite* there for 2.4GHz and above - I'm not aware of a consumer-grade 5Gsps ADC. And sampling the entire signal is very wasteful of power, if you don't need to.
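The arithmetic behind heterodyning is simple: mixing the incoming RF with a local oscillator produces sum and difference frequencies, and you keep the difference. A minimal sketch (the 2.44 GHz carrier and 2.37 GHz LO figures are purely illustrative):

```python
def intermediate_freqs(f_rf_hz, f_lo_hz):
    """Mixing two signals yields their difference and sum frequencies."""
    return abs(f_rf_hz - f_lo_hz), f_rf_hz + f_lo_hz

# Heterodyne a 2.44 GHz signal down with a 2.37 GHz local oscillator:
diff_hz, sum_hz = intermediate_freqs(2.44e9, 2.37e9)
print(f"IF = {diff_hz / 1e6:.0f} MHz")  # 70 MHz - easily within reach of an affordable ADC
```

The sum product gets filtered off; the 70 MHz difference is what the ADC actually samples.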
"Not only that, they can transmit too"
This all depends on the analog stage of the SDR. Most SDRs don't transmit, because you need a license for most bands, and it is *extremely* dangerous to provide a transmitting SDR if there's *any* danger of it accidentally transmitting on the wrong band. Dangerous as in putting lives at risk. Fully user-configurable software-defined radio transmitters are never going to be available to consumers, sorry.
"Ignore this tech. Push your money into SDR."
Total non sequitur. I recommend you study RF analog stage design, Fourier's theorem, heterodyne theory, and the kinds of ADCs that are cheaply available, just to at least get a *flavour* for this subject, before writing off an antenna tuning technology on the basis that it's not your favourite signal decoding technology.
Thank you! I was cringing reading the earlier commentary of how SDR is doing it all, right here and right now.
Seconded! The OP has confused two different parts of the radio. It doesn't matter how good your decoder is, the channel capacity is limited by the bandwidth and the signal-to-noise ratio. (Shannon–Hartley theorem). You need a decent antenna to feed as much signal, and as little noise, as you can to your LDPC codes, or whatever. You can do this by tuning the antenna to the frequency band you are interested in, which is what these MEMS parts are trying to do.
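To put a number on what Shannon-Hartley costs you: capacity scales with bandwidth, but only logarithmically with SNR, so a badly tuned antenna that bleeds away signal hits you hard. A quick sketch (the 20 MHz channel width and SNR figures are just illustrative):

```python
import math

def shannon_capacity_bps(bandwidth_hz, snr_db):
    """Shannon-Hartley limit: C = B * log2(1 + S/N), S/N as a linear ratio."""
    return bandwidth_hz * math.log2(1 + 10 ** (snr_db / 10))

# Same 20 MHz channel; losing 10 dB of SNR roughly halves the capacity:
print(f"{shannon_capacity_bps(20e6, 20) / 1e6:.0f} Mbit/s")  # ~133 Mbit/s at 20 dB
print(f"{shannon_capacity_bps(20e6, 10) / 1e6:.0f} Mbit/s")  # ~69 Mbit/s at 10 dB
```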
Software-defined radio allows you to output a huge range of frequencies and modulations to the antenna using the same RF hardware, but the antenna is still governed by the laws of physics. And if you're trying to put a frequency out via an antenna that's not designed for it, your signal won't get very far.
"hardware phone becomes software-customisable to become anything from a TV to an FM radio to a CB transmitter"
I hate to break the bad news, but with the tiny little antenna in a phone you'll be lucky if that 27 MHz CB signal makes it to the bottom of your garden, never mind the next town.
I do SDR for a living, and you are wrong. There are 3 things that prevent your idea from working:
1) Not enough bits on the converters. To have an SDR in which the antenna dumps into the converters, you need very fast converters - to cover all the bands a phone might need you'd be looking at gigasamples per second - hundreds of gigasamples per second to do 60GHz. The best gigasample converters are around 12 bits. The dynamic range just isn't there - a strong signal in the FM band will swamp the converter and prevent it from hearing that weak signal in the cellular band. Yes, you get some processing gain as you band-limit the signal in the digital domain, but if the desired signal is too small, you still won't be able to resolve it.
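To put numbers on that: an ideal N-bit converter's quantisation-limited SNR is about 6.02N + 1.76 dB, so even a perfect 12-bit part gives you only ~74 dB between full scale and the noise floor. A quick sketch:

```python
def ideal_adc_snr_db(bits):
    """Quantisation-limited SNR of an ideal N-bit ADC, in dB: 6.02*N + 1.76."""
    return 6.02 * bits + 1.76

# A strong FM broadcast signal can sit 80-100 dB above a weak cellular
# signal at the antenna, so the weak one disappears below the LSB:
print(f"12-bit: {ideal_adc_snr_db(12):.1f} dB")  # 74.0 dB
print(f"16-bit: {ideal_adc_snr_db(16):.1f} dB")  # 98.1 dB
```

And real converters at gigasample rates fall well short of even these ideal figures.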
2) not enough processing power. Even if you ignore the above, it takes a lot of processing power to handle a gigasample/second signal. Your phone will last just long enough to open the socket before draining the battery.
3) Noise. Sorry, we live at about 280K - and the noise floor is such that a wideband converter will see so much thermal energy that it will be swamped no matter what. Remember point 1? You don't even need a strong FM signal in the area; the thermal noise from the phone itself will be enough. Unless you want your phone's 500 kg battery (remember point 2?) to be even heavier, to run the cryopumps keeping your phone's front end at 2K.
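The thermal noise floor is just kTB, which is why bandwidth is the enemy here. A quick sketch (290 K is the conventional reference temperature; the bandwidths are illustrative):

```python
import math

BOLTZMANN = 1.380649e-23  # J/K

def thermal_noise_dbm(bandwidth_hz, temp_k=290):
    """Thermal noise power P = k*T*B, expressed in dBm."""
    return 10 * math.log10(BOLTZMANN * temp_k * bandwidth_hz / 1e-3)

# A narrow, band-limited channel vs. a wide-open front end:
print(f"{thermal_noise_dbm(200e3):.0f} dBm")  # ~-121 dBm in a 200 kHz GSM channel
print(f"{thermal_noise_dbm(5e9):.0f} dBm")    # ~-77 dBm across a 5 GHz front end
```

Opening the front end from one channel to the whole band raises the noise power by some 44 dB before you've even started.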
Even the best SDR on the market has a few analog IF stages to bring the signal down in frequency to something more reasonable, and to band limit it to bring the noise floor down, and to decimate the sample rate to something that can be processed reasonably.
tl;dr: - you still need to have a tunable antenna, SDR isn't a panacea.
Still using the old monkeys and typewriters approach to writing I see.
How many frequencies does a phone need to use simultaneously?
If it's working as a WiFi->4G access point, and carrying a Bluetooth->3G voice call, that's four (assuming WiFi is on 5GHz, not sharing 2.4GHz with the Bluetooth).
Can the RF-MEMS switch fast enough, or will the phone still need several antennae?
Not to mention wanting to scan other frequencies for potentially stronger signals.
Hi. RF-MEMS does band switching in the low microseconds, which isn't fast enough for transmit/receive functions. However, a separate T/R switch is used as well as RF-MEMS for band switching. An end user won't notice the band being switched (in theory, of course :o)
Hardware switching is preferred at all levels of the stack due to the processor load required for software switching.
Core and access networks use MEMS in optical networks for band drop or pass. They are also used in medical applications for similar purposes.
If it's reaching the stage where it's becoming harder and harder to support traditional voice calls, why not scrap it altogether, allocate the bandwidth to more data and have phones communicate via VoIP instead? This isn't a troll attempt, I'm just curious (and ignorant). =)
VoLTE (Voice over LTE) is pretty much what you describe: packet-switched voice telephony (3G is circuit-switched for voice, but packet-switched for data).
The problem is that you can't just decide in isolation to move to a different voice technology. There are billions of 2G and 3G voice users, and their phones all need to operate with the "new" system - a voice network with nobody to talk to isn't much use. This translation between protocols has to happen within the network, but right now there aren't enough LTE handsets in circulation to make VoLTE a pressing concern for operators, and because LTE is fully packet-switched, not circuit-switched, there's more to it than just adding codec support in the existing network.
Plus, LTE is not universally available - you will always need fallback to UMTS (3G) or GSM, so those antennae are not going away any time soon. That brings up the non-trivial problems of managing handover between VoLTE and circuit-switched UMTS/GSM during a call -- you cannot cut customers off mid-call for something as trivial as a bearer change: voice telephony, even on mobile (despite the disclaimers) is a safety-of-life service.
So, basically, yes, you could just adopt VoLTE only with no fallback, but then you'd be limited to speaking to people with the same type of handset as you, living in urban areas only, and not even Apple's customer base is that insular.
VoLTE will be the 4G standard for mobile networks and is a VoIP technology. At present it isn't rolled out, for various reasons: interop, user experience, OAM tools, etc. Most operators are still using circuit-switched fallback for voice. Full VoIP will happen, but it will take quite a few years to reach mass adoption. Fair question, though.
Cheers Kristian - you beat me to the answer :o)
The root cause of the Apple Antennagate fiasco is as follows: A monopole antenna has a low impedance feed point at one end. The other end is open circuit, and thus very high impedance. You don't want to touch the high impedance end because (being high impedance) it'll be maximally sensitive at that point.
The main problem is that if you are receiving a signal, you have to listen on all the frequencies it could come in on. It's all well and good, once the connection is established, to fine-tune in to prevent drop-outs, but wide-spectrum listening is still a problem if you need to receive calls.
It's all very well to match the impedance of an antenna to a radio's input/output stages, but anything less than a resonant antenna of the correct length for the frequency in use will waste power in the matching system.
I can easily match the output of my 100W HF transmitter to one of my VHF whip antennas, but all I'm likely to get for my efforts is a hot antenna tuning unit.
If your carrier buys spectrum and adds new LTE bands after your phone was designed, you may be able to add them via a software upgrade. Today you're just SOL, and may struggle to get signal in a spot where you could have 5 bars if your phone handled the new band.
With all the chopping up of retired TV frequencies and demand for mobile bandwidth skyrocketing, this is something that will become a real issue in some areas in a few years.
Semiconductor devices to change the impedance of antennae have been around for *decades*; they're called "varicaps". The trouble is they mix the control signals (relatively high power) in with the input signal (potentially *much* lower power), meaning the front end has to be "blunted" to handle the power bleeding through, losing sensitivity.
RF Micro-Electro-*Mechanical* Systems can potentially isolate the control signals *completely* from the input signal. An example would be switching in different values of (on-chip) capacitors to match the aerial inductance. The switches can be in the form of "fingers" electrostatically pulled into a pit in the surface of the chip, like an integrated reed relay. The size makes it fast, the design gives *much* better isolation than conventional MOS transistor-based switches, and with no *active* devices in the signal path at this point, *potentially* much wider bandwidth. That is a generic use of MEMS for this task. Other architectures can be much better tuned to the application; indeed, "tuning fork MOS transistors" were first developed in the mid-1960s, but never went anywhere at the time.
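Switching capacitor values to retune the aerial is just moving the resonance of an LC tank. A back-of-the-envelope sketch (the 10 nH inductance and the capacitor values are made up for illustration, not taken from any real part):

```python
import math

def resonant_freq_hz(l_henry, c_farad):
    """Resonant frequency of an LC tank: f = 1 / (2*pi*sqrt(L*C))."""
    return 1 / (2 * math.pi * math.sqrt(l_henry * c_farad))

# Switching a bank of on-chip capacitors across a fixed 10 nH aerial
# inductance sweeps the resonance across the cellular bands:
for c_pf in (1, 2, 4, 8):
    f_mhz = resonant_freq_hz(10e-9, c_pf * 1e-12) / 1e6
    print(f"{c_pf} pF -> {f_mhz:.0f} MHz")
```

Each doubling of capacitance drops the resonant frequency by a factor of √2, so a modest capacitor bank covers a wide tuning range.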
I'll note SDR gives *flexibility*, but it's a rule of the ASIC game that a single-purpose chip *optimised* to its task will beat the general-purpose micro, *especially* on power consumption. Something to keep in mind as the number of things running on your phone multiplies, and people have an annoying tendency to want a battery life measured in fractions of a day at least.
BTW, for the argument that you cannot get 60 GSPS ADCs: you need to remember that is the *carrier* frequency. The actual *signal* bandwidth is much smaller. This fact is, AFAIK, the key to how it has been possible to view analogue TV on Linux PCs running at c. 400 MHz. It's called "undersampling" - though having a (*very* well shielded) local oscillator to bring the signal down to baseband would probably help quite a bit too.
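Undersampling works because a band-limited signal above Nyquist folds down to a predictable alias frequency. A quick sketch of the folding arithmetic (the 900 MHz carrier and 61.44 Msps rate are just illustrative figures):

```python
def alias_frequency_hz(carrier_hz, sample_rate_hz):
    """Apparent frequency of an undersampled carrier after aliasing."""
    f = carrier_hz % sample_rate_hz
    return f if f <= sample_rate_hz / 2 else sample_rate_hz - f

# A ~900 MHz carrier sampled at 61.44 Msps folds down into the first
# Nyquist zone - provided the analogue front end band-limits the input
# first, so nothing else folds down on top of it:
print(f"{alias_frequency_hz(900e6, 61.44e6) / 1e6:.1f} MHz")  # 21.6 MHz
```

The catch, as noted above, is that *everything* at the antenna folds down the same way, which is why the band-limiting filter in front of the converter is not optional.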
IANRFEng, and I'm sure others here know a lot more, but if you didn't know any of this you'd probably read the article and think "So what?"
Just a comment on one throwaway line in the story...
"Were punters happy with ever-fatter phones ... but they’re not - they want thin, pocket-friendly devices."
Is that true? A fat phone would be able to include a better camera and a bigger battery - doesn't sound like a bad thing to me. My digital camera goes everywhere in a shirt pocket and needs to be fat to accommodate a good lens. My Android with GPS and data turned on dies after about four hours - not much use for route-finding on a day's hiking in the hills. Maybe what I need is a camera that works as a phone (and GPS) rather than a phone that acts as a camera?
Not alone there; the first "mobile" phone I used had a lead-acid battery and shoulder-strap, which was a bit much, but I've never complained of a handset being too big since then - too short a battery life, too small a camera lens, too small a screen, but never too big.
Now, a toughened waterproof double-thickness iPhone/S3 with four times the battery life - that, I'd want. Sacrificing anything else to make it thinner/smaller? Forget it.
Agreed. I was also annoyed by Smith's throwaway line "No one, after all, wants to go back to extendible external aerials". I'd be perfectly happy to have a phone with an extendible external antenna, if that improved reception. It's a tool, damn it; I want function, not appearance.
Now, I'm perfectly aware that it's unlikely the market would support a phone with an extendible antenna; I don't expect other people to share my opinions. But I don't need Reg writers telling me what I think, either.