This brings a smile.
One has to smile at this. Here, apps might be the reason, but as long as Microsoft remains obstinate about its Windows GUI, others will change it for them.
Perhaps this is only the beginning--Metro out, Android in!
Helium being what it is, the only way I can see such a drive working is if the drive's platter chamber is properly hermetically sealed. This would require a chamber similar in construction to that of a light globe or radio valve which has a fully sealed envelope. Electrical connections would enter the envelope and connect to the electrodes in the same manner.
Such a connection through the glass requires a special seal where the glass must properly wet the surface of the metal wire (this can be a manufacturing problem). Perhaps also connections could be done by electrical (capacitive/magnetic) coupling through the glass or whatever non-conducting material is used for the chamber. For instance, the motor stator could be on the outside of the chamber and the rotor on the inside.
Frankly, it'll be interesting to see what they come up with, as from my experience, keeping helium in any container is somewhat of a problem.
All I hope is that these drives will be more reliable than the current batch of failures that I've had over the last couple of years.
BTW, the shingle recording method reminds me vaguely of the recording system used in VCRs where the hi-fi stereo channel is laid down under the video tracks.
RF engineers and spectrum planners have known this for years! It just isn't possible to run LTE/4G and give everyone the full bandwidth of which these services are theoretically capable. Anyone who's involved in running the 'cellular show' who is not aware of this ought to be given the boot instantly.
The solution is simple but implementing it may be harder:
(a) fibre-to-the-premises plus a household WiFi wireless connection, and/or
(b) WiFi or micro-cell wireless from the local street fibre.
(c) All smartphones must have local WiFi/Bluetooth/micro-cell access in addition to normal cellular access.
(d) Smartphones would automatically and preferentially first connect to the local WiFi/micro-cell services before connecting to the cellular network.
(e) Regulators must insist that WiFi connections/access to the fibre be integral part of any cellular service/connection.
As it is, on a megabyte-for-megabyte basis, wireless is already outrageously priced compared to DSL services, so, if anything, cellular prices ought to fall. This will only happen if regulators/governments adopt sensible policies—and sensible policies should not involve allocating extra scarce spectrum unless strictly necessary.
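As a back-of-envelope illustration of that megabyte-for-megabyte gap (all prices and quotas below are assumed for illustration, not actual tariffs):

```python
# Effective per-megabyte price of a capped plan: monthly cost / quota.
# The figures are illustrative assumptions, not any carrier's rates.
def price_per_mb(monthly_cost, quota_mb):
    """Effective price per megabyte for a capped plan."""
    return monthly_cost / quota_mb

dsl = price_per_mb(monthly_cost=50.0, quota_mb=200_000)    # e.g. $50 for 200 GB
cellular = price_per_mb(monthly_cost=30.0, quota_mb=2_000)  # e.g. $30 for 2 GB

print(f"DSL:      ${dsl:.5f}/MB")
print(f"Cellular: ${cellular:.5f}/MB")
print(f"Cellular costs about {cellular / dsl:.0f}x more per megabyte")
```

Plug in your own bills and quotas; the ratio is usually an order of magnitude or two.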
Shame the Greenies aren't aware that the radio spectrum environment is also an environmental issue (perhaps then governments would listen—as they no longer listen to engineers, only accountants).
First, I did mean to say "electromagnetic force"; heaven knows why I said "electromotive force"—my former lecturers would be horrified (and that error's sometimes a trick exam question).
"Hope you didn't ask the aviation experts about this." Yes, I did actually. I won't bore you with the ins and outs of the discussion except to say that weight/fuel economy is a major/principal reason for fly-by-wire but it's not the only one—another compelling one is that pilots simply like using it (as most of us have become addicted to smartphones). And there's other reasons too.
Right, conductors do have free electrons, but they also have 'bound' ones. The reason the table my laptop is resting on remains solid, or why one bleeds after being bashed against a solid plate of conductive steel, is those 'bound' electrons—if you like, the electronic structure of matter. I beg to differ with you about the electromagnetic force: it is much weaker than the strong force and operates over a long (theoretically infinite) range, subject to the inverse square law. And that's the rub—because the electromagnetic force can induce currents at a distance from the source, apart from being why electrical stuff works, it underpins the problem of electrical interference. (Signals induced into TV antennae miles from the TV station illustrate the point—there they're wanted, but unwanted ones are induced the same way.)
Obviously, I'm not opposed to using electronic control systems—after all, that's my profession—but I've seen enough system failures, whether through interference or other electronic faults, to question whether ripping out a perfectly reliable and simple mechanical system only to have it replaced with a complex electronic one is the correct move. Often the answer is 'yes' but the question must be asked. Here's a simplistic illustration of how new problems arise: at present I've an annoying intermittent electronic fault in my car's dashboard and when it fails I lose everything. In older cars that I've owned when the speedometer cable broke, the simpler and less tightly coupled system meant that I still had functioning fuel and temperature gauges.
There's no doubt the electronics/control systems in the 787 are remarkable and a credit to the designers. The keyword here is reliability—key electronics consist of multiple/dual systems that function the same but which are different in design in that they've been designed by different teams/organizations under 'clean-room' conditions.
That said, things still go wrong with the best-designed electronic systems, and sometimes finding glitches, intermittent faults and interference sources—which in the case of aircraft are highly variable as they change their environments—can be incredibly complex, especially in a system as complex as a 787. (Those who've had a PC lock up for no apparent reason know the problem.) Moreover, there are other well-publicised instances where electronics that replaced critical mechanical (but historically very reliable) systems have failed and led to serious accidents. For example, it's still unclear why the electronics in the Toyota Prius accelerator failed (at any rate, the public is still in the dark).
There's a final reason why I reckon it's important to have fallback in critical systems (here a different technology—mechanical backup), and that's because it's essentially impossible to do a full state analysis (analysing/checking every permutation and combination of operation) on a system as complex as 787. A state analysis on even something as simple as a domestic VCR can fail through its complexity. Here's just one example: I'm aware of a mass-produced VCR with many hundreds of thousands of units in the field where it was eventually found to fail by pressing a certain combination of buttons—but the fault was only found years after it had been released!
Certainly fly-by-wire isn't going to disappear, but the lesson we should learn is that modern electronic systems are both new (in that we don't have a century or so of experience using them) and often extremely complex, and that sometimes they fail in unpredictable ways—in ways and at times that we least expect.
"A design too far perhaps?"
Perhaps it's an intrinsic problem with the technology (chemistry). I remember a time a decade or so ago when the operation I was with had to ship these lithium batteries by sea, as they were deemed far too dangerous to even ship by air!
So what happened? Seems to me that this is another instance of where regulators are being forced to compromise safety because of commercial pressures.
BTW, the use of fly-by-wire is the reason for the need for massive backup power—and fly-by-wire is principally an economic consideration. Electronics is my profession, so I have a suspicion of electronic systems used in super-critical environments because of their susceptibility to interference, failure etc. The question is why one would replace a well-understood mechanical system with an electronic one. After all, a mechanical system is made up of atoms—atoms whose electrons are tightly bound to the nucleus and thus extremely stable. In an electronic system, however, electrons are freed from atoms and are subject to the most ephemeral and easily disrupted of all the forces of nature—the electromotive force.
Just recently, in a lecture on the 787 fly-by-wire, I had the chance of putting this question to a group of aviation experts and, after their initial surprise, they essentially agreed that economic considerations were the principal and driving consideration for the change from mechanical controls to electronic fly-by-wire.
"But even so 1-2-3 was always better than Excel."
I agree. I base that on having used most of the biggies--1-2-3, Quattro, Supercalc, Visicalc etc. Goes back as far as Visicalc on my TRS80 and Supercalc on my Godbout S100.
As with VHS vs Betamax, being 'better' depends on marketing. And irrespective of what one thinks of its software, Microsoft is a superb marketing company. Moreover, the long delay in porting 1-2-3 to Windows did it much damage.
"RFID operates in unlicensed spectrum...."
In most jurisdictions where unlicensed spectrum exists, unlicensed use is only permitted up to a small, strictly controlled power limit--100mW to 1W or so depending on the service. To ensure effective jamming you may need well above this limit, and thus a licence.
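Regulators often quote those limits in dBm rather than watts. A quick sketch of the conversion (the 10 W jammer figure is purely illustrative, not a claim about any real device):

```python
import math

def mw_to_dbm(p_mw):
    """Convert power in milliwatts to dBm (decibels relative to 1 mW)."""
    return 10 * math.log10(p_mw)

# Typical unlicensed limits versus a hypothetical jammer's output.
for label, p in [("100 mW limit", 100), ("1 W limit", 1_000), ("10 W jammer", 10_000)]:
    print(f"{label}: {mw_to_dbm(p):.0f} dBm")
```

Every extra 10 dBm is a tenfold power increase, which is why "well above the limit" quickly means an order of magnitude or more.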
But as you say "Jamming may fall foul of other rules and regulations though", so there's the Catch-22--you won't be granted a license. Jamming has a long history and protocols have evolved with it. The ITU radio regulations specifically exist so that there is a peaceful coexistence between radio users and jamming goes against this ethos--hence licenses are unlikely to be granted.
It gets very messy from here. Some jurisdictions have 'secrecy provisions', which mean that if you hear/receive a signal you are not allowed to act upon it unless it's meant for you. This leads to complications such as police radar--you may not be allowed to own a radar detector, or there's the ludicrous situation where you are allowed to own one but not allowed to act upon its data--i.e. put your foot on the brake.
In summary, radio regs are a minefield in almost every jurisdiction.
1. It's highly unlikely that a bulk eraser or powerful electromagnet will do it.
2. I've tried with my Wiercliffe bulk eraser and the RFIDs just laughed at it--mine's the same as this one on eBay: http://www.ebay.co.uk/itm/Weircliffe-Bulk-Eraser-Model-6-Degausser-Ex-MOD-Army-SAS-Who-Knows-/320778826601 and this eraser is about as powerful as they come in normal circumstances.
3. It stands to reason that a bulk eraser will not work, as the RFID's antenna is made of copper wire or such and is very short relative to the wavelength of the eraser's 50/60Hz--which is thousands of miles! Erasers are designed to work on ferrous/magnetic materials--and copper isn't one.
4. To destroy an RFID you need to zap the junction of the transistor(s) connected to its antenna. The only easy way to get enough RF (Radio Frequency) energy with sufficient power to do this is with a microwave oven--it has the power and a wavelength short enough to couple into the RFID's antenna (about 12 cm).
5. You could always jam RFIDs from their detecting transmitter by swamping it with a separate local RF source on the same or similar frequency. However, in most jurisdictions this would probably be illegal (transmitting without a licence).
6. You could shield the RFID in a Faraday cage--that's an electrically conductive wrapping such as aluminium foil or metal box. This effectively shorts out the detector's radio waves and stops them getting to the RFID. Also, there are specialised electrically conducting cloths in which ID passes etc. (things that contain RFIDs) could be wrapped but I've not tried them for shielding effectiveness.
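Point 3 above is easy to verify: free-space wavelength is just the speed of light divided by frequency, and the mismatch between a mains-frequency eraser and a centimetre-scale antenna spans seven orders of magnitude. A minimal sketch:

```python
C = 299_792_458  # speed of light in vacuum, m/s

def wavelength_m(freq_hz):
    """Free-space wavelength in metres for a given frequency in hertz."""
    return C / freq_hz

mains = wavelength_m(50)          # bulk eraser's mains-frequency field
microwave = wavelength_m(2.45e9)  # domestic microwave oven magnetron

print(f"50 Hz eraser:  {mains / 1000:,.0f} km")
print(f"2.45 GHz oven: {microwave * 100:.1f} cm")
print(f"Ratio: {mains / microwave:.1e}")
```

The oven's ~12 cm wavelength is comparable to the RFID's antenna dimensions, so energy couples in efficiently; the eraser's ~6,000 km wavelength couples essentially nothing.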
"Shared-spectrum scheme with the Department of Defense"
I remember when Defense spectrum was totally sacrosanct, so this is a welcome change.
Nevertheless, I hope we don't go too far and bugger up the Spectrum with noise pollution for both Defense and everyone else. We've seen that starting to happen already with totally mad schemes such as BPL/PLC, Homeplug etc. Such schemes are much more likely to destroy the Spectrum environment than to offer any significant benefit to users.
Ahh, someone with sense at last but don't underestimate corporate marketing. This article is all about Sriram Peruvemba offering excuses not to change the status quo.
Remember, the status quo is almost the immovable object when corporate marketing gets involved. Remember Foveon?
Easy to say when you're an Anonymous Coward. Most Luddites were Anonymous Cowards too--they hid in the mobs that smashed the machines.
Shame there are so many consumers that will buy any crap that's put in front of them (but methinks you're probably a manufacturer in disguise).
"Somebody like the National Geographic is still not going to be happy with our product," Sriram Peruvemba, chief marketing officer of E Ink, etc..."
Perhaps I should show Sriram Peruvemba a Kodachrome slide, which National Geographic used for many decades from the 1930s onward--it set the standard in colour reproduction for over 50 years! Kodachrome looks wonderful with many vibrant colours; it was invented in 1934 and in production by 1937; once processed it uses no power except an external light source (even daylight works); and if well looked after and stored correctly it will last 200 years--and yes, it's old analog technology!
Methinks we should have a whip-around and buy Sriram Peruvemba and his research department a Kodak carousel projector with a few Kodachrome slides. On the gift card we'd write "Please start your product research here".
I'm fed up with crappy, tiny-gamut, wishy-washy, low contrast displays--and I know there's much better around--newer AMOLED technologies for instance.
I'm not a great ebook fan, as I still prefer the power of my laptop, so I've been searching for ages for a laptop with, say, a decent AMOLED display, but there are essentially none about. Why? Well, press reports say it's basically a marketing issue more than a technical one.
Damn them, something's obviously wrong with competition in the display dept. Too many cartels, or oligopolistic trading perhaps?
Easily and cheaply. Get a set of *old* forklift batteries for free (new ones cost a fortune). Forklift batteries are excellent as they've solid plates, and even when sulphated, abused and no longer suitable for a forklift, they'll deliver many, many kilowatts.
Of course, you have to know how to 'load' them into the system. Helps muchly if you are an electronics nerd or know one.
"It's particularly worrying that the best advice offered is repeatedly to simply update antivirus protection – far more expensive responses are needed."
It seems official secrets acts do more to protect the idiots in charge than to protect us from external threats. No wonder WikiLeaks has many of these turkeys frazzled--and they're set on WL's destruction.
Agreed. The issue, of course, is to get an agreed widespread standard. And as we well know, this is no easy matter. Just examine the history of the NTSC/PAL/SECAM wars of the mid-20th century for that.
My feeling is that we're going to have a lot of interim standards and it'll be a considerable time before things stabilize to the extent of NTSC/PAL/SECAM and/or 35/70mm film coexistence.
Irrespective, some experienced producers of film movies who have now gone to digital for convenience continually whinge and bitch about the lack of dynamic range, especially the white compression/clipping problem. In the old days of film, detail in the highlights could be extracted from the negative by the lab if the director wanted it; now it's clipped and thus does not exist (or is too compressed to use--causing banding etc.).
David, well said.
Frankly, I'm often horrified by the lack of quality that I see in hi-res video--even professional video [film-replacement] systems. Over-compression, compression artifacts, CODEC limitations, clipped highlights etc. etc. often make images look unnatural and artificial--in fact, to me, they often look quite horrible.
In many cases, going from analog to digital seems to have been an excuse to bypass many of the norms and standards which make for high quality images (and which the analog world took and still takes for granted). If one feeds say an optical--i.e. via camera lens--Pulse & Bar* & grayscale signal through the complete digital camera/recording/monitor chain and just views it--leave aside electronic measurement of the signal for the moment--the problems are glaringly obvious. One sees artifacts of all sorts--the classical (analog) overshoot, ringing etc. as well as digital noise to the extent that would have been unacceptable in professional analog systems. [Of course, I'm referring to a test signal (and distortion products) that's been appropriately scaled to the resolution and bandwidth of the specific system.]
"Show me a TV that can light up my room like an open window, while retaining near-zero blacks, and I'll start to get excited."
Right, despite the ooh-ah factor experienced by many video neophytes with HD video--and for that matter its many real benefits over older systems--digital imaging has a huge way to go before it represents a true analog of the image it's endeavouring to reproduce.
* The Macdiarmid / BBC TV Pulse & Bar 'T' test signal goes back to the early 1950s. It's old, but its design is still relevant, as it is based on the actual optical distortion perceived by a viewer after an image goes through any video chain/process. As the test signal is an analog for an image, its perceived optical distortion is what matters; it's irrelevant whether the medium is digital video, analog TV or even film for that matter.
Extra resolution is a highly desirable goal but a better priority would be to spend time improving both the gamma and dynamic range of both image sensors and displays.
1. Image sensors need a much better dynamic range than the current practical limit of about 10 stops before white clipping occurs. This would allow the camera electronics to properly simulate the Hurter–Driffield slanted-S (log exposure) curve of film, thus allowing detail to be extracted from the 'toe' [low light] and the 'shoulder' [highlights] of the image--still a major problem for electronic image sensors (television systems). At least 13 or 14 stops of dynamic range should be the short-term target for digital image sensors so we can enter the High Dynamic Range Imaging (HDRI) era.
2. Large hi-res extended-dynamic-range displays such as OLED etc. are urgently needed to display the extra dynamic range. (Leaving geometry and resolution aside, the best CRTs still look better than LCD displays when it comes to dynamic range, and so does the best film projected by a black-body radiator (tungsten filament) light source.)
3. The colour gamut also needs to be widened--look how pathetically limited the current sRGB triangle is on the CIE 1931 color space chromaticity diagram [see Wiki--color gamut]. (Perhaps we even need research into four-coordinate (2 greens) colour systems.)
Preoccupation with image resolution at the expense of gamut and dynamic range seems counterproductive and shortsighted. Moreover, in this digital age, we should not lose sight of how remarkably good a film negative can be when it comes to dynamic range--after all, it's had 150 years development (although the same cannot be said about film's limited colour gamut).
Remember your eye can accommodate (adjust to) a dynamic range of over 10^6 whilst the best LCDs barely make 10^3 (despite the advertising blurb)!
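The stop figures above convert directly to contrast ratios (each stop doubles the light), which makes the gap between sensors, displays and the eye concrete:

```python
import math

def stops_to_ratio(stops):
    """Contrast ratio spanned by a number of photographic stops (each stop = 2x)."""
    return 2 ** stops

def ratio_to_stops(ratio):
    """Number of stops spanned by a given contrast ratio."""
    return math.log2(ratio)

print(f"10-stop sensor : {stops_to_ratio(10):>8,.0f}:1")   # today's practical limit
print(f"14-stop target : {stops_to_ratio(14):>8,.0f}:1")   # proposed HDRI target
print(f"LCD (10^3:1)   : {ratio_to_stops(1e3):.1f} stops")
print(f"Eye (10^6:1)   : {ratio_to_stops(1e6):.1f} stops")
```

So a 10^6:1 accommodating eye spans roughly 20 stops, double what even a 10-stop sensor or a 10^3:1 LCD can represent.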
Then you can't refer to the data as IP/Intellectual Property. The concept of Intellectual <=> Property is a non-sequitur, 'Intellectual' being a metaphysical notion and 'Property' being stuff governed by the laws of physics.
Whilst you (the recipient of the book) mightn't own its words, the question of whether the author fully owns them either is a moot one. Why? Take a tune, for instance: once someone plays it, the composer ceases to have control over it, as listeners cannot suddenly have no knowledge or perception of it. And if they all suddenly lost their memories, there'd have been no purpose to hearing the tune in the first place--having no memory of listening makes nonsense of the event.
One can create a metaphysical concept [knowledge] but it's a nonsense to claim ownership of it in the sense that one owns a physical object.
For heaven's sake, this (the whole issue) is a stupid specious argument.
You can't give someone a file anyway--not unless you also give them the server! Ipso facto, by definition and the laws of physics, if someone gives you a file then it's always a copy as the magnetic domains that constitute the original file always remain 'locked' to the server where it was created.
In philosophy, copyright, IP, or whatever you want to call it, is a metaphysical concept. Translated into English that means 'above and beyond physics' [hardware].
This is sophistry!
(Nonetheless, the concept of copyright and sophistry being entangled together I find rather appealing.)
"contary to popular belief they are not illegal to own"
Going from 5mW to 1mW is truly Nanny State stuff, which I certainly don't agree with, but ready access to 1W devices shouldn't be possible. As with guns, lasers >200mW are potentially very dangerous and should be licensed, with the licensee appropriately trained (and such lasers kept under lock and key à la armoury procedures). Moreover, unlike a rifle, where every round fired is the result of a deliberate action by the shooter, a laser's continuous-wave output, combined with the fact that it can be swung through an arc, makes it particularly dangerous.
My concern is that now these devices are commercially available, those with ill intent will obtain them one way or another, and after a few people are blinded by the fuckwitted actions of others--and as sure as eggs they will be--the Nanny State will make it almost impossible for the legitimate user to gain access to them.
I've just tried to up-vote a facetious comment on the 'Jimmy Savile ringtones/iTunes' story via the Vodafone E585 modem and I had to terminate the connection and relaunch it. I still don't know if the up-vote worked or not--I'll check in a moment.
This Vodafone network is beyond any joke, even the most tolerant and magnanimous would consider it unacceptable!
There is no doubt that Vodafone's service in Australia is appalling. One of my services is a Vodafone wireless that uses a Huawei E585 modem, used within 6km of the Sydney CBD. The location is not unusual, nor is it unusually shielded by tall buildings; in fact there are cell phone antenna clusters only 0.3km away at the end of the street--almost line-of-sight but not quite (and another two about one km away).
The biggest problem is latency, and it's worse than being on a satellite circuit. I hit the Google icon on my taskbar and the browser will literally take between 10 and 30 seconds to connect. And that's if it connects at all; often one just terminates the connection and tries again until a connection is achieved. When a connection is established, Google's links can take an equally long time. (Doing the same exercise on the ADSL line, Google appears in under a second.)
On sites where the back-end has no effective bandwidth limit (i.e. where downloads from the site on a Telstra land-line ADSL consistently exceed 14Mbps), the Vodafone E585 wireless streams somewhere between 60 and 230kbps. The E585 modem is not the limiting factor, because at Sydney's Mascot airport when traffic is quiet (after it's closed) data rates of 500kbps can be achieved.
Even at say 200kbps, the network's wireless link is not the limit. Somewhere within the Vodafone network, Vodafone's servers/switches are polling and simply cannot deliver the connection to the outside world--the traffic block is there, not in the wireless link! Moreover, it's a consistent problem that's almost independent of the time of day--'tis even a problem at 3AM! No amount of replacing wireless equipment or radio antennae will fix that.
I for one would really like to know exactly what is going on behind the scenes at Vodafone Australia. We're given lots of info about improvements to the 3G service and no doubt that's happening but if Vodafone cannot connect to other service providers such as Telstra--which is essential as it holds/owns most of Australia's infrastructure including overseas gateways--then that's where the Vodafone bottleneck will be--and that's *not* solved by excellent 3G or even a 4G coverage.
El Reg reporters et al should try to get to the bottom of this problem. Seems to me that Vodafone is hiding its back-end problems. Potentially, they could be more problematic than the wireless link--in fact, in my case--they certainly are now!
With proprietary Microsoft hardware knowing APIs within Windows that other brands won't know about, thus giving it an advantage, other manufacturers might be forced to look around for a replacement O/S product.
After all, if you were a H/W manufacturer competing against Microsoft, you'd have to be pretty pissed off, methinks.
Simple really, I rang up the telco and just told them to disconnect both the internet and texting services and they immediately disconnected me from them.
Those services then didn't work, it was the same as having not paid the bill and being cut off--except the phone service still worked OK.
I could write pages on this topic, but hopefully I'll keep it to only a few paragraphs. What I find fascinating is my own ambivalent or even negative reaction to the use of this technology. I've been a nerd since I was a kid--even before the word was coined. I've always been fascinated by science and engineering, I've always worked in hi-tech environments, and I've often been an early adopter of technology--right on the bleeding edge. Yet I still cannot comprehend why such a huge percentage of the population is so utterly possessed by--truly addicted to--this technology. For ages, I've dubbed it 'electronic heroin'. Early on, had I had an inkling of such widespread addiction, then I might have capitalized on it and made a fortune. ;-)
On two occasions in recent months I've nearly killed someone when they've obliviously walked out in front of my car in the middle of a 4-lane thoroughfare whilst blithely texting--not to mention other similar accidents of which I'm aware. Whenever I sit in a coffee shop, public mall eating area, canteen, etc., I seem to be the only person not glued to a tiny screen. I'm one of the few who actually turns the mobile off before going into meetings and such--and on the extremely rare occasions when I've had to use a mobile in a restaurant, I physically go outside to do so.
In the past I've requested that telcos cut off both text and internet services from the mobile so only the telephone remained. And I object to the fact that even this is problematic: the once-maintenance-tool, now telco golden egg, the SMS, was never designed as a sophisticated customer-facing service, hence there are no protocols to return a 'not-received' or 'not-answered' error message to the sender if one has the service disconnected. Only today, in another El Reg post on Cassini being 15 years old, I speculated about how many orders of magnitude more expensive it is to send a byte via SMS across the room than it is for NASA to send the same byte from Saturn to Earth. Why is there seemingly no concern for the outrageous and extortionate cost of the SMS service--a service that telcos originally got for free? (As with the heroin addict, when one's hooked and going cold turkey, what one pays for one's habit is seemingly the least of one's concerns.)
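The across-the-room half of that comparison is easy to put numbers on. The 140-byte payload (160 seven-bit characters) is the GSM standard; the per-message price below is an assumed illustrative tariff, not any carrier's actual rate:

```python
# Back-of-envelope cost per byte of an SMS.
SMS_PAYLOAD_BYTES = 140   # GSM standard: 160 seven-bit chars = 140 octets
ASSUMED_PRICE = 0.25      # dollars per message (illustrative assumption)

cost_per_byte = ASSUMED_PRICE / SMS_PAYLOAD_BYTES
cost_per_mb = cost_per_byte * 1_000_000

print(f"Cost per byte: ${cost_per_byte:.5f}")
print(f"Equivalent per-megabyte rate: ${cost_per_mb:,.0f}/MB")
```

At those assumed figures an SMS works out to well over a thousand dollars per megabyte--which is the point of the Cassini comparison.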
If things continue along this track as they have for the past several decades, then that prediction of 2525 will be fact, if not sooner.
Nevertheless, as we're the ones not participating hell-for-leather as everyone else is, objectively it's hard not to conclude that it's the likes of you and me who are really the odd ones out. Trouble is, I'm still far from convinced as to why.
In an army, grunts grunt, but only amongst themselves. What warriors do for a living is war, and that's set by The State--or at least by their senior officers--thus they have stuff-all say in matters of policy, at least in public. Soldiers do not say "Oooh, the Somme, Verdun etc. look suss and overly dangerous today, so I'd better run off to tell the newspapers and politicians about it"; they do what they're damn well told without comment, or else.
But The State allows the opposite with police forces. It seems that every five minutes someone from the police is complaining about the lack of laws for this or that, or that some law should be tightened, or that judges are too slack: police POLITICAL complaints seem never-ending. Just as bad, politicians actually listen to their complaints and act on them, because it makes good copy for newspapers and other media if they don't.
What this amounts to is that we've a never-ending tightening of our laws--have you ever seen the volumes of law becoming fewer? Of course not.
It's time police services were neutral in matters of policy. If they can't take the heat then they should get out of the service--complaining to all and sundry shouldn't be an option.
Policy and law should ultimately be the responsibility of the citizenry, not police demanding better working conditions and playing on the fears of the public whilst so doing. Simply, the more the cops complain, the fewer freedoms we have in our democracies.
"Fiddling as LibreOffice speeds past"
You mean waddles past at a snail's pace, don't you?
"LibreOffice, which celebrated its second birthday last month, is now releasing updates on a regular basis"
Whilst better than OOo, most of The Document Foundation's LibO updates have been trivial, and important issues still aren't addressed. For instance, LibO's Writer still can't format documents as well as MS's 12-year-old Office 2000 (keystroke commands/shortcuts still aren't complete), which is very basic stuff to say the least. And LibO still relies on Java, not to mention that it doesn't handle damaged document files as well as Office does (it consumes all available memory and goes into ga-ga land, whereas Office at least opens the file).
LibO is better than nothing, but I still use an ancient MS Office in preference. And it's not for want of trying LibO; unfortunately Office is still better (I really wish it weren't so).
But then, one doesn't expect much for nothing these days, does one?
No matter how you dress up Ubuntu it's still Linux and not Win-32/64. Whilst there's nothing wrong with using Linux (I use it often), its very different architecture will put a spanner in the works of Microsoft/Windows-centric IT departments.
Everything will conspire against it being adopted, from service contract difficulties to utilities not working, to increased/additional training, to compatibility through to sheer prejudice on the part of IT staff, not to mention the usual difficulties in getting people to change etc.
Short of a complete rewrite of Linux from its core outwards to give it Win-32/64-compatible APIs--which hasn't a snowball's chance of happening, as it would be a monumental task, if possible at all--12/24 months from now it'll still be business as usual the Microsoft way.
"To be fair, we also did most of those things without the airplane, either."
...But certainly not WWII. My point was that society has organised truly ginormous events, and done so effectively, without the mobile/cell phone. If you actually want to know just how extensive aircraft use was in WWII, then check here (start at about 6 minutes in; exact reference at 06:38):
If you're really keen then you'd watch the whole video and see how the cellular radio concept was used decades before mobile phones became practically feasible.
Fail on item 2 perhaps, it's a matter of opinion.
But "Godwin's Law by association"? Really?
Come on, I've been accused of that in the past and pleaded guilty. But here not even the contorted mind of a Times crossword addict could forge a viable connection.