158 posts • joined 3 Jul 2009
Tbps not a useful measure
The lifetime of routers is set by the port density of their fastest interface. Quoting that, rather than aggregate inter-port Tbps, is a more useful measure of the awesomeness (or otherwise) of a router. Also useful is the maximum packets-per-second rate for small packets: this particularly matters where CPU and network processor designs are used, as that limit is usually reached well before the bps limit.
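As a back-of-envelope illustration of why pps bites first (the numbers are mine, not from the post): a minimum-size 64-byte Ethernet frame occupies 84 bytes on the wire once preamble and inter-frame gap are counted, which caps a 10Gbps port at roughly 14.9 million packets per second.

```python
# Back-of-envelope: why packets-per-second bites before bits-per-second.
# A minimum-size 64-byte Ethernet frame occupies 84 bytes on the wire
# once the 8-byte preamble and 12-byte inter-frame gap are included.
LINE_RATE_BPS = 10_000_000_000          # a 10GbE port
FRAME_ON_WIRE_BITS = (64 + 8 + 12) * 8  # 672 bits per minimum-size frame

max_pps = LINE_RATE_BPS // FRAME_ON_WIRE_BITS
print(max_pps)  # 14880952 -- ~14.9 Mpps the forwarding engine must sustain
```

A forwarding engine that can move 10Gbps of 1500-byte packets may still fall well short of the ~14.9 Mpps needed for a wire of minimum-size packets.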
What is cruft, what is security, and can the LibreSSL programmers tell the difference?
@bazza This issue had been fixed in the original OpenSSL code. I think it is reasonable for people to look at this bug and to have a concern that in "decrufting" the code people may be removing features which are actually essential and are not cruft at all.
Detail for four USB ports controller?
The Model B uses a LAN9516 for ethernet and USB. That chip has a fast ethernet controller and two downstream USB ports, all presented as being on a USB hub connected to an upstream USB port (which the Model B connects to the single USB port on the BCM2835 SoC).
So what controller are they now using to get the four USB ports? Or is there a distinct hub chip now?
NBNCo was founded in 2009. It is now 2014 and they still can't tell you what their product is.
The Liberal Party have -- by design -- turned the NBN project into a massive IT failure.
Why 25/50 when 40Gbps exists?
To answer some questions above, 40Gbps is implemented as four 10Gbps channels.
The cost of four lasers within a QSFP is obviously four times the cost of the one laser. But worse, where each laser is run over its own fibre (as must be the case for multimode fibre) the MPO/MTP connectors are expensive, fragile and almost impossible to clean and test in the field. Using 40Gbps ethernet has a high operational cost.
Using 25Gbps channels rather than 10Gbps channels halves the amount of cabling whilst remaining economic. Note that this is being promoted as a top-of-rack technology, so losing the single-mode 40Gbps benefit of being optically multiplexable by ITU-compliant 10Gbps passive WDM systems isn't a worry.
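A rough sketch of the cabling arithmetic (illustrative figures only): each parallel lane over multimode needs its own transmit and receive fibre, so fewer, faster lanes mean fewer fibres and cheaper, cleaner connectors.

```python
# Illustrative cabling arithmetic for parallel-lane multimode optics:
# every lane needs its own transmit fibre and its own receive fibre.
def fibres_needed(total_gbps, lane_gbps):
    lanes = total_gbps // lane_gbps
    return lanes * 2  # one Tx and one Rx fibre per lane

print(fibres_needed(40, 10))   # 8 fibres for 40G as 4 x 10G lanes
print(fibres_needed(100, 10))  # 20 fibres for 100G as 10 x 10G lanes
print(fibres_needed(100, 25))  # 8 fibres for 100G as 4 x 25G lanes
```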
Drupal's performance is fine. View Varnish, memcached, APC as part of the product.
To my mind there are two issues. Firstly, authentication and authorisation: if you stuff that up, then hey, you've turned defacement of one site into a denial-of-service against government.
Secondly, the software-as-a-service outsourcing. That contract will need to be carefully written because flash loads on government websites have to be met. 404-ing a heap of affected people trying to access an information page about a natural disaster isn't acceptable. Furthermore, the site might need to refuse some users whilst allowing others -- as happens with the CSIRO maps of bushfire activity: the same content is presented to the public and to bushfire controllers, but access by bushfire controllers supersedes best effort.
Similarly the physical location of the service matters. There's no use having all of the Australian government web sites on servers in Singapore or the west coast of the USA, because when the going gets tough that data won't be readily accessible. The web sites need high-capacity links to peering points on the Australian mainland.
These copyright-hungry journals are slowly harming themselves. My institution weights down copyright-hungry journals, simply because it prevents the institution (and the nation) re-using the very materials it paid to develop. Rather than have my first-class paper be weighted as if printed in a second-tier journal, I simply choose a top-tier journal without hungry copyright conditions.
Bit of a surprise to see a society publishing a copyright-hungry journal. You've really got to consider if that advances the society's goals.
Re: So just to clarify,,,
The certification is for the phone, not for the standalone software or hardware. It is just paperwork: multiple presentations of the one set of tests conducted by the phone manufacturer. If you import a device then you ask the manufacturer for their test pack and munge it into the format expected by the local regulator.
Measuring the wrong thing
The analysis confuses watts (instantaneous power draw) with amp-hours (the charge drawn to complete the task). The second is the more interesting number when the CPUs have different processing rates. I could readily believe that the Atom draws more watts, but also that it finishes the task sooner and can return to a quiescent state sooner than the ARM. So the question is: does Atom's behaviour use more amp-hours than ARM's on various workloads? Or more concretely: which will exhaust the battery sooner?
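A toy comparison with entirely made-up wattages and runtimes, just to show why energy-to-completion, not instantaneous draw, decides battery life:

```python
# Entirely made-up wattages and runtimes, just to show the shape of the
# comparison: energy used to finish the task = power draw x time to finish.
def joules(watts, seconds):
    return watts * seconds

atom_energy = joules(10.0, 50.0)  # hypothetical: hungrier chip, done in 50 s
arm_energy = joules(4.0, 200.0)   # hypothetical: frugal chip, takes 200 s

# The chip with the higher instantaneous draw drains less of the battery.
print(atom_energy, arm_energy)  # 500.0 800.0
```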
Common platforms, and where you might use this chip
Christian, ARM Ltd are aware of the need for a common platform for operating systems software, and have proposed "Server Base System Architecture" (SBSA) as a standard systems architecture in this space.
Where an AArch64 server chip fits is an interesting question, especially in the sort of quantity needed to make money on a chip. I rather see this as AMD putting their toe into the water, and I imagine that's how their initial customers will also be approaching the chip.
Where AMD with ARM could competitively take on Intel is in offering a system-on-chip for servers: that would have to be funded by a Google or Amazon, but they might see sufficient mainboard simplification to make that worthwhile.
Also, there's a considerable market for 64b ARM in network middleboxes and appliances. These are constrained by heat, so Intel has always been problematic. But ARM hasn't been an option due to the lack of 64b parts (ie, your middlebox can't have more than 2GB of RAM, which is pushing it if you need a full routing table with multiple routing instances).
If you use the popular MacPorts to make your Mac more like another operating system, then you might want to update the MacPorts-managed software.
What's wrong with this picture, and the general response to Heartbleed? Our servers are running software which may leak your private data. But *we will keep them running*.
In the contest between security and revenue, revenue wins.
I am paying for OpenSSL, via my Red Hat subscription
Firstly, there are middlemen here -- Red Hat Inc, Novell and Google. I pay Red Hat for my Linux and Google for my phone software; in return they should be paying people to produce the software they are selling to me. If OpenSSL aren't getting a cheque from Red Hat/SuSE/Google then I have some questions...
Secondly, there's the complexity of SSL/TLS itself. Whilst your article contacts the author (and kudos for that), I would be just as interested in an interview with the chair of the IETF area which published the specification in the first place. The small gain from allowing data in a response (to probe for MTU failures on non-TCP protocols) doesn't appear to me to justify the risk from a change to a security function. It's the chair's role to make that call.
Thirdly, there's C. We desperately need a new systems programming language. We've written enough applications programming languages to know what works and what doesn't (Java, Python, Lua, etc) but those languages simply aren't deployable in the systems programming space.
Finally, there seems to be a whole culture around security bugs which is simply broken. It's pretty much the task of the NSA to lead the response to this, and yet they seem to be the body most assumed to have known of the bug's existence without telling anyone. Not to mention that every bug is seen as an opportunity to sell stuff: create a consultancy, win a bug bounty, scare customers into buying products, scam the unwary, and so on.
Primacy of software
Could have had a little more about the primacy of software: IBM had a huge range of compilers, and having an assembly language common across a wide range of machines was a huge winner (as obvious as that seems today in an age of a handful of processor instruction sets). Furthermore, IBM had a strong focus on binary compatibility, and the lack of that in some competitors' ranges made shipping software for those machines much more expensive than for IBM's.
IBM also sustained that commitment to development. Which meant that until the minicomputer age they were really the only possibility if you wanted newer features (such as CICS for screen-based transaction processing or VSAM or DB2 for databases, or VMs for a cheaper test versus production environment). Other manufacturers would develop against their forthcoming models, not their shipped models, and so IBM would be the company "shipping now" with the feature you desired.
IBM were also very focussed on business. They knew how to market (eg, the myth of the 'idle' versus 'ready' light on tape drives, whitepapers to explain technology to managers). They knew how to charge (eg, essentially a lease, which matched a company's revenue). They knew how to do politics (eg, lobbying the Australian PM after they lost a government sale). They knew how to do support (with their customer engineers basically being a little bit of IBM embedded at the customer). Their strategic planning is still world class.
I would be cautious about lauding the $0.5B taken to develop the OS/360 software as progress. As a counterpoint consider Burroughs, who delivered better capability with fewer lines of code, since they wrote in Algol rather than assembler. Both companies got one thing right: huge libraries of code which made life much easier for applications programmers. DEC's VMS learnt that lesson well. It wasn't until MS-DOS that we were suddenly dropped back into an inferior programming environment (but you'll cope with a lot for sheer responsiveness, and it didn't take too long until you could buy in what you needed).
What killed the mainframe was its sheer optimisation for batch and transaction processing and the massive cost if you used it any other way. Consider that TCP/IP used about 3% of the system's resources, or $30k pa of mainframe time. That would pay for a new Unix machine every year to host your website on.
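The implied arithmetic, for the curious (a sketch from the figures above, nothing more):

```python
# If the TCP/IP stack used 3% of the system and that 3% cost $30k a year,
# the implied total cost of the mainframe's time was about $1M a year.
tcpip_share = 0.03
tcpip_cost_pa = 30_000
total_cost_pa = tcpip_cost_pa / tcpip_share
print(round(total_cost_pa))  # 1000000
```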
There's missing context here.
Sievers' treatment of systemd bug reports is poor: he usually closes them or points the finger elsewhere. For example the journal logging bug, which flooded messages to the syslogger, locking up systems upon reboot; or the shutdown bug, where a shutdown issued at the wrong moment prevented future logins after the reboot.
In both of those cases people were left with non-functioning systems, and repeated bug reports were closed until it was undeniable that the fault lay with systemd. This behaviour so delayed bug finding that users were left with unusable Fedora installations for weeks.
With the kernel issue he's simply struck a community which, informed by those previous issues, put its foot down promptly and firmly.
Arrrgh, you're right Eric. EndNote was the nightmare I lived through.
EndNote isn't popular with students for its note-taking; it is popular because it lays out bibliographies perfectly. That's a concern for students, as bibliographic referencing is their main defence against the claims of plagiarism made by automated essay checkers. The days when you wouldn't reference material well known to a practitioner in the field are well over (Illich's "Deschooling Society" had one reference). Lesser academics focus on the presentation aspects of bibliographies rather than their content, so EndNote is highly valued by students for its ability to churn out a variety of formats correct down to minor details of quotes/italics and comma/full stop.
The major competitor is Zotero. It's a fine product and well worth a look. It works quite differently -- being an extension to a web browser -- but the workflow of simply whacking the Z button every time you read something interesting works well and makes EndNote seem rather clunky.
Re: Support for the Locale!!!
The locale is still Australia, as locale isn't the same as the language you are writing in (think currency, date formats, etc).
The characters in Aṉangu and Yolŋu (including the Pitjantjatjara dialect) scripts aren't that rare and will be covered by most large Unicode fonts, including those in recent Windows.
In Mac and Linux you use the system keyboard configuration to alter the Compose key to produce Aṉangu and Yolŋu script. iPhone and Android need a keyboard definition. Windows XP was more complicated, and AuSIL and others have software for it. Windows 7 isn't too bad: you can use the system keyboard configuration to add a Compose key. There is a common set of composing keystrokes, so please don't make up your own.
An alternative might be to visit Aspitech in Adelaide on your way through (http://www.aspitech.com.au/) and grab some of their refurbished PC goodness. They are shipping with Win7 and Office at the moment. The people there have strong social aims and many people in the "community sector" find them a godsend.
All this shows is that the analysis company is behind the times. Let me count the ways.
1) Tablets. Where are they?
2) Phones. Where are they?
3) Laptops are the choice of people who need to create content and of people who need the cheapest computer possible (Chromebook). That explains the rise of MacOS as a proportion, as pure content consumers have moved to tablets. Also the market quantities are falling, and this pushes back through the percentages: simply put, MacOS users are wealthier, and so more able to afford both a tablet and a laptop.
4) Desktops and laptops no longer serve the same audience. Gamers want desktops. So you really need to pull out retail desktops as a distinct figure.
5) Sales figures undercount Linux on desktop and laptop. Web usage figures ignore the main revenue from Linux, which is from server and embedded use.
6) The retail and business motivations for purchasing computers have never been so different. Lumping them together doesn't give insight.
Insight is the point of collecting data. But this has been presented so that it gives no insight. In fact it is misleading: you'd think that Microsoft, with a 90% share of laptop+desktop computing, was doing fine. Presenting it as percentages is also poor: you get no idea of the huge shrinkage in sales.
I think the trick with DIY is to know when to step back. A good example is IP addressing: DHCP is here, it has worked for a decade, and yet look at the number of IT shops which still run a DIY IP address allocation service. Sigh.
Another thing is not to get caught up reinventing. Not reinventing should be the main reason for choosing a language. A line of code costs pretty much the same in every language, so you want the maximum amount of work done by the language so that you write the minimum LOC.
Finally the nature of DIY has changed. These days it isn't the IT Dept going its own way. It's the IT Dept participating in a worldwide open source project which sets the future standard. This means that DIY isn't a dead end, but just the future arriving sooner.
ISPs are moving away from NetFlow for accounting, as the current number of flows makes it too onerous, and router manufacturers are finally coming to the party with better customer accounting mechanisms.
So the notion that collecting IP flow metadata can occur at low cost is now wrong for the largest Australian ISPs.
The Sony Xperia Z1 Compact is close to the ideal corporate phone, although sadly at corporate pricing. If you are looking for an Android equivalent to the iPhone 4S then this would be it.
Computing is needed to understand the modern world
Folks, computing is the new mathematics. Just as the development of symbolic manipulation opened completely new fields of science, computing is now starting to do the same. You simply can't understand the modern world built by science and engineering without understanding computing.
Embedded computers are changing the hard elements of engineering too. You can't build a car without a computer. In a few years you'll have computers changing the gears on your bike.
This is my objection to the curriculum. As you can read above the curriculum is very focused on Information Technology. There's very little computing and very little about computing devices. It's all about how to use computing to do administrative tasks.
In short you'll end up with a classroom where everyone has a smartphone but no one can tell you it's anything other than magic.
Enterprise computing is like competing to be the new mainframe
Why the fuss about enterprise server applications? They might matter to VMware, but they are a diminishing proportion of computing. ARM AArch64 might be able to run enterprise applications, but why would an ARM chip manufacturer go up against Intel with a high-power, high-throughput chip when it could make more certain money with a low-power design?
What AArch64 will do is totally win the "appliance" space, as those little 1RU boxes which do useful things will have less power draw (and thus fewer heat issues, and thus be cheaper to design and own, and be more reliable). Those appliances pretty much all run Linux, or will.
AArch64 also has a decent run at a peculiar sort of desktop -- the space which used to be filled by the "IBM mainframe terminal". Low power -- with its reliability and a small size on the desk -- makes ARM more attractive than x86.
I doubt ARM has much hope in the cloud, as its performance per watt still trails x86 at maximum throughput. Remember that cloud servers are provisioned to be either at maximum throughput or off. If ARM is used it will be because cloud providers specify their own CPU, and obviously AArch64 is available for that, whereas x86 is not.
There's plenty of opportunity to make money with ARM servers without going for the hardest market first. The only attraction of enterprise is the large profits available due to their poor management of computing. But that very same poor management makes them averse to change.
Not in the business of updates
I've read here a few times that HTC isn't in the business of updates. I don't get that -- why aren't they in the business of updates? If they charged $20 they'd turn updates from a cost centre to a revenue centre, and they'd be getting money without all the trouble of a new hardware design. And it would encourage customers to buy a HTC phone rather than a carrier phone. I just don't get why HTC persists in the current economics of Android updates when they could change the system for a better result for themselves and their customers.
Re: Desk Clutter
Tom7, this is so small you'd mount it on the back of the monitor.
But the enterprise has gone...
There's no "going back to the enterprise" as a safe ground for Blackberry.
The reason is simple: the iPad. That device's sheer ubiquity has forced IT managers to do the previously unthinkable: accept unmanaged devices onto their networks. Sure, some networks still won't do that -- sometimes for valid security reasons -- but those networks accept a fall in productivity. If you're business IT -- arguing that computing improves productivity -- then not allowing personally-owned IT is a path to irrelevance.
There's also finance. Everyone has a smartphone. Why on earth would an enterprise want to issue its staff with one? You might do it as some sort of non-monetary salary. But the idea that key staff get a work mobile is one that beancounters are no longer keen on.
The result is that enterprises aren't as keen on Blackberry's special sauce as they were during the era of the standardised desktop operating environment.
No they can't. You have a contract which lists the items you paid consideration for, and that will list firmware updates.
However you would be surprised how few IT shops actually reflect all the contributing factors in the lifetime cost of ownership in their contracts.
I am sure HP will happily provide already-contracted firmware updates where it is required to, and happily collect an annual fee from the other 99%.
Re: Nice apologist article, Simon !
"...the indonesians consistantly send there boys over to us to collect classified information from us."
It's a bit naive to think that Indonesian spies are primarily interested in the activities of the Australian government. They are much, much more interested in the activities of Indonesian nationals in Australia. In short, you don't find them trying to tap the "secure blackberrys" of Australian politicians but intimidating people raising funds for West Papua and ensuring that Indonesian students studying at Australian universities know that their government is watching them.
I think that part of the anger of the Indonesian establishment towards Australia's spying activities is that this focus of Indonesian spying activities away from Australia's government has been shown up by the depth of Australia's penetration of Indonesia's government. Not once -- as during the East Timor crisis -- but now twice.
It also helps that Australian police forces have taken foreign government intimidation much more seriously in the past decade, a positive side-effect of the War on Terror.
Re: Radiation Monitoring
Required here in Australia. The sensors are typically mounted on the input hopper and on the forklifts. Where I live in Port Adelaide there was an incident recently where a forklift sensor alarmed and the quick thinking and selfless operator drove the forklift well away from the factory's buildings before running.
2600 participants isn't a huge MOOC; it's about the usual completion number for a typical course.
If this stadium looks like a vulva, then the average stadium looks like an anus.
Assange is living in Western Australia?
Surely the other political parties will contest his enrolment in the WA electoral roll. It's not like he will have spent even a night at his claimed domicile.
Simon, In the Apple II/BBC Micro era Australia used to have one of the best computing teaching resources in the Parks Computer Centre in western Adelaide. Unfortunately this was disbanded, but most of the staff are still around. There are also some outstanding computing educators. I would have thought that building upon their experience would be the approach to take, but I can't see that this has been done.
It would be well worth your time to track down a few of the old Parks staff and interview them about what works and what doesn't.
ID is pointless
Let's simply ink the fingers of people as they vote. No ID required. It's compulsory voting, so assuming that any adult presenting themselves with an uninked finger and matching a name on the electoral roll is valid is pretty good. In any case inking fingers is a lot better than presenting a fakable ID.
Not that the problem is large: the AEC estimate was that maybe 800 people voted twice.
Host key generation is more of a risk
The real risk is the generation of SSL host keys so early in the system's first boot that there is no source of entropy other than the hardware RNG. Worst of all, these weak keys are permanent.
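A toy sketch of why this matters (this is not real key generation, and the seed here is a stand-in for a starved early-boot entropy pool): seed a generator with predictable state and every machine derives the same "key".

```python
# Toy illustration only: with a predictable early-boot seed, two
# "independent" machines derive identical key material.
import random

def toy_keygen(seed):
    # stands in for key generation driven by a starved entropy pool
    return random.Random(seed).getrandbits(256)

predictable_boot_state = 12345  # e.g. a fixed counter, not real entropy
print(toy_keygen(predictable_boot_state) == toy_keygen(predictable_boot_state))  # True
```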
Midnight Oil hardly the only source
Des Ball wrote a very fine book on Pine Gap in 1988. It had huge press coverage at the time. The book was pretty much a summary for the public, so most of its facts were already known, not least through the Democrats tabling leaked papers in the Senate.
The Labor Party was pretty much down on Pine Gap after the interference of the CIA in Australian domestic politics in 1974/5. One of the surprises of the Hawke government was that it didn't close the base in response when it came to power in 1983. Rather it negotiated a treaty spelling out exactly the management and function of the facility. Needless to say, this upset the Oilz.
So yeah, not news.
Re: Eduroam, and similar
I'd also add that universities differ from business because: (1) Unis are in IT for the long haul. They're not put off by a half-decade-long project with international agreements and interoperability like Eduroam. (2) Academics are used to listening to and criticising proposals. So you get a good hearing, and then you get a bucket-load of encouraging criticism. Part of the reason for the quality of uni networks is the free review from people whose consulting rates are thousands per day. (3) Business simply doesn't operate at the same scale nor require the same availability. I've had businesses employing a few tens of thousands of people tell me they run a "big" network, whereas 10,000 users would be a quiet day for a uni network.
"the University core networks" -- no. The learning and research facilities are the core network. It's the administration networks which are non-core. That's the essential mindset difference between university and business computing.
The same is true of applications. You break some Oracle thingie used by administration, that's bad news. You break e-mail across the university, you're fired.
At universities BYOD is simply fact. It's not a "strategy" open to debate. Even non-IT staff will have a laptop, a tablet and a phone and will expect equivalent access to resources from all of them. The university may or may not own all of those devices. Students definitely don't want the uni to provide their IT -- although if the uni can arrange a hefty discount on a MacBook Air they'd be grateful.
The idea that you can limit access to administrative systems to a subset of platforms isn't a goer either. Just the other day I checked a student's recorded test mark from my phone (connected via Eduroam), whilst the student and I were discussing their progress. Business would call this "responsive customer service" and the more you tighten down the access to the admin systems the less responsive the staff can be.
Ubuntu, the Maralinga of Canonical's nuclear testing
So because Canonical has ambitions in the mobile phone market, they are going to once again use Ubuntu as the testing ground for their technology. Didn't we have enough of this when they re-did the user interface so it worked better on tablets? And on netbooks before that?
Here's a thought. You've already got millions of users who want a nice desktop and laptop operating system. How about keeping them happy?
Time-based labs? Pre-VM concept.
The whole notion that time-based licensing was suitable for product testing was always doubtful, and these days it is entirely wrong. The test VM forms part of the delivery of the service. It is an environment you can arc up when further testing of the deployed infrastructure is needed -- either to extend it, or to analyse a balls-up by exploring whether the killer issue was seen in testing.
So TechNet had to go. A better vendor would have replaced it with something better.
Re: Root password, sure, but why wasn't the data encrypted?
Encryption isn't a cure-all, a wand you can wave to solve problems of access to data. Firstly, encryption implies keys. If you are sending the document to thousands of people within the one organisation, and the attacker is within that organisation and has sysadmin rights... how long is the key going to stay secure? This is even true for PGP: in that case you scarf up everyone's keyrings as well as the data and attack the passwords used to secure the keyrings. Secondly, there's still nothing to stop the attacker from copying the data (should someone appear with a key later on). Thirdly, there's nothing to prevent traffic analysis -- for example, a lot of files suddenly appearing in the plans-to-attack-libya directory.
Encryption is an interesting two-edged sword. Take command-line access to a server on a secure network. Should that use SSH? Or should it be forced to use Telnet so that the exact session of the person connecting can be audited? As a result a lot of secret-level systems use less encryption than you would expect.
Disabling USB is difficult: you can't simply disable the controller, as there are interior USB buses within modern computers tying the components on the mainboard together. What you can do is refuse to mount USB media which hasn't been authorised. That's a bespoke SELinux rule for Linux, or a software hack for Windows. Neither is supported by the operating system's manufacturer, which is an issue for large installations.
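As a rough illustration of the "refuse to mount unauthorised media" approach on Linux, a udev rule along these lines (the path and serial number are hypothetical, and this is a sketch rather than a hardened policy) hides non-whitelisted USB block devices from the desktop automounter:

```
# /etc/udev/rules.d/99-usb-whitelist.rules  (illustrative path and serial)
# Hide any USB block device whose serial is not on the approved list from
# udisks, so the desktop will not offer to mount it.
ACTION=="add", SUBSYSTEMS=="usb", SUBSYSTEM=="block", \
  ATTRS{serial}!="APPROVED-0001", ENV{UDISKS_IGNORE}="1"
```

Note this only discourages casual mounting; a user with root can still mount the device by hand, which is why the "low fence" caveat below applies.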
I am not saying that people shouldn't try encryption and blocking access to devices -- a low fence is still a fence. But don't be surprised by the success of an attacker with abundant inside information and access.
In this case the technology is irrelevant. Let's say both the encryption and the USB were tight. The attacker was determined to leak and would have simply chosen another path. All we can do is force people into technologies with higher risk, such as cameras.
In focussing on these technical matters we're also ignoring the cultural -- the "why" of leaks. When you ask an organisation to act contrary to its mission the organisation betrays the people in the organisation most motivated by its mission. Having that betrayal of the individual by the organisation repaid by betrayal of the organisation by the individual is to be expected.
Re: Pi Power Supplies
The Pi does have a separate power input (as well as the micro USB): pins 2 and 4 (5V) and pin 6 (ground) on header P1, which take a regulated 5VDC at up to 1200mA.
kt, it's not just kilotonnes of TNT
Can I suggest the kiloteen: the data downloaded by one thousand bored teenagers using their mobiles to find something diverting enough to retain their attention.
Related: the megameme, the data of 1000 cat photos from Reddit.
ACCC says "Microeconomics? We've heard of it."
Oh dear, the ACCC can't see the competitive issues with QoS enabling tighter vertical integration? You'd expect the economic rationalists at the ACCC to be in favour of the free market, not promoting the use of technical measures in a way that increases customers' barriers to exit.
It's not the ACCC's job to solve the capacity planning issues of large carriers. The ACCC's job is to prevent the carriers from gouging consumers. Looks like there's been some regulatory capture over at the ACCC.
Commenters overcompensating with shotguns rather than sportscars
What is it with people and shotguns? If it's high enough to be regulated by CASA then you are shooting at an aircraft. All you need is one set of aviation laws with amped-up penalties after a decade of terrorism hype, one gung-ho police prosecutor who wants to leave a rural backwater, and one dim magistrate (odds are good: he's still in the sticks twenty years on) and you're off to do time.
When the police rock up and you're holding a shotgun and the air smells of gunfire, the cops aren't listening to you rail against those hypocritical greenies flying stuff over your head and destroying the peace and quiet of the countryside they're supposedly saving. The cops are going to be much more concerned with separating you from the gun, and wondering if all of your babbling indicates a dangerous mental state.
By far your best bet is to go and ask the people to stop harassing you with their strange plane. And if they don't, then ring the police. Then you're the injured local, they're the outsiders acting outrageously using a new form of trespass, and the police will try to solve your problem rather than wondering about the travel time for the special weapons squad.
Old, old attack.
It's not rocket science -- I described the correct configuration for AusCERT back in 1999 in response to the DDoS attacks we were seeing then. (Modify the "bogon" list for the newer "end of IPv4, so let's use every Class A possible" list of bogon networks.) See AL-1999.004 at http://www.auscert.org.au/render.html?it=80
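A minimal sketch of the idea using Python's stdlib, with a deliberately abbreviated bogon list (a production border filter would carry the full, current list and be implemented in router ACLs, not Python):

```python
# Minimal sketch of a bogon source-address check.
# The list here is deliberately abbreviated; a real filter tracks the
# full, current bogon list.
import ipaddress

BOGONS = [ipaddress.ip_network(n) for n in
          ("10.0.0.0/8", "172.16.0.0/12", "192.168.0.0/16", "127.0.0.0/8")]

def is_bogon(addr):
    """True if addr should never appear as a source on the public internet."""
    ip = ipaddress.ip_address(addr)
    return any(ip in net for net in BOGONS)

print(is_bogon("10.1.2.3"))    # True -- drop it at the border
print(is_bogon("192.0.43.8"))  # False -- routable address
```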
Questions for SDN articles
When reporting SDN could you please make it clear if: (1) the SDN is OpenFlow or proprietary; (2) if use has royalties or revenue splitting, or requires the customer to purchase special licenses; (3) if use requires the customer to load software which is atypical for enterprise switch deployment. Thank you.
Quasars used for accurate geodesy, which is essential for GPS
"quasars have magnitude above 14". You are thinking about visible light, in radio astronomy about 10% of quasars sit well above the noise floor of a modest radio telescope.
The GPS systems don't use quasars directly. Rather, quasars are the reference points for surveying the earth's position in space: you use two interconnected radio telescopes half the world apart to form a baseline and then measure the difference in the times of arrival of the quasar's signal, triangulating the position of the earth in space (the jargon term is "eVLBI geodesy").
This field of research was very interesting to the US Air Force during the Cold War as it was directly relevant to the accuracy of their ICBMs. That work continues to be used to accurately place GPS and surveillance satellites, which are flip sides of the same triangulation problem.
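A toy version of the geometry described above, with an illustrative 10,000km baseline (the figures are mine): the same quasar wavefront reaches the two telescopes at times differing by tau = (B/c)·cos(theta), where theta is the angle between the baseline and the source.

```python
# Toy geometric-delay calculation for a two-telescope baseline.
# tau = (B / c) * cos(theta), theta being the source-to-baseline angle.
import math

C_M_PER_S = 299_792_458.0
BASELINE_M = 10_000e3  # ~10,000 km, roughly "half the world apart"

def geometric_delay_s(theta_deg):
    return (BASELINE_M / C_M_PER_S) * math.cos(math.radians(theta_deg))

print(geometric_delay_s(0.0))   # ~0.0334 s, the maximum possible delay
print(geometric_delay_s(90.0))  # ~0: source perpendicular to the baseline
```

Measuring that delay to picosecond precision across many quasars is what pins down the telescope positions, and hence the reference frame the GPS orbits are expressed in.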