148 posts • joined 3 Jul 2009
If you use the popular MacPorts to make your Mac more like another operating system then you might want to update the MacPorts-managed software.
What's wrong with this picture, and the general response to Heartbleed? Our servers are running software which may leak your private data. But *we will keep them running*.
In the contest between security and revenue, revenue wins.
I am paying for OpenSSL, via my Red Hat subscription
Firstly, there are middlemen here -- Red Hat Inc, Novell and Google. I pay Red Hat for my Linux and Google for my phone software; in return they should be paying people to produce the software they are selling to me. If OpenSSL aren't getting a cheque from Red Hat/SuSE/Google then I have some questions...
Secondly, there's the complexity of SSL/TLS itself. Whilst your article contacts the author (and kudos for that) I would be just as interested in an interview with the IETF chair of the area which published the specification in the first place. The small gain from allowing data in a response (to probe for MTU failures on non-TCP protocols) doesn't appear to me to justify the risk from a change to a security function. It's the chair's role to make that call.
Thirdly, there's C. We desperately need a new systems programming language. We've written enough applications programming languages to know what works and what doesn't (Java, Python, Lua, etc) but those languages simply aren't deployable in the systems programming space.
Finally, there seems to be a whole culture around security bugs which is simply broken. It's pretty much the task of the NSA to lead the response to this, and yet they seem to be the body most assumed to have known of the bug's existence but not to have told anyone. Not to mention that every bug is seen as an opportunity to sell stuff: create a consultancy, win a bug bounty, scare customers into buying products, scam the unwary, and so on.
Primacy of software
Could have had a little more about the primacy of software: IBM had a huge range of compilers, and having an assembler language common across a wide range of machines was a huge winner (as obvious as that seems today in an age of a handful of processor instruction sets). Furthermore, IBM had a strong focus on binary compatibility, and the lack of that in some competitors' ranges made shipping software for those machines much more expensive than for IBM.
IBM also sustained that commitment to development. Which meant that until the minicomputer age they were really the only possibility if you wanted newer features (such as CICS for screen-based transaction processing or VSAM or DB2 for databases, or VMs for a cheaper test versus production environment). Other manufacturers would develop against their forthcoming models, not their shipped models, and so IBM would be the company "shipping now" with the feature you desired.
IBM were also very focussed on business. They knew how to market (eg, the myth of 'idle' versus 'ready' lights on tape drives, whitepapers to explain technology to managers). They knew how to charge (eg, essentially a lease, which matched a company's revenue). They knew how to do politics (eg, lobbying the Australian PM after they lost a government sale). They knew how to do support (with their customer engineers basically being a little bit of IBM embedded at the customer). Their strategic planning is still world class.
I would be cautious about lauding the $0.5B taken to develop the OS/360 software as progress. As a counterpoint consider Burroughs, who delivered better capability with fewer lines of code, since they wrote in Algol rather than assembler. Both companies got one thing right: huge libraries of code which made life much easier for applications programmers. DEC's VMS learnt that lesson well. It wasn't until MS-DOS that we were suddenly dropped back into an inferior programming environment (but you'll cope with a lot for sheer responsiveness, and it didn't take too long until you could buy in what you needed).
What killed the mainframe was its sheer optimisation for batch and transaction processing and the massive cost if you used it any other way. Consider that TCP/IP used about 3% of the system's resources, or $30k pa of mainframe time. That would pay for a new Unix machine every year to host your website on.
There's missing context here.
Sievers' treatment of systemd bug reports is poor, usually closing them or pointing the finger elsewhere. For example the journal logging bug, which flooded messages to the syslogger, locking up systems upon reboot; or the shutdown bug, where shutting down a system whilst it was already shutting down prevented future logins after the reboot.
In both of those cases people were left with non-functioning systems, and repeated bug reports were closed until it was undeniable that the fault lay with systemd. This behaviour so delayed bug finding that users were left with unusable Fedora installations for weeks.
With the kernel issue he's simply struck a community which, informed by those previous issues, put its foot down promptly and firmly.
Arrrgh, you're right, Eric. EndNote was the nightmare I lived through.
EndNote isn't popular with students for its note-taking; it is popular because it lays out bibliographies perfectly. That's a concern for students as bibliographic referencing is their main defence against the claims of plagiarism made by automated essay checkers. The days when you wouldn't reference material well known to a practitioner in the field are well over (Illich's "Deschooling Society" had one reference). Lesser academics focus on the presentation aspects of bibliographies rather than their content, so EndNote is highly valued by students for its ability to churn out a variety of formats correct down to minor details of quotes/italics, comma/full stop.
The major competitor is Zotero. It's a fine product and well worth a look. It works quite differently -- being an extension to a web browser -- but the workflow of simply whacking the Z button every time you read something interesting works well and makes EndNote seem rather clunky.
Re: Support for the Locale!!!
The locale is still Australia, as locale isn't the same as the language you are writing in (think currency, date formats, etc).
The characters in Aṉangu and Yolŋu (including the Pitjantjatjara dialect) scripts aren't that rare and will be covered by most large Unicode fonts, including those in recent Windows.
In Mac and Linux you use the system keyboard configuration to alter the Compose key to produce Aṉangu and Yolŋu script. iPhone and Android need a keyboard definition. Windows XP was more complicated, and AuSIL and others have software for it. Windows 7 isn't too bad: you can use the system keyboard configuration to add a Compose key. There is a common set of composing keystrokes, so please don't make up your own.
An alternative might be to visit Aspitech in Adelaide on your way through (http://www.aspitech.com.au/) and grab some of their refurbished PC goodness. They are shipping with Win7 and Office at the moment. The people there have strong social aims and many people in the "community sector" find them a godsend.
All this shows is that the analysis company is behind the times. Let me count the ways.
1) Tablets. Where are they?
2) Phones. Where are they?
3) Laptops are the choice of people who need to create content and of people who need the cheapest computer possible (Chromebook). That explains the rise of MacOS as a proportion, as pure content consumers have moved to tablets. Market quantities are also falling, and this pushes back through the percentages: simply put, MacOS users are wealthier, and so more able to afford both a tablet and a laptop.
4) Desktops and laptops no longer serve the same audience. Gamers want desktops. So you really need to pull out retail desktops as a distinct figure.
5) Sales figures undercount Linux on desktop and laptop. Web usage figures ignore the main revenue from Linux, which is from server and embedded use.
6) The retail and business motivations for purchasing computers have never been so different. Lumping them together doesn't give insight.
Insight is the point of collecting data, but this has been presented so that it gives no insight. In fact it is misleading: you'd think that Microsoft, with a 90% share of laptop+desktop computing, was doing fine. Presenting it as percentages is poor: you get no idea of the huge shrinkage in sales.
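A toy calculation makes the point. All the numbers below are invented, purely to illustrate how a roughly constant share percentage can hide a shrinking market:

```python
# All numbers here are hypothetical, purely to illustrate the point:
# a steady share percentage can hide a collapsing market.
total_units = {"year1": 200_000, "year2": 140_000}  # invented PC sales
share = 0.90                                        # invented OS share

units = {year: int(n * share) for year, n in total_units.items()}
shrinkage = units["year1"] - units["year2"]
# Share is "still 90%", yet unit sales fell by 54,000.
```

Report only the percentage and both years look identical; report the units and the decline is obvious.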
I think the trick with DIY is to know when to step back. A good example is IP addressing: DHCP is here, it has worked for a decade, and yet look at the number of IT shops which still run a DIY IP address allocation service. Sigh.
Another thing is not to get caught up reinventing. Not reinventing should be the main reason for choosing a language. LOC costs pretty much the same in every language, you want the maximum amount of work done by the language so that you write the minimum LOC.
Finally the nature of DIY has changed. These days it isn't the IT Dept going its own way. It's the IT Dept participating in a worldwide open source project which sets the future standard. This means that DIY isn't a dead end, but just the future arriving sooner.
ISPs are moving away from NetFlow for accounting as the current number of flows makes it too onerous, and router manufacturers are finally coming to the party with better customer accounting mechanisms.
So the notion that collecting IP flow metadata can occur at low cost is now wrong for the largest Australian ISPs.
The Sony Xperia Z1 Compact is close to the ideal corporate phone, although sadly at corporate pricing. If you are looking for an Android equivalent to the iPhone 4S then this would be it.
Computing is needed to understand the modern world
Folks, computing is the new mathematics. Just as the development of symbolic manipulation opened completely new fields of science, computing is now starting to do the same. You simply can't understand the modern world built by science and engineering without understanding computing.
Embedded computers are changing the hard elements of engineering too. You can't build a car without a computer. In a few years you'll have computers changing the gears on your bike.
This is my objection to the curriculum. As you can read above the curriculum is very focused on Information Technology. There's very little computing and very little about computing devices. It's all about how to use computing to do administrative tasks.
In short you'll end up with a classroom where everyone has a smartphone but no one can tell you it's anything other than magic.
Enterprise computing is like competing to be the new mainframe
Why the fuss about enterprise server applications? They might matter to VMware, but they are a diminishing proportion of computing. ARM AArch64 might be able to run enterprise applications, but why would an ARM chip manufacturer go up against Intel with a high power, high throughput chip when it could make more certain money with a low power design?
What AArch64 will do is to totally win the "appliance" space, as those little 1RU boxes which do useful things will have less power draw (and thus heat issues, and thus be cheaper to design and own and be more reliable). Those appliances pretty much all run Linux, or will.
AArch64 also has a decent run at a peculiar sort of desktop -- the space which used to be filled by the "IBM mainframe terminal". Low power -- with its reliability and a small size on the desk -- makes ARM more attractive than x86.
I doubt ARM has much hope in the cloud, as its performance per watt still trails x86 at maximum throughput. Remember that cloud servers are provisioned to be either at maximum throughput or off. If ARM is used it will be because cloud providers specify their own CPU -- and obviously AArch64 is available for that, whereas x86 is not.
There's plenty of opportunity to make money with ARM servers without going for the hardest market first. The only attraction of enterprise is the large profits available due to their poor management of computing. But that very same poor management makes them averse to change.
Not in the business of updates
I've read here a few times that HTC isn't in the business of updates. I don't get that -- why aren't they in the business of updates? If they charged $20 they'd turn updates from a cost centre into a revenue centre, and they'd be getting money without all the trouble of a new hardware design. And it would encourage customers to buy an HTC phone rather than a carrier phone. I just don't get why HTC persists with the current economics of Android updates when they could change the system for a better result for themselves and their customers.
Re: Desk Clutter
Tom7, this is so small you'd mount it on the back of the monitor.
But the enterprise has gone...
There's no "going back to the enterprise" as a safe ground for Blackberry.
The reason is simple: the iPad. That device's sheer ubiquity has forced IT managers to do the previously unthinkable: accept unmanaged devices onto their networks. Sure, some networks still won't do that -- sometimes for valid security reasons -- but those networks accept a fall in productivity. If you're business IT -- arguing that computing improves productivity -- then not allowing personally-owned IT is a path to irrelevance.
There's also finance. Everyone has a smartphone. Why on earth would an enterprise want to issue its staff with one? You might do it as some sort of non-monetary salary. But the idea that key staff get a work mobile is one that beancounters are no longer keen on.
The result is that enterprises aren't as keen on Blackberry's special sauce as they were during the era of the standardised desktop operating environment.
No they can't. You have a contract which lists the items you paid consideration for, and that will list firmware updates.
However, you would be surprised how few IT shops actually reflect all the contributing factors in the lifetime cost of ownership in their contracts.
I am sure HP will happily provide already-contracted firmware updates where it is required to, and happily collect an annual fee from the other 99%.
Re: Nice apologist article, Simon !
"...the indonesians consistantly send there boys over to us to collect classified information from us."
It's a bit naive to think that Indonesian spies are primarily interested in the activities of the Australian government. They are much, much more interested in the activities of Indonesian nationals in Australia. In short, you don't find them trying to tap the "secure blackberrys" of Australian politicians but intimidating people raising funds for West Papua and ensuring that Indonesian students studying at Australian universities know that their government is watching them.
I think that part of the anger of the Indonesian establishment towards Australia's spying activities is that this focus of Indonesian spying activities away from Australia's government has been shown up by the depth of Australia's penetration of Indonesia's government. Not once -- as during the East Timor crisis -- but now twice.
It also helps that Australian police forces have taken foreign government intimidation much more seriously in the past decade, a positive side-effect of the War on Terror.
Re: Radiation Monitoring
Required here in Australia. The sensors are typically mounted on the input hopper and on the forklifts. Where I live in Port Adelaide there was an incident recently where a forklift sensor alarmed and the quick thinking and selfless operator drove the forklift well away from the factory's buildings before running.
2600 participants isn't a huge MOOC; it's about the usual completion number for a typical course.
If this stadium looks like a vulva, then the average stadium looks like an anus.
Assange is living in Western Australia?
Surely the other political parties will contest his enrolment in the WA electoral roll. It's not like he will have spent even a night at his claimed domicile.
Simon, In the Apple II/BBC Micro era Australia used to have one of the best computing teaching resources in the Parks Computer Centre in western Adelaide. Unfortunately this was disbanded, but most of the staff are still around. There are also some outstanding computing educators. I would have thought that building upon their experience would be the approach to take, but I can't see that this has been done.
It would be well worth your time to track down a few of the old Parks staff and interview them about what works and what doesn't.
ID is pointless
Let's simply ink the fingers of people as they vote. No ID required. It's compulsory voting, so assuming that any adult presenting themselves with an uninked finger and matching a name on the electoral roll is valid is pretty good. In any case inking fingers is a lot better than presenting a fakable ID.
Not that the problem is large: the AEC estimate was that maybe 800 people voted twice.
Host key generation is more of a risk
The real risk is the generation of SSL host keys so early in a system's first boot that there is no source of entropy other than the hardware RNG. Best of all, these weak keys are permanent.
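A toy Python sketch of why this matters -- the "key generation" below is a PRNG stand-in, not real cryptography, and the 8-bit entropy figure is invented for illustration:

```python
import random

# Toy model, not real crypto: "key generation" is a PRNG seeded with
# whatever entropy the box happens to have at first boot.
def fake_host_key(boot_entropy: int) -> int:
    rng = random.Random(boot_entropy)
    return rng.getrandbits(128)

# If first boot supplies only 8 bits of usable entropy, there are at
# most 256 possible host keys -- an attacker can enumerate them all
# offline, no matter how long the key itself is.
key_space = {fake_host_key(seed) for seed in range(2 ** 8)}
assert len(key_space) <= 256
```

The key length is irrelevant here: the effective strength is the entropy that went in, which is why keys minted on a starved first boot stay weak forever.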
Midnight Oil hardly the only source
Des Ball wrote a very fine book on Pine Gap in 1988. It had huge press coverage at the time. The book was pretty much a summary for the public, so most of its facts were already known, not least through the Democrats tabling leaked papers in the Senate.
The Labor Party was pretty much down on Pine Gap after the interference of the CIA in Australian domestic politics in 1974/5. One of the surprises of the Hawke government was that it didn't close the base in response when it came to power in 1983. Rather it negotiated a treaty spelling out exactly the management and function of the facility. Needless to say, this upset the Oilz.
So yeah, not news.
Re: Eduroam, and similar
I'd also add that universities differ from business because: (1) Unis are in IT for the long haul. They're not put off by a half-decade-long project with international agreements and interoperability like Eduroam. (2) Academics are used to listening to and criticising proposals. So you get a good hearing, and then you get a bucket-load of encouraging criticism. Part of the reason for the quality of uni networks is the free review from people whose consulting rates are thousands per day. (3) Business simply doesn't operate at the same scale nor require the same availability. I've had businesses employing a few tens of thousands of people tell me they run a "big" network, whereas 10,000 users would be a quiet day for a uni network.
"the University core networks" -- no. The learning and research facilities are the core network. It's the administration networks which are non-core. That's the essential mindset difference between university and business computing.
The same is true of applications. You break some Oracle thingie used by administration, that's bad news. You break e-mail across the university, you're fired.
At universities BYOD is simply fact. It's not a "strategy" open to debate. Even non-IT staff will have a laptop, a tablet and a phone and will expect equivalent access to resources from all of them. The university may or may not own all of those devices. Students definitely don't want the uni to provide their IT -- although if the uni can arrange a hefty discount on a MacBook Air they'd be grateful.
The idea that you can limit access to administrative systems to a subset of platforms isn't a goer either. Just the other day I checked a student's recorded test mark from my phone (connected via Eduroam), whilst the student and I were discussing their progress. Business would call this "responsive customer service" and the more you tighten down the access to the admin systems the less responsive the staff can be.
Ubuntu, the Maralinga of Canonical's nuclear testing
So because Canonical has ambitions in the mobile phone market they are going to once again use Ubuntu as the testing ground for their technology. Didn't we have enough of this when they re-did the user interface so it worked better on tablets? And on netbooks before that?
Here's a thought. You've already got millions of users who want a nice desktop and laptop operating system. How about keeping them happy?
Time-based labs? Pre-VM concept.
The whole notion that time-based licensing was suitable for product testing was always doubtful, and these days it is entirely wrong. The test VM forms part of the delivery of the service. It is an environment you can arc up when further testing of the deployed infrastructure is needed -- either to extend it, or to analyse a balls-up by exploring whether the killer issue was seen in testing.
So TechNet had to go. A better vendor would have replaced it with something better.
Re: Root password, sure, but why wasn't the data encrypted?
Encryption isn't a cure-all, a wand you can wave to solve problems of access to data. Firstly, encryption implies keys. If you are sending the document to thousands of people within the one organisation, and the attacker is within that organisation and has sysadmin rights, how long is the key going to stay secure? This is even true for PGP -- in that case you scarf up everyone's keyrings as well as the data and attack the passwords used to secure the keyrings. Secondly, there's still nothing to stop the attacker from copying the data (should someone appear with a key later on). Thirdly, there's nothing to prevent traffic analysis -- for example, a lot of files suddenly appearing in the plans-to-attack-libya directory.
Encryption is an interesting two-edged sword. Take command-line access to a server on a secure network. Should that use SSH? Or should it be forced to use Telnet so that the exact session of the person connecting can be audited? As a result a lot of secret-level systems use fewer encryption mechanisms than you would expect.
Disabling USB is difficult: you can't unilaterally disable the controller, as there are interior USB buses within modern computers tying the components on the mainboard together. What you can do is refuse to mount USB media which hasn't been authorised. That's a bespoke SELinux rule for Linux, or a software hack for Windows. Neither is supported by the operating system's manufacturer, which is an issue for large installations.
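The "authorised media only" policy can be sketched as an allowlist check. The vendor/product IDs and the allowlist below are invented for illustration; a real deployment would hook a check like this into udev rules or the mount path rather than application code:

```python
# Sketch of an "authorised media only" mount policy. The IDs below
# are hypothetical; real systems key off the device's USB vendor and
# product identifiers (and often serial numbers too).
AUTHORISED = {
    (0x0781, 0x5583),   # hypothetical: one approved flash drive model
}

def may_mount(vendor_id: int, product_id: int) -> bool:
    """Allow mounting only for devices on the allowlist."""
    return (vendor_id, product_id) in AUTHORISED
```

The point of the sketch is the default-deny shape: unknown hardware is refused, and the allowlist is the only thing audited.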
I am not saying that people shouldn't try encryption and blocking access to devices -- a low fence is still a fence. But don't be surprised by the success of an attacker with abundant inside information and access.
In this case the technology is irrelevant. Let's say both the encryption and the USB controls were tight. The attacker was determined to leak and would have simply chosen another path. All we can do is force people into technologies with higher risk, such as cameras.
In focussing on these technical matters we're also ignoring the cultural -- the "why" of leaks. When you ask an organisation to act contrary to its mission the organisation betrays the people in the organisation most motivated by its mission. Having that betrayal of the individual by the organisation repaid by betrayal of the organisation by the individual is to be expected.
Re: Pi Power Supplies
The Pi does have a separate power input (as well as the micro USB): see pins 2, 4 and 6 on connector P1, which take a regulated 5V DC at 1200mA.
kt, it's not just kilotonnes of TNT
Can I suggest the kiloteen: the data downloaded by one thousand bored teenagers using their mobiles to find something diverting enough to retain their attention.
Related: the megameme, the data of 1000 cat photos from Reddit.
ACCC says "Microeconomics? We've heard of it."
Oh dear, the ACCC can't see the competitive issues with QoS enabling tighter vertical integration? You'd expect the economic rationalists at the ACCC to be in favour of the free market, not promoting the use of technical measures in a way that increases customers' barriers to exit.
It's not the ACCC's job to solve the capacity planning issues of large carriers. The ACCC's job is to prevent the carriers from gouging consumers. Looks like there's been some regulatory capture over at the ACCC.
Commenters overcompensating with shotguns rather than sportscars
What is it with people and shotguns? If it's high enough to be regulated by CASA then you are shooting at an aircraft. All you need is one set of aviation laws with amped-up penalties after a decade of terrorism hype, one gung-ho police prosecutor who wants to leave a rural backwater, and one dim magistrate (odds are good -- he's still in the sticks twenty years on) and you're off to do time.
When the police rock up and you're holding a shotgun and the air smells of gunfire, the cops aren't listening to you rail against those hypocritical greenies flying stuff over your head and destroying the peace and quiet of the countryside they're supposedly saving. The cops are going to be much more concerned about separating you from the gun, and wondering if all of your babbling indicates a dangerous mental state.
By far your best bet is to go and ask the people to stop harassing you with their strange plane. And if they don't, then ring the police. Then you're the injured local, they're the outsiders acting outrageously using a new form of trespass, and the police will try to solve your problem rather than wondering about the travel time for the special weapons squad.
Old, old attack.
It's not rocket science; I described the correct configuration for AusCERT back in 1999 in response to the DDoS attacks we were seeing then. (Modify the "bogon" list for the newer "end of IPv4, so let's use every Class A possible" list of bogon networks.) See AL-1999.004 at http://www.auscert.org.au/render.html?it=80
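The filtering idea is simple: drop packets whose claimed source address could never legitimately arrive from the internet. A minimal Python illustration, using a deliberately tiny subset of the bogon list (a real filter must track the full, current list, which changes as IPv4 space is allocated):

```python
import ipaddress

# A tiny, illustrative subset of a bogon list. Real deployments
# should track the full, current list.
BOGONS = [ipaddress.ip_network(n) for n in (
    "0.0.0.0/8", "10.0.0.0/8", "127.0.0.0/8",
    "169.254.0.0/16", "192.168.0.0/16", "224.0.0.0/4",
)]

def is_bogon(src: str) -> bool:
    """True if a packet claiming this source should be dropped at the edge."""
    addr = ipaddress.ip_address(src)
    return any(addr in net for net in BOGONS)
```

In practice this lives as ACLs on border routers, not host software, but the membership test is the whole trick.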
Questions for SDN articles
When reporting SDN could you please make it clear: (1) if the SDN is OpenFlow or proprietary; (2) if use involves royalties or revenue splitting, or requires the customer to purchase special licenses; (3) if use requires the customer to load software which is atypical for enterprise switch deployment. Thank you.
Quasars used for accurate geodesy, which is essential for GPS
"quasars have magnitude above 14". You are thinking about visible light, in radio astronomy about 10% of quasars sit well above the noise floor of a modest radio telescope.
The GPS systems don't use quasars directly. Rather quasars are the reference points for surveying the earth's position in space (you use two interconnected radio telescopes half the world apart to form a baseline and then measure the different times of arrival of the quasar's signal, triangulating the position of earth in space, the jargon word is "eVLBI geodesy").
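The underlying geometry can be sketched in a few lines of Python. The baseline length and angle here are illustrative only -- real geodesy models atmosphere, clock drift, earth rotation and much more:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def geometric_delay(baseline_m: float, angle_deg: float) -> float:
    """Extra travel time of the quasar wavefront to the far telescope.

    angle_deg is the angle between the baseline vector and the
    direction to the quasar (idealised geometry, no corrections).
    """
    return baseline_m * math.cos(math.radians(angle_deg)) / C

# Two telescopes ~10,000 km apart, quasar 30 degrees off the baseline:
tau = geometric_delay(10_000_000, 30)  # roughly 0.029 seconds
```

Because the quasar is effectively at infinity, timing that delay to tens of picoseconds pins the baseline -- and hence the telescope positions -- to millimetre precision.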
This field of research was very interesting to the US Air Force during the Cold War as it was directly relevant to the accuracy of their ICBMs. That work continues to be used to accurately place GPS and surveillance satellites, which are flip sides of the same triangulation problem.
Touching an individual machine means you are losing
My advice would be that if you are touching an individual machine, then you are losing.
For servers that means Puppet, Nagios, single sign-on, a brutal approach to hardware failure, funnelling everything through a ticketing system, referring to that as the documentation for changes in your configuration repo, and documentation written for use by trained people rather than blow-by-blow. Because you end up with a low headcount, that means evolution of hardware, not once-in-a-blue-moon refresh projects.
For desktops that means either SOE or BYOD, but not some expensive middle ground. It means automating the common helpdesk tasks. It means using the vendor's tools rather than third-party tools, because users then get good hits from Google, which lowers your training costs. It means online training.
For networking it means DHCP for IPv4 and Dynamic DNS. It means IPv6 is standard for intranet use (ie, no interior NAT). It means not fiddling with ethernet autonegotiation. It means anycast DNS forwarders. It means cookie cutter cupboard, building and core designs. It means treating VM servers as first class items in the network. It means 802.1x for wireless rather than web landing pages.
Skype is a phone company
> but relying on Skype for emergency calls...?
Here's the thing -- the emergency services call taker doesn't get to decide what technology the call maker used. They can hardly say "hang up and ring back on a real phone". As for tracing and interception, if those don't work for Skype then that's where the criminal activity moves to.
"The usual suggestion, that users choose strong passwords that they don't re-use, will no doubt be ignored..."
Evernote could easily use the authentication mechanism of the user's choice: Facebook, OpenID, and so on -- there are plenty. But they choose not to, as they want to "own" the customer. That is not the user's fault, but the result of corporate strategy.
Re: Free Is Good @The Dim View
That might have been an accurate view five years ago, but LibreOffice these days is solid (I wrote a book using it, the publisher didn't even notice that I wasn't using MS Office).
What LibreOffice needs to do now is to get ahead of Office. Office has always had half-arsed templates; its flowing of inserted drawings is just bizarre; its graphs are PR-oriented toys; presentations are overly constrained to MS's layouts; it treats meta-data as an incidental; and it doesn't play well with others.
The LO user interface needs work -- the colour selection is a user interface disaster. But in general it is solid.
The SVG import in LO has improved a lot, and this makes it very easy to pull vector images into documents and presentations. LO is still the simplest way to produce a PDF.
Drones are fine
I've worked for a military contractor. Basically, you end up trusting that the government will use your tools well, just as people in the military trust that the government will put them in harm's way for a worthwhile cause. It's impossible to say in advance if you yourself might agree about some future conflict -- when my weapons were used in East Timor against a group of military thugs who were killing people for fun, I couldn't have been happier.
Some projects obviously carry more ethical issues than others, and all the firms I worked for were open in their acknowledgement of that and were supportive of individuals' decisions not to work on particular projects on ethical grounds. This was not only generosity: it was a government requirement for access to projects at the secret and above classifications, so as to minimise the risk of betrayal.
The ethical question about drones is simple enough: in a just war, is it wrong for a just participant to use that weapon? You can certainly make that case for nuclear weapons, for some types of land mines, and for some finishings of small bombs (making them look like toys, etc). I can't see that you can make the case for drones.
This isn't to say that drones have no ethical issues, but the issues are far more subtle than those presented by the ethicist. For example, automatic tracking and fire raises the potential for firing on civilians, and yet allows the drone to engage an enemy under cover.
Re: "LINX told users struggling to reinstate those ports to simply reset them"
Actually, the request made sense, as LINX allows only one MAC address to be seen per port. Dropping carrier empties the list of seen MAC addresses.
Reactor for a mine in the middle of nowhere, so NIMBY claims are wrong
This isn't a NIMBY issue. The suggestion is to use nuclear to power an expanded Olympic Dam mine, some 500km into the desert from Adelaide, the nearest city. The issues of "what if it goes wrong" are around staging a medical evacuation and emergency response across large distances.
A major unaddressed issue is that mines have a definite life and are in the middle of nowhere -- the reactor can't be repurposed but will have to be decommissioned. The technology and the price for doing this are both underdeveloped.
Microsoft can't buy a market leader, it has to merge and thus change its essentials
Microsoft doesn't buy the market leader because Microsoft doesn't want to change.
Consider the example of Microsoft buying Apple. There is no way that can be a purchase; it has to be a merger. Furthermore, the Apple executives are the ones who need to bump out the Microsoft executives, since those at Apple have made the right choices and executed them well.
The result is a company which isn't Microsoft anymore. And that's why Microsoft don't buy the market leader -- they don't want to lose "their" company to outsiders.
Not suited to businesses which use Exchange for e-mail
It's not suitable for business for the simple reason that it can't connect to Exchange. The later (ie, working) versions of evolution-ews don't work, as the version of the GNOME software in this Ubuntu is too old, and Canonical didn't put any effort into backporting evolution-ews to their older version of GNOME.
It says a lot about the half-arsedry which is Canonical that they'd ship an operating system aimed at business users without a decent connector to Exchange.
The digital amendments to the Copyright Act made international price discrimination more certain, as they made circumventing geolocking software possibly criminal. (Whether it is or not is complex, as it depends very much on whether a court believes that the technical protection measure exists *only* for price discrimination. If the court rules that there's an anti-piracy element to the digital protection measure then circumventing the TPM is criminal.)
The other reason for high costs in Australia is price gouging by distributors. These companies often have exclusive agreements with producers, and then use that exclusivity to charge monopoly prices. That's the essential reason why software in vertical markets costs so much more in Australia (and also the reason for the high price of car and bicycle parts).