Damn you, El Reg
My 1,981 character post / rant got chopped. Was it really that boring?
@Alexh2o, your point about content creation is key. In many ways the trends that we see today (the rise of Android phones, the iPad, etc.) are crazy.
Up until relatively recently the devices on which people consumed digital content were largely the same as those used to create it in the first place. Now it’s completely different. The consumers of all things digital now use iThings, Androids. But these devices really suck when it comes to creating anything.
However the rush for consumer-oriented tablets and smartphones is for nothing unless there is content. And the serious content creators still need/want a good desktop PC / Mac. They've got commercial timescales to work to; they've a lot to get through. Bling may be cool, but it doesn't pay.
I think the world is on the edge of a disaster so far as hard nosed realistic people who have a job to do are concerned. Some predictions:
*PCs and Macs I think will suddenly become *very* expensive as most people will be buying tablets / smartphones. That’s bad news for the world’s content creators.
*RIM is clearly in trouble and may fold. But can you imagine a business world without them? The offerings from their smartphone competition have only a veneer of utility (though MS could get close, because they do Exchange). Deep down the others have little to compare to what Blackberry give their corporate customers to safeguard, control and access data. RIM are interesting (i.e. brave) because they've not abandoned their hard nosed approach to business functionality purely for the sake of chasing bling and fat consumer sales. It's costing them dear.
*The rise of BYOD, Android, iPhone in the workplace is increasing the risk to employers' businesses. Even the people on Capitol Hill see that. It is setting the scene for some spectacular company failures. E.g. a company could easily go under purely because some idiot employee (and there *will* always be one) jail-broke their Android and wound up with a key logger Trojan. Worse still, no one would know why it happened. Blackberries don’t have that problem, and that should still matter to companies.
In all I think current tech trends are leading to less work being done, more unnecessary risk for companies, and a change in work/life balance that in the round is detrimental to an economy. I suspect that company IT people recognise that, but few company directors do.
Probably still not strong enough to get all those small bolts that disappear into the floor when you take a complex gadget apart...
"If you still don't get it, then watch "Brazil" and read/watch "1984.""
Seconded, but I wouldn't bother with 1984. Brazil is all you need to see to understand big bureaucracy gone out of control. A very good film indeed. And now I can't get the music out of my mind, "da daaaaa, da da da da da da da daaaaa, da da da da da da da daaaaa, etc.
Customer: my electricity has been switched off
Central Services: is your name Buttle?
Hey! Leave the 'electro-sensitivity loons' alone. You, me, we're all electro-sensitive. I'm especially so. I find that any kind of contact whatsoever with a mere 230V AC, even the sort that comes out of the socket on the wall, really knocks me off my feet for the rest of the day. How do you think that'd feel, every day of your life? I can tell you, it soon begins to pall.
Now then, I must see to getting that bedside lamp earthed properly. Been meaning to do it for years.
Pigeons have a whole different set of vulnerabilities. Cats, 12 bore shotguns, hawks. "Pigeon packet lost error. Hit by car. Abort retry ignore?"
They've been around for a while, but jumbos are still cool in a big engineering sense.
Also it's good to see that there's some sectors of industry that are still building things to last and be renewed along the way. The aviation industry has got used to aircraft lasting decades, and so that's how they're built. The extreme example is the B52 - the US military are planning on running those to at least 2040 apparently, by which time they'll be nearly 100 years old.
Other industries (e.g. cars) seem to give the impression that they believe that that sort of business model stifles innovation and progress. Cars (especially American ones) are not built to last or be viable beyond a few years. But look at how new aircraft have changed, and look at the upgrade kits (better engines, etc) that you can get for old aircraft. There's no reason why cars shouldn't be built with that sort of life cycle in mind.
"and in the end no-one will benefit but the lawyers."
Absolutely agree. And the losers are ultimately going to be the tax payers and shareholders.
I think that LightSquared do have a point about spectrum management. A regulator can't possibly manage a country's spectrum if it is powerless to resist the effects of non-compliant devices flooding the market. In effect LightSquared are saying "You're the bloody regulator, now get on and bloody well regulate". It's risky - technically LightSquared are in the clear, but there's a whole load of bolitics (thank you Roger Mellie) and commercial pressure involved too.
The UK equivalent, Ofcom, is in a similar predicament. Having ignored the importation of non-compliant powerline Ethernet devices for too long they're now in a position of having to shut the stable door after the horse has bolted.
We all need good spectrum regulation, but we only get that if the regulators are quick off the mark when problems arise. I don't think we've seen that in either of these cases.
"Yeah, the dumbasses want to impress friends by making phone calls at 30,000 ft."
Oh I don't know, I reckon quite a lot of business people would welcome the opportunity to carry on their work for the hours that are otherwise spent idle. But I doubt their fellow passengers would agree...
The risk was a lot higher in the early days of mobile phones. Base stations were few and far between, possibly up to 35km away in those days. That meant that a phone could easily be putting out 2 Watts of RF power. That doesn't sound a lot, but it's quite capable of generating an RF spark in the right sort of resonant gap.
The trouble is that no RF engineer would ever say with 100% confidence that such a gap couldn't be formed by, for example, the pump nozzle and filler pipe. It's not that straightforward to analytically prove that it's impossible. You can say empirically that it's highly unlikely (and we all know that - petrol stations are notorious for not blowing up on a regular basis). Your RF engineer might therefore use the word 'er' when asked. And there's nothing like the word 'er' for causing some health and safety type to print up a sticker as quick as a flash.
Nowadays there's so many base stations around a phone hardly ever has to put out 2 Watts, so the risk (whatever it actually was) is now correspondingly less. From what I hear most garage forecourt signs have a mobile base station in them anyway, which is a rather neat way of dealing with the fact that people tend to ignore sensible but only very rarely critically important advice.
"But, even so, just how many incidents on *any* plane have ever been attributed to wireless interference from a phone?"
I think the problem there is that it would be very difficult to attribute an incident or crash to a mobile phone.
Say a phone did cause some catastrophic interference. At best, all that would show up in a black box recording would be some erroneous behaviour on the part of the aircraft's systems. Now this might be caused by the software being fed duff information (because of the interference), or it might be because the software is just plain buggy, or perhaps a cable connection went bad. Who's to tell?
In effect we're getting to a point where an experiment is going to be tried out on the paying public. Nothing wrong with that, the odds are likely to be heavily in favour of crash-free flying. But we're going to have to accept that the price might be some unexplained incidents.
BTW Qantas had a frightening incident with an aircraft that suddenly went defective in mid flight, started going all over the place. Apparently it took a reset to restore normality. I don't know if they know why yet. But incidents like this occurring when in-flight mobile use *is* permitted will likely leave investigators with a nagging doubt. At least at the moment they can definitively blame the aircraft's systems, find the cause and make positive recommendations.
Once upon a time C++ compilers merely translated C++ source code into C source code then ran a C compiler.
Just because C isn't restricted to being object-oriented (nor is C++ for that matter) doesn't mean that you can't write code in an object-oriented way. Function pointers in structures aren't a million miles away from methods inside objects (that's probably how the early C++ compilers/translators did it). Though of course with C you can cheat, so one generally opts to be disciplined when coding that way and stick to whatever rules you choose. I wrote an entire windowing graphics library that way back in the really old days (1980s).
"Anybody else think not being able to identify C compiled code is itself a comment on the industry? "
I thought all viruses had a noise associated with them. Isn't it normally "Aaaaarrrrrggghhhh bugger" coming from the user when they discover the damn thing's infected their PC/MAC/Android?
Ah, well, therein lies the problem that Google built into Android without really thinking about it.
Apple, RIM and Microsoft can push updates out to their customers with a high degree of independence from the network operators. This provides a long-term assurance to customers that their handset is going to be looked after for the duration of their contract (mostly). Whereas with Android you're completely dependent on the handset manufacturers and the network operators, a much less certain proposition. Most people won't know how to root their handset.
As for a VM, with the ARM A15 cores that's not necessarily so hard to do. Don't know when they'll make it into handsets...
Clearly someone in favour of poor programming and buggy software.
Yep, it'll be something like that, possibly they've done it as direct manipulation of some time string. I've not read their report.
Yet again some programmers somewhere have been shown to be a bunch of lazy ******s. Symantec had a similar problem with their antivirus software updater thinking that the year 2010 came before the year 2009... And are Apple devices capable yet of setting an alarm off properly at the appointed time? I suspect not.
I honestly don't know what goes on in such programmers' heads. If they cared to take even a casual glance at the reference manuals for things like the ANSI C library, Java class libraries, etc. they would find a wealth of functions that a bunch of careful people spent time and effort on so as to make it easy for other programmers to avoid this sort of mistake. Why don't they just ******g use those well thought out routines instead of thinking "I know, I'll do it all over again myself in my own code, how hard can it be, I'm sure a string will do?". It's unbelievable madness. Who supervises these idiots and reviews their code, designs their systems? Sure, the purpose of the routines available in the libraries may be a bit tricky to fully understand, but then time measurement systems (e.g. UTC plus the various local timezones) are not a trivial topic. But that's no excuse to ignore the complexity.
In The Beginning Apple chose to go with GSM then UMTS (rest of the world) ahead of CDMA2000 (USA). Clearly they'd decided to pursue the majority market first, for understandable reasons. But you'd have thought that the attraction of a few hundred million extra customers would be appealing, even for Apple.
It does seem to call into question the depth of Apple's engineering outfit. If Samsung and everyone else can spare a few engineers to shoehorn a TD-SCDMA baseband into their phones, why can't Apple? Are they lacking staff in their engineering department? Did they not consider the different basebands when they designed the kit? Do they now find themselves having to make big changes to the internals to accommodate the different chips (if that is in fact necessary...)?
Having said that, it's hard to question anything about Apple's strategy when they've got $90 billion in the bank.
Or indeed the fact that a large proportion of laptops do indeed have a spare mini PCIe socket (that also carries USB for some reason) inside?
Quite often the connector isn't soldered to the motherboard, but the contacts are generally there somewhere on the PCB.
Well, if you count a .vcf contact file in that category of Bluetooth action, then quite often. It's sort of the forgotten piece of functionality that Apple don't seem to care about.
In the days before Facebook, Myspace, etc. people would swap photos with Bluetooth. It's still quite effective - it incurs no data cost and works irrespective of local 3G coverage - but generally people have forgotten about that little nugget of functionality. With smartphones and now tablets becoming more and more capable, it ought to be something that happens more.
This is typical of the effect that Apple (and Android to a lesser extent) have had on the market. Apple implemented a smart-ish phone that was lacking in several key areas (crap battery life, the ability to make and hold on to a phone call, most of Bluetooth not working, no support for Java apps, poor security, expensive to buy and run, unreliable, fragile, prone to breaking down, etc. etc) and then managed to persuade customers in their millions that none of that was important. And then Apple (in an impressively cynical way) claim to be great innovators when they finally put right some of those omissions (which so far they've failed to do by most accounts).
For example, it is perfectly feasible these days to turn up with (and I pick RIM purely because I know for sure that you can do this) a Blackberry Playbook, give a Powerpoint presentation using it connected to a projector with the notes displayed on the tablet's own screen, using your Bluetooth-connected Blackberry phone as a back/forward remote control. Then if anyone wants a copy of the presentation you could just Bluetooth it straight to their phone or laptop, along with contact details. None of it needs a cloud, 3G coverage, prior knowledge of their email address, etc. So it works, and it's reliable. Unless they've got an iPhone, or possibly if they've an Android. Reliability of such technology is absolutely key if you want to make that sale!
But because of the generally depressive effect that the likes of Apple have on what people's expectations are of technology, it is very difficult for companies that do actually make this stuff work to get their message across. In effect their advertising has to be along the lines of "Things can be better than an iPhone! Don't fall for the Cappuccino PR". Which isn't a very good message to dish out to people who've got expensive iPhones burning holes in their pockets and don't want to be told they're stupid for not having looked elsewhere.
Didn't Amazon get round the similar restriction on iPads by doing an HTML5 version of Kindle? So what's to stop application developers doing the same on Android? Isn't Google just going to drive app developers away from their marketplace?
...the programming language 'Whitespace'.
Yes, the source code is just white space characters... All non-whitespace characters are ignored.
Modula 3? Surely no one would be that strange...
@Destroy All Monsters: Yes there are! Once Ada runtimes emerged that actually used OS facilities like threads instead of re-creating those things for themselves, Ada got a *lot* better. From what I vaguely remember, Greenhills Ada on VxWorks was pretty decent indeed.
I can remember the problems that a bunch of colleagues had in the very early '90s with Ada (on VAX I think). The application they'd written was too large for any of the Ada runtimes of the day to actually run. I never found out if they ever got it going...
The K machine is mighty pricey, and it would be interesting to see how that cost breaks down into CPU vs I/O development. The K machine has a very elaborate interconnect. This must surely take a lot of the credit for the machine's sustained performance being so close to the theoretical peak performance. The cost breakdown might illustrate where investment pays off best.
Having said that (see my post above), RIM's existing phones are pretty smart already. QNX is necessary to give them a way ahead, and presumably their existing bespoke OS is at a dead end, but it does run apps and do some clever messaging.
@synthmeister, I think you're right about RIM having osborned themselves, but I'm not sure they had much choice. If they'd kept quiet about their plans then it would have looked like they didn't care about the smartphone revolution. That would have raised questions about the long term and pushed buyers toward other platforms that were moving forward. That would probably have been worse.
As it is I think RIM have bravely chosen what will be an excellent technical solution (the Playbook really is quite good), but will require a lot of hard work to get people to understand and want the benefits. They could have chosen to do something clunky (ie squeezing a desktop OS onto a mobile platform like Android and iOS) but that would have lessened the technical value of their offering.
@Chris 3; indeed, and the reason they want iPhones is because they've fallen for the Apple PR about "just working" and "secure". Now really they should be hard nosed unemotional decision makers who carefully weigh up decisions before making them. Questions like "are my company communications secure" should float across their minds. Getting decisions like that wrong is potentially a company-killer these days. Pandering to the whims of staff isn't going to look so clever when your company's intellectual property becomes public because some employee jailbroke their iphone or android and got infected with a dodgy app.
@Henry Blackman; do some reading. As any reader of El Reg should know emails arriving on your corporate blackberry were encrypted when they left your company email server, thanks to the way that BES works. So even if BB's servers were compromised the emails themselves aren't available to a hacker/government.
And so far as availability is concerned, RIM did have one little outage, but iPhones have one every afternoon when their battery goes flat.
The SPEs mostly are Altivecs, with just enough extra to make them independent. Most of the maths instructions are exactly the same. That was the whole point of them. So long as you know what you're doing Altivec/SPEs are pretty good for image/signal processing, and I've had very good mileage out of them for ten+ years now.
As for physical size, the Power7 MCM is large. Sure, an individual core is smaller but then that would not be a "Power7", would it. Based on previous form I doubt that IBM will be doing anything for the PS4. They're not really interested in the games or PC market; it's just not worth their while. Freescale might, but they've got their own product releases coming along without worrying about Sony too. Sony have rights to Cell, so they can go it alone. But it really wouldn't be worth their while doing something Power-ish that wasn't based on Cell because they'd lose all the existing software.
As for performance, I've seen Cell get very close to 250GFlops (though not in a PS3). You have to try quite hard to get anything with Intel written on the top near that figure. It's hard to program, but get it right on Cell and your sums are done very quickly indeed. GPUs have a lot of grunt, provided AMD or Nvidia have written you a library function...
Er, the Cell is very different to anything else, including the entire Power range. Sure, Cell borrowed bits and pieces from the Power ecosystem (a PowerPC core here, 8 Altivecs there), but they were glued together in a totally unique way.
It is almost inconceivable that Sony will change to Power7, at least not in the form it exists in when built in to an IBM mainframe. The physical size alone (we're talking something as big as your hand) would preclude that. Plus the Power7 architecture has some even weirder components; I very much doubt that a games designer will be able to find a use for a decimal maths co-processor.
But you are right to point out the dilemma that Sony are in. I see several pitfalls with going x64; even today you have to have a fairly mighty x64 before you've got as much floating point grunt as the Cell has. That won't come cheap, plus it's tricky to not appear as a fancily dressed PC that isn't running Windows.
They could rely on GPU for the floating point grunt that's needed, and ATI / Nvidia would have you believe that they're the ones for the job. But whilst GPUs undoubtedly have a lot of grunt, again it's tricky not to appear as just a PC.
Developing Cell (16 SPEs? Hooking up to a beefier GPU?) would be a brave step, but it would allow them to preserve the investments that have already been made in software whilst bringing about demonstrable improvements, and they would retain complete control. But if they do, I wish they would let Linux back on it. When the PS3 was launched there was much talk about its computing grunt. Including the GPU doing single precision floating point it was apparently topping out at about 2.1 TFLOPS, the Cell accounting for about 200 GFLOPS. Even now you have to try reasonably hard to beat that for the money.
Oh wait, it does have just one control. Must be genuine!
...only a lot more functionality.
Flame bait in oh so many ways?
As for spotting the fakes, you can tell because it's got more than one control. Apple would have designed it with just one dial...
3G phone networks; the doors on a lot of trains; aircraft navigation; the national electricity grid. I bet you've relied on one or two of those at some point in the past, and would have been seriously annoyed if they stopped working.
GPS has wormed its way into a lot of things that we count on everyday and take for granted. If GPS jamming became a major thing it would cause a lot of problems.
Those engineers who have built highly important systems that rely solely on GPS (I don't know if that's completely true for the list above) are lazy idiots. Either that or their management are cheapskates. The trouble is that GPS is just so damned convenient for many things. And to have to fall back on something else when GPS packs up is difficult. But it is necessary if you're building something that really matters like trains, planes, grids and comms.
Though I think the recent decision to put catapults and arrestor gear on the carriers makes sense; they're no longer dependent on the JSF which (from my reading of the runes elsewhere in the press) is no bad thing...
The problem with consultants is making sure that there is enough of them around who actually know real stuff when it really matters, eg when war breaks out.
Consultants quite often used to be on the inside but ended up outside. If you rely on consultants that probably means that there is no internal programme growing new experts. It's not like many universities offer courses on how to build effective war machines, etc. The result is that when the current crop of consultants die off there are likely to be few competent replacements, and fewer still who recognise the deficiency. Not a good situation to be in if a major war happens...
I think you'll find that RF front ends care very much indeed about peak received power. Your, er, fail is to not read what I wrote. So I shall write it once more. That band, even if used for sat phones, makes it difficult to get a co-located built in GPS receiver to work at the same time. The FCC should have realised it, the GPS industry should have realised it, and maybe LS were naïve in assuming that the others already had.
As for your spurious references to mobile phone power management, if you do happen to be 35km from a base station (traditionally the GSM max range) then your phone will be running at 2 watts and yes it will go flat very quickly indeed. If anything your analysis illustrates that a sat phone in that band co-located with a GPS rx would have been in more trouble than if afflicted by LTE; you couldn't have got the satellites closer to allow the phones to put out less power. Even more reason for the FCC to spot that the band allocation was going to be problematic.
And on the topic of the FCC, if it isn't their job (as you hint) to know what band allocations are viable and best for all US users, what exactly are they for?
It would be interesting to know what plans for built in GPS LS had/have. Their whole proposition depends on GPS working in LS handsets. If they can show a working technical solution then that is the end of the debate. Alas I think that whether LS's filters are 'working' is not going to be assessed objectively by the press, commentards, lobbyists, politicians, CEOs, shareholders, accountants, bureaucrats or the courts, and I doubt that engineers will get a look in.
Firstly your phone is capable of 2 watts. Secondly a sat phone would have needed at least that, probably more. Thirdly, a sat phone might only have been 2 watts but being only an inch or so from the GPS receiver in the sat phone it would have been far worse than an L^2 base station a few miles away. Or haven't you heard of the inverse square law? Or do you somehow imagine that a sat phone user would never have wanted a built-in GPS? Or that they wouldn't have wanted to do something like use Google Maps? Or do you imagine that a sat phone GPS jammed by the sat phone itself would somehow have been less annoying to every sat phone user than what LTE will cause?
The problem (yes it is real enough) has *always* been there lurking in the frequency allocation, satellite or terrestrial, but it has taken L^2 to make everyone realise it. No one noticed before because sat phone was a commercial non starter. That's how badly the FCC have dealt with this.
I'm no happier about this than anyone else. But don't blame L^2, blame the FCC for not keeping their eye on the ball. I mean, hadn't the FCC heard of built in GPS?
" The bands adjacent to GPS are designated as space->earth only,"
No they weren't. They were designated for satellite mobile comms. That involves earth->space too, presumably in the same frequency band, or you're in for a very one sided phone call.
Given that, don't you think that the earth based transmitter that fulfilled the earth->space leg would have caused similar problems? Especially as that transmitter would have been co-located with a GPS rx, namely the one in the satellite mobile phone?
The problem with this whole debate is that no-one is thinking clearly about what the actual technical issues are, were, and always have been. In summary,
1) The GPS industry have been lazy in ignoring frequency allocations that were always going to cause them problems
2) The FCC didn't even begin to think what the technical issues would have been resulting from the sat phone band allocation
3) The FCC were negligent (as you hinted) when permitting the change of use; it would have been a good time to have re-assessed the band allocation given widespread GPS usage
4) The FCC are being cowards in not telling the GPS industry "tough luck"
5) LightSquared have been naive in trusting the cowardly FCC to do their job properly. A little testing would have revealed the problem ages ago and avoided the whole thing.
As a result:
1) LightSquared's investors are going to lose a lot of money
2) The US taxpayer is going to lose a lot of money through under exploitation of valuable spectrum space, and possibly as a result of being sued by LightSquared for negligence
3) A viable technical solution to the whole issue (the filters that LightSquared developed) is probably not going to see the light of day because the freetard GPS industry's lobbying looks to have paid off
Many people will crow if, as seems likely, LightSquared end up dead and buried. But really it is the taxpayer and consumer who are going to lose out the most. That's not good for anyone.
"buying patents they didn't develop and then using them against business rivals"
*If* the patent system worked properly it would not be possible to use patents in this way.
The idea behind patents is that someone invents something, gets the patent, and others can't copy it without a licence. The practical reality is that patents are awarded for the most trivial "inventions" these days with very little regard for what has actually been done before. This is leading to many companies holding overlapping sets of weak but apparently enforceable patents, so war breaks out. The Venn diagram of companies and their patent holdings must look like a whole load of frothed-up bubble bath.
The US patent system is truly dreadful in this regard, but I'm not sure that anyone else's is very good either. The problems were built in at the start. Surely it doesn't take a super genius to spot that the prior art checking process was only going to grow exponentially. Then the US made life LOTS harder for itself by allowing software patents....
The only way to fix the system is to tighten up on what 'invention' actually means, specifically in relation to triviality, the invention 'date' and commercial realisation. That should then be retrospectively applied to all patents when a dispute is initiated by an offended company. I imagine that the majority of disputes would evaporate in a puff of smoke. A whole lot of lawyers will of course strongly lobby against such a move, so it's up to the politicians to think for themselves and see what harm is being done to their economies.
But I think you're right; all these companies are behaving in an entirely logical manner given the patent system that exists. I would like to think that some of them are thinking "why is this happening really?" and will become motivated to lobby for a change. At the moment it looks to me like all the leading companies will be run by patent law experts instead of people who actually know stuff and build things :-(
"I don't see how this is going to end well for MS"
Well, I think that's because you haven't considered the wider picture. In case you hadn't noticed, there is a trend towards ARM (or at least towards lower power) across the whole IT industry now. It's no longer just mobile platforms where ARM matters.
Microsoft *have* to respond to that. They probably need to get Windows Server and Desktop on ARM too just to survive. What we're seeing here with WOA and tablets is clear evidence that MS are positioning the whole product line to be ready for the x86->ARM transition. Sure, what we've seen here is limited, won't run everything, but the constraints are now "artificial" (purely for the sake of battery life), not technical.
It is not so hard to imagine that they could roll out a desktop / server oriented version (where there is at least mains power to use) without too much difficulty. Remember that Linux is already there, and whilst MS haven't worried too much about the Penguin on desktops, Tux does do extremely well in server land. MS make money on servers too, and they want to carry on doing so I'm sure.
It seems clearer that there are going to be many sources of hardware - TI, Qualcomm and NVidia are involved - so there would seem not to be a grave prospect of hardware lock-in. A bit like the PC market. And that can only be good for consumers. There may be a good prospect of desktop ARM hardware sooner rather than later. There already is ARM server hardware at HP.
As for performance, I think that the days of ARM being too slow are already gone. And if you don't think they are, the quad core 2GHz 64bit DDR3 parts that are being talked up now should address your concerns. I mean, it's not that long ago that people were dreaming of such performance in their desktops! The smartphone revolution has shown that there is plenty of performance in ARM, and plenty more to come.
Satisfying a bunch of corporate users who want bling phones isn't necessarily good for business. RIM did have their little whoopsie last year. But Apple are quite rubbish at software reliability, generally break at least one thing with every update and, as we've seen with iOS 5, are prone to making significant and unannounced API changes. Not exactly a good thing on which to bet business critical apps and functions.
Of course, Apple have managed to persuade people that they don't need things like battery life, good antenna performance and the ability to make phone calls reliably when moving. That's good PR but not good for end users. Apple may also succeed in convincing people and their businesses that they don't need reliable software or online/cloud services either. And that may indeed be fine, right up until your business is killed as a result of a really big cock-up. Do you trust Apple not to have one of those?
...An old timer pulling a double bluff.
I think we should be told.
Bluetooth could do it years ago: for free, using very little power, regardless of network coverage, nigh on universal, and it goes straight to the phone of the bloke you're talking to. Plus it defines the format of the contact data (vCard, so far as I know), which is presented on the Bluetooth link as being a contact. This allows the receiving phone to add it to the address book with little room for misinterpreting the data fields, merely prompting the user as to whether they want this to happen. Problem solved well over a decade ago.
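For what it's worth, the vCard payload a phone pushes over Bluetooth is just structured text, which is why the receiving handset can file it straight into the address book. A minimal sketch in Python of building one (the fields and values here are purely illustrative; real handsets wrap this in an OBEX object-push transfer):

```python
# Sketch of the vCard 2.1 text a phone sends as a "contact" object.
# Field values are made up for illustration.

def make_vcard(name: str, phone: str, email: str) -> str:
    # vCard lines are terminated with CRLF
    lines = [
        "BEGIN:VCARD",
        "VERSION:2.1",
        f"N:{name}",          # structured name: surname;given name
        f"TEL;CELL:{phone}",
        f"EMAIL:{email}",
        "END:VCARD",
    ]
    return "\r\n".join(lines) + "\r\n"

card = make_vcard("Smith;John", "+441234567890", "john@example.com")
print(card)
```

Because every field is labelled (TEL, EMAIL, and so on), the receiving phone doesn't have to guess which string is the number and which is the address - which is exactly why the "just email it" workaround is a step backwards.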
*If* a handset manufacturer has put the 'Bluetooth' label on the box then all of the Bluetooth stack should work properly. But Androids and iPhones were (still are?) definitely a bit dodgy in the whole Bluetooth area generally (don't know about WP7), mostly I suspect because they were in too much of a hurry to do the job to a high quality. You couldn't even use a hands free kit reliably, something that not even Steve Jobs would ever have been able to convince the world it doesn't need. But I digress. Solving manufacturer laziness by saying "Oh just email it" is a backward step: an acknowledgement that things are a little bit worse than they used to be, and it makes things far more complicated than they need to be for no good reason.
But you are right, it is a personal choice. Personally I'd far rather not have to type in someone's email address or phone number just so that I can swap contact details with them when I could just zap it directly into their phone with only a couple of button presses. I mean, once you've typed in their email address you've pretty much done the whole thing anyway. By way of analogy, when you're giving someone a business card you don't want to be writing it out long hand in front of them do you? Bluetooth is instant, so there's no awkward email/text delay and there's so little room for error.
The phone that is, not the fruit.
My old Symbian Sony Ericsson could too, and the millions of Nokias that preceded it.
Someone read the EULA?!?!
Regardless of what it used to say in the EULA, given how old Office is there was clearly no intent on MS's part to claim IPR ownership. They've never, so far as I'm aware, sued someone for royalties because they used a licensed version of Word. Maybe their reticence was because they were embarrassed that they'd made a legalese mistake; lawyers are supposed to be fully conversant in it.
It goes to show how useless the complex EULAs of the software world actually are when not even their originators fully understand their meaning...
Ah yes, NT for PowerPC. I saw it a few times, back in the days when the future was multiplatform-bright. It was running on embedded SBCs that the manufacturer was using instead of buying PCs... There must be many IT experts out there who are way too young to remember what was going on back then.
One has to be impressed by Intel for how well they managed to see off those platforms through being good at marketing and silicon processing. Perhaps they've now made the mistake of believing their own PR...
Yep, I agree with all that. The only thing that I'd add is that if MS can make the programming API the same as it always has been, then porting from x86 (little endian, 32 or 64 bit) to ARM (little endian, 32 and soon to be 64 bit) probably isn't so difficult.
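On the endianness point: because x86 and the ARM configurations in question are both little-endian, byte-level data layouts carry straight across, so a whole class of porting bugs from the big-endian era simply doesn't arise. A toy illustration (using Python's struct module to stand in for what fixed-width C types do; it has nothing to do with MS's actual toolchain):

```python
import struct

# Both x86 and the relevant ARM targets store the least significant
# byte first, so data serialised with little-endian, fixed-width
# layouts reads back identically on either architecture.

value = 0x12345678
packed = struct.pack("<I", value)      # explicit little-endian 32-bit
# least significant byte comes first in memory / on the wire
assert packed == b"\x78\x56\x34\x12"

unpacked, = struct.unpack("<I", packed)
assert unpacked == value
```

With byte order and word sizes matching, a recompile plus a fix-up of any inline assembly or compiler-specific bits is a plausible porting story, which is consistent with MS's early demos.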
I'm reminded of the demo MS did showing an early Win8 running on ARM with Office 10, printing to an ordinary Epson printer. The hints were that the printer driver and Office were simply recompiled and appeared to work well enough for MS to be confident about giving a public demo.
If MS have really pulled it off to that level that soon (the demo was ages ago), it might be reasonably easy for app devs to avoid the shades of WordPerfect and 1-2-3...
They should be.
It has been suggested / reported / deduced that MS and / or Apple are going to start doing proper ARM versions of their mainline OSes. Linux is, of course, already there. The OEMs (in MS's case) could then start manufacturing laptops with ARMs in them and offer customers a choice. There will always be users who genuinely need a lot of compute grunt, but most people would probably value battery life over raw horsepower.
That'd be great for end users, ARM, MS, Apple and the OEMs, but almost certainly not so good for Intel. Nor AMD.
From what I've seen of Macs being used by bio/med scientists, the features of ZFS will be utterly lost on most of them. I showed a bunch the miracle of network file sharing; it took a *lot* of explaining. Up to that point, to move files between Macs (all connected to the same LAN), they were using a large number of USB drives, and were forever running out of space on them. Ironically, didn't Apple beat MS to network shared disk space? Seems to have been a waste of time. Then I tried to get them to understand the wisdom of backups. Fair enough, they know cells inside out and I don't, but even so.
ZFS is, I suspect, way too complicated for the average Mac user to understand, let alone desire. A Mac is perhaps the last machine on earth you'd pick as a platform on which to store enough data to even begin to get ZFS interested. But I wish them luck.