Re: Poor Apple.
Intel's been working on EFI since 1998 if we really want to get into it.
It's Forth based, right? So we're probably talking about a stack overflow?
There's no central authority instructing the nodes to act; they discover whether it's safe to broadcast through their own local observation. The lack of a centralised actor and the ostensible resulting chaos leads to a more efficient overall system.
I'm not a libertarian but I can see there's a reasonable argument in there.
I think the poster may be confusing refresh times and touch response times — the iPhone display has run with hardware acceleration and a 60Hz refresh rate since day one, whereas the Android OS didn't mandate a GPU at first and versions prior to 3.0 did all drawing and updating, including scrolling, on the CPU.
There's definitely some response lag on iOS devices; I couldn't tell you exactly what it is but it's easy enough to discern if you try dragging or scrolling. Just watch exactly what you put your finger down on, then move it quickly and watch whatever is trying to track your finger always be a few milliseconds behind. It subjectively feels like a lot less than 100ms but is definitely more than a frame, and clearly more than the 1ms Microsoft has demonstrated in the lab. At 60Hz a single frame is about 16ms, and it definitely feels like iOS is lagging by more than one frame.
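The frame arithmetic is easy to sketch. A minimal illustration, using only the refresh rates and latencies discussed above:

```python
# Quick frame-time arithmetic for the latency figures discussed above.

def frame_time_ms(refresh_hz):
    """Duration of one frame at a given refresh rate, in milliseconds."""
    return 1000.0 / refresh_hz

def frames_of_lag(lag_ms, refresh_hz=60):
    """How many frames a given latency spans at that refresh rate."""
    return lag_ms / frame_time_ms(refresh_hz)

print(round(frame_time_ms(60), 1))   # one frame at 60Hz is about 16.7ms
print(round(frames_of_lag(100), 1))  # a 100ms lag spans about 6 frames
```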
Then I guess the question is: is it a hoax in the same way that leaked government initiatives are sometimes hoaxes — i.e. the most cost effective way of floating an idea before investing any money in it?
But isn't the reason the current one is being withdrawn from the EU that some aspect of it needs to be redesigned for compliance? Though I'll wager it'll be just to seal off the fans, especially as G5-style liquid cooling probably isn't something they'd want to attempt again.
I'll go a step further and say I honestly don't see that Swartz would have approved of this move — what does a list of bankers have to do with JSTOR's remuneration to publishers rather than authors, to arbitrary and ridiculous sentencing limits or to the failure of the law to differentiate hacking penalties based on motive?
Anonymous continue to act as a group of attention-hungry children with no philosophy or ideology beyond enjoying a bit of bullying. Sometimes they may pick targets you personally don't like but that hardly absolves them.
Here in the US at least, the original, To Play the King and The Final Cut are all carried; at twelve episodes in total they're a very entertaining way to spend a weekend.
I haven't watched the Spacey version yet but I guess there'll be some severe adjustments: to ascend to President without winning a national election you'd need, Ford-style, to be appointed Vice President and then have the President resign or die. It's happened exactly once, under exceptional circumstances — it's not at all like in the UK, where the PM only needs the support of the majority of his peers, giving us relatively frequent 'unelected' leaders like Callaghan, Major (at first) and Brown.
Apple hasn't said anything on the record, it's merely blocked some software with known security issues. You seem to be implying that to do so is criticism and that Apple should be allowed to criticise only if its own software is perfect, but if that's the standard then surely none of us can criticise Apple unless we've written only flawless software?
As noted above, and in its name, the relevant standard is international, emanating from an industry-recognised body based in Illinois. So the alternative position would have been "everyone in the industry uses this standard, but we know better because we're politicians". I suspect that position is more in disagreement with most people's political leanings than whatever you're accusing.
That shouldn't really be an application-level feature anyway — it should most naturally reside in the GDI (or whatever has supplanted it) according to my understanding of the Windows API as PDF documents are just another abstract canvas to paint to.
It's cheap to criticise; how would you define your anointed 'serious computing'? I can think of no distinction that doesn't either bar all computers more than about five years old (ie, based on processing capacity) or decree that only about 2% of the world takes part.
My feeling is that — even if you exclude leisure browsing — as tablets can do at least 90% of what people use computers for, they are computers. Just like an oven without a hob is still a kind of oven, a two-seater car is still a kind of car, a light aircraft is still a kind of aircraft, Heat is still a kind of magazine and Vin Diesel is still a kind-of actor.
But naturally you're not willing to tell us what that reason is or make any other arguments beyond a bare statement of your position?
It would seem to me that the use case Apple cite — AutoCAD — is quite real, even if rare; designs often need to be shown at sites and in meetings. Tablets (including but not limited to the iPad) have fully functional office suites for 95% of computer productivity tasks and have or are acquiring a bunch of the more specialist software, like Mathematica, DICOM viewers, first draft video editors, etc.
Tablets are serious computing devices, including the iPad.
I've seen some security researchers be quite concerned about the slow proliferation of Android updates too, and I'm pretty sure they're not motivated by iPhone fanaticism. The basic complaint is that differences between versions of the published source code are an authoritative documentation of security problems that Google recognised in the previous version, and if a serious security problem is found there's no point telling everyone to update their OS because a large number of them can't do that thanks to HTC-or-whomever.
Other than that I think you're right about choice, though the "years behind the competition" stuff is obviously a stretch. See e.g. the web browser — a pretty fundamental component. You could copy and paste in Android's as of April 2009. You could copy and paste in the iPhone's as of June 2009.
Ars helpfully did a poll — http://arstechnica.com/apple/2012/12/poll-technica-whats-your-preferred-ios-mapping-app/ — 32% of iPhone users prefer Apple's Maps, 52% Google's and the rest are mainly on Waze (6%) or 'other' (4%), with Nokia and Bing both also managing to break the 1% barrier.
Summary then: Google has already saved the day, though a third of people weren't bothered anyway — and this is amongst technically minded folk that see the day-in, day-out headlines.
My understanding of the standard Oric conversation is that somebody has to point out that...
When the ULA scans a video byte it's either an instruction to change the current two-colour palette or to output pixels. The net effect is that you have to leave a gap anywhere you want to change output colours. Teletext did a similar thing but got away with it because words naturally have gaps between them. Video memory was more compact but it was very hard to write multicolour games when compared to the other micros of the day.
(and access to the sound chip was only through the versatile interface adaptor, which was a further pain)
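The serial-attribute scheme described above can be sketched in a few lines of Python. To be clear, the exact bit layout below is my assumption for illustration, not taken from Oric documentation; the point it demonstrates is that an attribute byte occupies a whole character cell showing only the background colour, which is the forced 'gap' at every colour change:

```python
# Hedged sketch of a serial-attribute video scheme like the one described
# above. The bit layout is assumed for illustration only: bit 6 set means
# 'bits 0-5 are six pixels'; bit 6 clear means 'bits 0-5 change the
# current two-colour palette', and that cell shows only the paper colour.

def decode_video_byte(byte, state):
    """Return six output colours for one screen byte, updating `state`."""
    if byte & 0x40:  # pixel data: bits 0-5, most significant pixel first
        pixels = [(byte >> bit) & 1 for bit in range(5, -1, -1)]
        return [state['ink'] if p else state['paper'] for p in pixels]
    # Attribute byte (assumed encoding): bit 3 selects ink vs paper.
    # The cell emits only paper -- hence the gap wherever colours change.
    value = byte & 0x07
    if byte & 0x08:
        state['paper'] = value
    else:
        state['ink'] = value
    return [state['paper']] * 6

state = {'ink': 1, 'paper': 0}
print(decode_video_byte(0x7F, state))  # six set pixels: six ink cells
print(decode_video_byte(0x02, state))  # attribute: sets ink=2, shows paper
```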
The GSX extension to CP/M shipped in 1982, offering a hardware independent API for graphics. I think you're wading into ill-defined waters trying to talk about the first OS that supported bitmapped graphics; in the consumer market it's going to be one of a bunch of things that shipped with 8-bit micros. If you're looking for hardware independence then Acorn certainly have a shot, with the drawing primitives being OS calls rather than something implemented in the BASIC interpreter (which was a separate ROM and a separate piece of software), using a virtual resolution with subpixel precision (in that all drawing operations occurred at a conceptual 1280x1024 if memory serves, the available display modes being power-of-two divisors of that) and being suitably hardware independent as to work across the BBC, Electron and Archimedes.
And you've definitely got the Macintosh preceding Windows, the Xerox Star preceding that, etc, etc.
Track 41 was an option; others included deliberately malformed sectors (which couldn't be reproduced through the abstraction of a PC's floppy controller), oddly spaced sectors (which you'd time for after using a normal track for calibration), deliberately unformatted tracks and a host of other options.
I don't think the return you imagine is necessarily going to happen.
It's uncontroversial to say that for some people a tablet is a better device than a desktop/laptop. Those people will migrate one way and then not migrate back the other. So I guess the disagreement is: how many people is that?
I'd argue that it's a big number, being a large proportion of those that use a computer primarily for accessing the Internet. I further think that the people that just want to access the Internet are the reason that laptops have made their way into shops like Tesco, and that the £300 Tesco-level laptop is responsible for large volumes because it's so easily available and cheap enough compared to its perceived value to be an impulse buy.
So while businesses — not just technological but anything that involves document preparation or significant digital editing or anything like that, being pretty much all of them — and enthusiasts aren't going to migrate permanently to a tablet, a huge chunk of people are.
The 33% drop in share price is because the product has stopped being fashionable, the product being 'shares in Apple'.
The P/E ratio is still low, sales are up and revenues are growing. Apart from the lack of profit growth as highlighted by El Reg, I think there's also the psychological problem that Apple shares are no longer a sure fire thing for an investor.
Our recollections obviously differ; I recall UIQ being a 'get the stylus out and prod at the scrollbar' experience, just like Windows CE. It's not a technological step forward that Apple deserves any credit for, but launching the iPhone OS only once it could assume a GPU by default was a massive gain for usability. It was immediately easy to run a 60 FPS user interface, removing another barrier between man and machine.
The UIQ machines, at least prior to the iPhone, were unaccelerated with the corresponding user interface lag.
In a lot of press it's because the press releases make that comparison, and the press releases more often make that comparison because — as you imply — there's more people that want to make their camp look larger. It's also a much easier narrative.
The technical press probably do it entirely because firms that choose to support Android devices on their infrastructure really don't care whether there's TouchWiz or whatever on top, and people who make money through applications similarly either put resources into iOS or put them into Android. Writing an Android application for a Samsung phone is no different from writing one for an HTC phone.
I guess the main people that really want a firm-by-firm breakdown are investors, which are the exception.
If I dare be contrary, the iPhone was judged as revolutionary because it was the first consumer device with a direct manipulation interface metaphor and because Apple cut sweetheart deals with the networks so that people who bought the iPhone got unlimited data where it generally wasn't available to anyone else for similar monthly rates.
So the difference was not technology but friendliness to the consumer — both in the interface and in the bill that came at the end of the month.
As I recall, quite a few mainstream reports correctly cited the original device's flaws: a slow network connection, no ability to install apps, a single day of battery life, no Exchange support, etc.
The media's willingness to report on the iPhone is also a good thing for everyone because it keeps Google on its toes and we're a free market economy. It's also nothing like an anomaly; if you compare the amount of press the iPhone gets to its installation base then compare the amount of press Windows Phone gets to its installation base you'll see that Apple's device is not the outlier.
It wasn't a vote, it's sales figures. The iPhone hasn't hung onto first place, it's reclaimed it. The driver of that appears to be the decision to keep the two previous generations around as cheaper models, from free on a contract.
Why not? Everyone loves Office 2007, Windows 8, GNOME 3, KDE 4, etc.
You can make money by entering the lottery but it's still not a healthy strategy for running a business. There is, as you say, quite a lot of leeway for redefining what your product is — if your product is radio licensing, touring and appearances then the recorded versions you give away for free are just viral advertisements — but it'd be disingenuous to argue a whole business model based on a tiny subset of available data points.
I don't think Google Play does generate all that much revenue — one of the notable differences between the Apple and Google ecosystems is that the latter tends to be more focussed on apps that are free at the point of delivery and then make money through in-app advertising or through selling additional content. And, of course, Google doesn't have any sort of requirement that they receive a cut of the latter.
Per the App Annie report that El Reg (and many others) wrote about in December, Google were seeing about 85% as many downloads as Apple but generating just a quarter of the revenue.
If you take whatever number that is, add Android advertising revenue and subtract development costs I can easily imagine the outcome being negative.
I think the problem is that you have to pick one company or another. The three biggest names in mobiles right now are probably Google, Samsung and Apple — if described at their worst, a personal information thief and wifi snooper, a convicted cartel member and a patent troll.
In those circumstances I think that someone with suitable technical skills buying a Google phone because it's most hackable and then going to the necessary extremes to remove the undesirable behaviour is understandable.
Specifically titles like Alone in the Dark, North and South, Hostages and Alpha Waves.
Not so much Stir Crazy Featuring Bobo, which I seem to remember acquiring only because Your Sinclair offered it as a freebie if you subscribed.
I think the poster's just saying that Ataris were pretty good regardless of the quality of Amigas.
The STE is also quite a bit better than the machine you're probably thinking of — it has a blitter and hardware PCM audio. Like Commodore, Atari released improved hardware as the years ticked by.
I passed the first Microsoft Store I've seen, in New York's Times Square, just the other day. I can't speak to sales totals but it was definitely packed. That said, I'd imagine it's difficult for anything next to Times Square not to be packed.
If Apple's Stores stopped selling anything? I don't think Apple would do anything because I think that'd be symptomatic of the end of Apple. The post-2000 Apple as a purely consumer company lives and dies on its ability to attract lifestyle purchasers. I'm not sure they'd be able to contract back to the design and technology-obsessed* niches they held last time things went bad.
(*) in that the PowerPC really was quite a bit faster than the Pentium for a long period in the late 90s and RISC snobbishness shouldn't be underestimated, and nowadays there's the 'it's also a fully certified UNIX' angle plus things like the retina display.
VP8 is going nowhere — that was true in 2010 and it's true now. There's no real incentive for hardware acceleration so there mostly isn't any hardware acceleration. You can't wrap VP8 in Flash (other than with an in-Flash software decoder) so there's no easy transitional compatibility. 99% of computer users already have a paid licence for H.264 that came with the OS (Windows, Mac, iOS) or the hardware (Android). Even if they didn't, it's currently free to implement for browsers and if VP8 were to make any headway then the MPEG-LA could just make it free for that use permanently. Given that it's also the Blu-ray, etc, standard, it'll probably always have better tools.
From the dirtier side of the business, the MPEG-LA has 18 companies that claim to have patents covering VP8, also available as insurance. That's probably just sabre rattling but you shouldn't bet your company on it.
If Google switched off H.264 on YouTube it'd just cut off most of the audience — such as anyone using Flash — and therefore most of Google's money.
Since Factortame there's been official recognition that certain statutes are of a constitutional nature with the effect that they're not subject to implicit repeal — the normal rule is that if one act says one thing and another says another then the later one wins because the earlier Parliament can't bind the later; however if the earlier is recognised as a constitutional statute by the court then it'll override the later unless the later explicitly says that the former doesn't apply.
Amongst those acts recognised as constitutional is the Human Rights Act. Since the ECHR, which the HRA incorporates, protects freedom of expression in Article 10, technically even in the UK there are constitutional guarantees of freedom of expression. Though they're explicitly subject to concerns about national security, public safety, etc, etc, so a WBC-style organisation wouldn't be safe.
I don't even really agree with the DOXing — like when the News of the World published its lists of paedophiles, there's too much of a risk that an error will have identified the wrong person or the message will get confused somewhere and someone not even identified by Anonymous will suffer. In general I don't support any similarly one-sided attempt to render justice; any system created by people is just too fallible.
What I am thoroughly in support of is the online petition mentioned in the article to get the WBC legally recognised as a hate group. Let's have any measures against this sort of disgusting activity administered by people that are accountable and subject to appeal.
There's really no ground on which you can give Nokia more credit for being amongst the first to ship a new kind of solid state storage that they've started buying in than you can give Apple for being amongst the first to ship capacitive multitouch based on the technology of an entire company they've bought and then funded for a few years.
Let's hear it for Micron and FingerWorks.
Then I guess the solution is to buy 'The ZX Spectrum ULA: How to Design a Microcomputer' (ISBN-10 0956507107; published in 2010 so still widely available) as the ULA is fully documented and imaged within. You could definitely build an entire new ZX Spectrum with that and even have the correct horizon on Aquaplane, the correct multicolour text on Uridium, etc.
There's no mark II; I've heard rumours of some sort of internal disagreement about fjords.
Surely CTC take a large chunk of the credit via the Datapoint 2200? They largely specified the 8008 instruction set and merely contracted it out to Intel. When Intel couldn't deliver on time they went with a TTL implementation, meaning that the very first commercial sale of a predecessor of the x86 architecture started in 1970 without Intel parts. As part of the contract termination negotiations, Intel got to keep the instruction set though at that point they'd never shipped a microprocessor — the 4004 wasn't available until late '71 and the 8008 itself wasn't completed until '72.
It feels to me that there are a lot of parallels between RIM now and Apple in 1997 — a product once emblematic of what has become a fundamental device category, and a resilient niche as a result, but marketshare that's been eroded almost down to nothing by later competitors with stronger offerings. So they're betting the house on a bought-in new operating system.
That being said, one reason so much has been written about Apple's recovery is that it was so improbable; there's probably no iPod in RIM's future. So I'm going to keep an open mind but not necessarily a great store of confidence.
The Phantom? That was almost ten years ago now. Based on a quick Google-powered memory refresh it sounds like they made a genuine attempt, showing prototypes and hiring ex-Microsoft staff from the DirectX and XBox teams, but seemed to have only the loosest possible plan for delivery.
It's not a fantastic dataset but looking at the Humble Bundle stats (because they're readily available), Mac users have been responsible for about 40% more purchases than Linux users. Though Linux have been willing to pay more on average so that's turned into Mac users supplying only about 10% more revenue.
Windows users were only about 63% of the buyers. Assuming that the Humble Bundle and the Steam crowd has a reasonably high degree of crossover (and there's a reasonable case for that given that the Humble guys provide Steam keys as part of the purchase in many cases), it would therefore seem that if you can take what would otherwise be a Windows release and make it a cross-platform release then you can add about 50% to your sales figures.
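The arithmetic behind that estimate, using the approximate shares quoted above (these are the rough figures from the post, not exact Humble Bundle data):

```python
# Rough arithmetic behind the cross-platform uplift estimate above.
windows_share = 0.63                # approximate fraction of buyers on Windows
other_share = 1.0 - windows_share   # Mac + Linux buyers combined

# Releasing cross-platform adds the non-Windows buyers on top of a
# Windows-only baseline:
uplift = other_share / windows_share
print(f"{uplift:.0%}")  # about 59%, i.e. in the 'roughly 50%' ballpark
```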
In an ideal world each party is so paranoid about the other two that they ensure all are barred from enforcing the patents in any capacity; the joint venture makes the patents effectively just go away. Though probably they'll just agree some sort of veto and the problem will be exactly what you describe.
Rather than pay $500m this quarter to fill the warehouses with stock they can't move, they're paying $195m this quarter and not having to deal with the overstocking problem. Then there's a definite $40m next quarter and a further $200m that isn't due until at least 2014.
I guess you've got to assume that the overstock problem would be a huge cost (with depreciating assets they can't shift) and/or that liquidity is a problem they're hoping will prove to be short term...
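Summing up the schedule described above (figures as quoted, in millions of dollars):

```python
# The deferred payment schedule described above, summed up.
upfront_alternative = 500  # $m: filling the warehouses this quarter
schedule = {'this quarter': 195, 'next quarter': 40, '2014 or later': 200}

total_deferred = sum(schedule.values())
print(total_deferred)                        # 435
print(upfront_alternative - total_deferred)  # 65: the nominal saving,
                                             # before valuing the deferral
```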
I think the issue with the badges is that they reward posting frequently and receiving upvotes without penalising downvotes. Therefore the best route to a badge is to post polemics as often as possible; every so often you're going to catch the mood and get 50+ upvotes rather than 50+ downvotes, and the repeated attempts will push up your total.
The price-earnings ratio is what the name says — the price of a share divided by the earnings for the share. Most companies hang out in the range 10-17, indicating slow trends in profitability or general stability. Higher values mean a company is overvalued or a significant jump in profits is expected; lower values mean the opposite.
Microsoft's P/E ratio is 14.5. Google's is 21.7. Samsung's is 10.8. Apple's is 12.4. Facebook's is 27.5.
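As a worked example of that definition (the helper and the per-share figures below are invented purely for illustration):

```python
# The price-earnings ratio, as defined above: share price divided by
# earnings per share. Figures here are made up for illustration only.

def pe_ratio(share_price, earnings_per_share):
    return share_price / earnings_per_share

def rough_reading(pe, low=10, high=17):
    """Crude classification following the rule of thumb above."""
    if pe < low:
        return 'low: undervalued, or profits expected to fall'
    if pe > high:
        return 'high: overvalued, or a jump in profits expected'
    return 'typical: slow trends or general stability'

print(round(pe_ratio(450.0, 36.3), 1))  # a hypothetical share: 12.4
print(rough_reading(12.4))              # typical: slow trends or ...
print(rough_reading(21.7))              # high: overvalued, or ...
```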
So probably all that's happening is that the expectation of no further big iPad-style new products is being priced in. There's no reason to expect an ongoing dramatic fall at present.
My gut reaction is to brand short selling as morally questionable, not so much because it gives one person a vested interest in the declining fortunes of another — that's the essence of the entire market, surely? — but because of the increased risk it creates of amplifying exceptional circumstances. The gains it allows individuals to make are insufficient compared to the risks of the whole system collapsing at once.
That said, history shows that banning short selling does nothing in the long term to stabilise the market (see e.g. http://blogs.wsj.com/marketbeat/2012/07/23/short-selling-ban-reeks-of-desperation/ ), so I'm not sure it has as much effect as I imagine and I'm happier for cases like this to be treated analogously to theft; the issue isn't so much that the dealer sold equities he didn't then own as that he sold equities there was a good chance he'd never be able to acquire. He made a bargain with someone else that he could fulfil only if it was in his favour. That it was shorting isn't the main issue.
I'm not sure iFixit is complaining so much as reporting, the main objective being consumer information.
That said, the lack of access is more frustrating than it would be with most other computers because the Mac options are so limited, and the one you're meant to mess around inside of hasn't been properly updated in a very long time.
My opinions come from the sum of not just my own experience but that of many other people I've spoken to. Naturally I accept that an aggregate anecdote is still just an anecdote and I'm likely running a biased and self-selecting survey — I'm sure the comments I'm making in social arrangements are highly leading — but it's not just me. Per the article "Since the government defines full employment as being an unemployment rate of 4 per cent [...] US-citizen STEM workers are essentially fully employed, and more STEM folks are needed."; so I'm not just imagining a paucity of available talent.
I also appreciate it's proven easy to score some kneejerk points at my expense so I'll be explicit: the effect of the market is that I'm being paid too much. The work I do doesn't justify the salary I receive. A reformed system would likely lower my salary. I'm in favour of reform.
On the contrary; if I were a Republican I'd believe that a free market would set appropriate wage levels. I wouldn't invest my faith in a market so artificially hobbled by immigration caps that both sides of the aisle agree that reform is necessary. If I were a Democrat I'd probably be concerned about the growing wage gap between those with a skill the immigration system does recognise and those without, created entirely by an artificial cap on availability of the former. If I were simply an American I'd be concerned that there's an area of the economy that could be more productive — producing more tax and benefitting everyone — but doesn't have enough labour to grow.
In practice I'm none of those things so to sum up the problem for you: an artificial government measure has created a distorted market. The market itself doesn't want the measure and I can't see why anybody else would either.
I don't agree with your reply because I don't agree that the market is functioning.
(@AC: no, but having had a look I do seem to agree with him on a lot of things)
... and the existing quota limits are extraordinarily problematic. Not only are we having problems finding employees at all but we're having seriously to overpay.
There's a bit of a vicious circle at play here — the number of visas on offer per year is capped at far too low a number. Hence if you intend to hire a foreign worker then you need to make sure that the application your company submits in order to help the worker obtain a visa goes through safely, first time. If it's rejected then it's very probable that there aren't any slots left by the time you get modified paperwork in.
To avoid any suspicion that the application is in any way dodgy you pretty much have to offer the new recruit the average salary. But that then turns the average salary into more like the minimum salary during that year's round of immigration as people already here realise what other employees are getting and what other employers have psychologically adapted to paying.
So the inflation rate on STEM salaries here is ridiculous; it'd be a natural effect of full employment in any normal sector but the only reason we're at full employment is that most of the world's talent is legally unavailable.
Stupidly, once you're already here the criteria change for advancing to a green card. If your job is actively advertised and nobody else can genuinely be found to take it then you just have to go through the [severely backlogged] array of paperwork for a few years.
What the US needs is to apply similar criteria to temporary visas as to green cards — the test needs to be worker availability, with no threshold. If economics are the worry, introduce a new visa that makes it easier to revoke authorisation in the event of a crash. To admit that there's full employment (ie, no further workers available) but then prevent anyone new from being brought in is tantamount to deliberate sabotage of the industry.
Your figures sound dodgy (and, prospective down voters, you'll want to stay with this until the end).
The article makes clear that the US contributes the largest proportion of the pot to both stores.
In the US the iPhone 5 recently outsold all Android devices combined (quite accurately reported as e.g. http://qz.com/31396/apple-outsells-android-in-the-us-for-what-could-be-the-last-time-ever/ ).
Conversely, earlier in the year Samsung had a two-to-one lead over Apple (e.g. http://www.christianpost.com/news/samsung-sold-double-apples-iphone-sales-thanks-to-galaxy-s3-78977/ ) so that's a highly seasonal trend.
Nevertheless, the reported story is exclusively about 2012 so the presumably temporary reversal of normal sales trends is quite relevant. My feeling is that the Android figures currently look proportionally worse (shortly after the iPhone 5 peak) than they will in, say, six months. If I were a developer making my decision solely on revenue trends I'd put more weight on Android than the bare numbers of this report suggest.
That hypothetical being said, I actually am a developer and can tell you that we consider iOS and Android to be equally important on the grounds that iOS earns us more money right now but the potential for user growth under Android is fantastic. In terms of being healthy not just now but five years from now I think you'd be stupid not to bet on both.