Like some sort of upside-down Laffer Curve: don't price anything as mid-tier?
The most expensive non-build-to-order MacBook Pro 15.4" on store.apple.com is £2,699*. So that's £2,249.17 without tax. At the current exchange rate, that's the same as US$2,740.84.
The US price (also without tax) is $2,799.
* I didn't bother playing with the build-to-order options to get to El Reg's quoted £2,999, but it trivially goes well beyond that if you want: adding a 2TB SSD adds £1,080 immediately, for example. Getting to £2,969 was easy, but I couldn't hit the story's figure exactly.
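For what it's worth, the arithmetic above as a sketch, assuming the UK price includes VAT at 20% (the divisor is my assumption, not from the story):

```java
// Strips UK VAT from a sticker price and derives the exchange rate implied
// by the comparison above. Figures are the ones quoted in the comment.
class PriceCheck {
    static double exVat(double incVat) { return incVat / 1.20; }

    public static void main(String[] args) {
        double ukExVat = exVat(2699.00);          // ≈ £2,249.17
        double impliedRate = 2740.84 / ukExVat;   // ≈ $1.219 per £
        System.out.printf("£%.2f ex VAT, implied rate $%.4f/£%n", ukExVat, impliedRate);
    }
}
```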
I have no evidence but have always assumed that Sinclair's button-per-keyword was a means to avoid the space and time of writing a tokeniser; I guess it may partly also have been because the ZX80 can't run the display and process a key press simultaneously so one screen shake per word was preferable to one per letter? Then anything else he said about it was just marketing.
My understanding, at long arm's length, based on the literature and with no direct experience, is that the Z80 was often preferred to the 6502 because it had built-in DRAM refresh logic.
One of Sinclair's smart moves in the ZX80 and '81 was to use static RAM and repurpose the DRAM refresh counter as part of the video counter, saving some external logic. The cost was having to force the processor into a NOP/HALT cycle during pixel periods: exact refresh timing depends on the instruction stream, and that variability doesn't work when you're synchronously driving video. The program counter forms the other part and, thanks to NOP conveniently being 0x00, open-collector logic can easily steal the real value from the bus and force a NOP before the CPU sees it. I'm pretty sure the [proper, non-CPU-assisted] video fetches are used for DRAM refresh on the ZX Spectrum, but if you've already tooled up for the Z80, why change?
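A hedged sketch of the trick described above: during the forced-NOP run the hardware latches the character code off the bus, and the glyph byte is fetched during the M1 refresh cycle at an address built from the I register, that code, and the scan-line counter. Names and bit widths here are illustrative, not taken from a schematic:

```java
// Illustrative reconstruction of the ZX80/'81-style glyph fetch address.
class ZxVideo {
    static int glyphAddress(int iRegister, int charCode, int scanLine) {
        return ((iRegister & 0xFF) << 8)   // character-set base (I = 0x1E on the ZX81)
             | ((charCode & 0x3F) << 3)    // 6-bit character code, 8 bytes per glyph
             | (scanLine & 0x07);          // which of the glyph's 8 rows
    }
}
```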
I think the SID has just been emulated poorly historically because it's a digital-analogue hybrid with unknown analogue logic. So it's the combination of incompletely documented, expensive to emulate and slightly outside of the normal emulator author's core competencies. There are good soundalikes now though, despite the obstacles — e.g. reSID seems well-reviewed.
I'm not sure I agree. This isn't a case of somebody deciding that a store absolute plus a load absolute is six bytes and eight cycles while a store absolute plus a load immediate is five bytes and six cycles, so they'll do the less readable thing. It's a compiler just like any other compiler, except that compilation is just-in-time, and somebody didn't think hard enough to realise that constants you read from your processor may not be subjectively constant if processors are heterogeneous.
So as to the code generation itself, this is just a compiler doing exactly what compilers have always done. It isn't self modifying. It's one actor, and it's outputting another — not modifying it, and not modifying itself.
They've just messed up the announcement of completion.
Agreed; this developer happens to have taken offence at a particular company, many others have done basically the same thing with all the other companies, their colleagues, other software components, software patterns, actors, footballers, varieties of plant...
Other than suggesting, if true, that WhatsApp has a poor peer review system, the news is that at least one developer doesn't like Apple?
I'd have no problem with USB-C for power if they'd just add a damned second port, as Google and almost everybody else does. Otherwise the cabling costs are absurd as soon as you want to plug in an external monitor and absolutely any other accessory while charging. The USB-C passthrough is the first thing to go as soon as you buy anything but an absurdly priced monitor connector: most hubs either can't pass on power or can't pass on video.
Intel have actually been fairly good about this — one of the notable things about the original Basis product was the company's refusal to support either Google Fit or Apple HealthKit despite both being launched during its period as the sole product, while the sync software was under active development.
They initially launched the Peak with a similar refusal to support export of data to either of those platforms, or to any other, but relented. So if you have a Peak, you've quite possibly already exfiltrated your data, day by day, without much ado. You're not locked in.
The most offensive thing they've done is keep their server in the loop, so that you may get to keep your data but they get to keep it too. Which is slurping but not lock-in.
Given the full refunds, I think Intel has been fairly decent overall.
It was the first to do continuous heart rate measurement, and therefore was reasonably popular amongst a certain niche. The follow-up Peak inexplicably uses a completely different app that syncs to completely different services but I guess it's a safe bet that everything is being switched off together. So that's fairly sad, but at least I got three years or so. Better than most cloudy devices, I'll bet.
I did, yes — original comment withdrawn.
My personal preferential order is (i) paying for something outright; (ii) getting it for free with adverts; (iii) paying a subscription. So I'd probably still go for the 'free' option but, no, that isn't what I was originally alluding to. I had just somehow completely missed the second paragraph.
Can anyone speak to the alleged "gnarly" compatibility problems? Given that security is mentioned, I assume it's as simple as some of the relevant calls taking a pointer to a C string to which to write, and a previous guarantee that `wchar_t path[MAX_PATH];` would always be long enough, or something of that ilk?
Which is not yet to have mentioned the most telling part of the review:
I have seen two basic types of reactions. One, and there were rather few of these, were that of the unimpressed – but I guess they are just simply not as tech savvy as I and my fellow watch enthusiasts and nerds are.
See? If you're not excited by a ring with a watch in it then, well, it's just because you aren't smart or knowledgeable enough. The problem is definitely yours.
I find this same attitude pervades the smart watch discussion.
Agreed — for home users all that's really needed is to query resolution and colour format, then to post an image. There are enough places like the USB forum where such a thing could have been established that I can only assume there's a market reason that each manufacturer wants to spend the money writing and maintaining their own drivers. Do they really gain that much from trying to force their own storefronts upon people, given that they've already put DRM into the ink? I don't think there's still any money in selling the hardware so the convenient obsolescence probably isn't that handy?
Bug filed: file received was less than 600MB, did not announce supply levels in a creepy mechanical voice, did not attempt to redirect me to the manufacturer's website to purchase anything, appeared not to add anything to the system tray, had no effect on computer boot time, indeed did not appear to use my network connection at all. Clearly not a printer driver.
I'm nothing like an expert, but I once dated someone who was, and my recollection is that Asperger's didn't make it from DSM-IV to DSM-5 on account of being insufficiently well defined; in any case it was never strongly correlated with intelligence, which was never a contributor to the diagnosis anyway.
When I voted remain, I didn't do so on any single issue, I did so on the balance of issues — most prominently that there is little sovereignty lost outside of the imaginations of the tabloid press, that there are substantial financial and lifestyle advantages to membership, and that the democratic deficit that exists is insufficiently substantial to overcome those advantages.
Does it change my mind that the amount of my tax money that, eventually, goes towards sponsoring equipment for foreign armies will be negligibly more than it already is? No. See today's Chilcot Report findings for many of the reasons that I don't consider this, in relative terms, to be a big deal.
A remain voter here. I think the problem with the way the debate was held was that we all spent decades telling future out voters that they were a strange minority rather than explaining why somebody might hold the opposite opinion. There wasn't time to undo the amount of voter resentment that such a level of persistent arrogance had accumulated. Maybe there's 20% who'll never listen to rationality, but arguing with them isn't a waste of time because if you ignore them then before you know it 52% have heard and internalised their opinion. Ignore people you disagree with at your peril.
... and the nation's, in my opinion.
I think it's advisory in the same sense that the rule determining whom the Queen will pick as PM is advisory. Political reality always wins. Brexit could be avoided now only if a majority of people wanted the vote to be overridden and by some mechanism were able to say so — a second referendum, a snap general election, whatever. But I just don't see what convincing argument you could make that the majority of people want a rerun. The majority voted out.
As a voter for remain, I think Cameron et al have made a hash of the whole thing. An issue of such constitutional significance should never have been reduced to a binary decision and decided on by a simple majority. But those were the rules and the exit side won. So — in a democracy — that's that.
I guess that belittling those with concerns for forty years rather than addressing them, then telling them that they were talking about the wrong thing anyway, wasn't such a winning strategy?
Yeah, usually factor one = something you know, factor two = something you have. You need to know your username, password, other identifying information; you need to have a physical USB key, or the correct phone to receive a login code on, or a properly associated token generator. If your bank is like mine then login is one factor but adding a new account payee requires the second factor of a card reader and debit card.
Multiple pass-phrases is just an attempt to prevent you from using the same password as everywhere else, I'm guessing as I type, and entering the 3rd, 9th and 6th characters is probably a protection against key loggers?
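Continuing the guesswork, the "3rd, 9th and 6th characters" check might look something like this — asking for a few positions means a keylogger on one session never captures the whole phrase (the obvious downside being that the server must hold the phrase in recoverable form). Positions are 1-based, as banks usually present them:

```java
// Hypothetical partial-passphrase check, per the speculation above.
class PartialPassphrase {
    static boolean check(String phrase, int[] positions, char[] entered) {
        for (int i = 0; i < positions.length; i++) {
            // Reject out-of-range positions as well as wrong characters.
            if (positions[i] > phrase.length()
                    || phrase.charAt(positions[i] - 1) != entered[i]) return false;
        }
        return true;
    }
}
```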
Per the quarterlies, Apple sells between 4 and 6 million Macs per quarter, usually bringing in about 10% of their revenue, total company revenue being in the $40–60bn range.
So although they're clearly a niche product, I don't think the vanity label holds water. Compare and contrast with the watch...
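A back-of-the-envelope check on those figures (using mid-range values, so only a sanity test): ten per cent of roughly $50bn a quarter across roughly five million Macs implies an average selling price around $1,000 — a real business, not a hobby.

```java
// Rough average-selling-price arithmetic from the quarterly figures above.
class MacMaths {
    static double averageSellingPrice(double quarterRevenue, double macShare, double units) {
        return quarterRevenue * macShare / units;
    }
}
```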
Loose speculation has alleged that Apple may have started taking submissions to its application stores as 'bitcode' (think p-code or JVM or CLR byte streams at the point of submission, though Apple compiles to architecture-specific code prior to delivery) because it intends to use its established in-house chip designers to transfer their ARM skills to a laptop.
There's no reason to suppose such a thing, were it not just gross speculation, would be iOS rather than OS X, but hopefully it won't go the Windows RT route of being OS X with all the restrictions of iOS.
There's also a lot of solid evidence to suggest it wouldn't be a budget device by objective standards...
Indeed, the Nokia Lumia I own results from knowing that I'd be without a work phone for at least three months, having nothing else less than five years old, deciding to spend no more than $50 (without a contract), and deciding that if I'm buying whatever the network will sell me for $50 then I'd rather not have an Android because I know the network will have made the $50 Android into a piece of garbage. Conversely, Microsoft has the rules that all carrier-added applications must be uninstallable and that reskinning is not permitted, so all the network could do to my $50 Windows Phone was cost me a few minutes deleting AT&T-this and AT&T-that.
If I'd expected to have the phone for a year or more I'd probably have spent the then-price of $200 for the cheapest decent non-carrier-supplied unlocked Android. I guess it's even less now.
I was in the process of saying exactly the same thing.
If you (i) don't consider it much of a loss if somebody else accesses your LinkedIn account; and (ii) don't want to share your LinkedIn password with any other site because LinkedIn passwords might leak; then something both unique to the site and easy to remember is ideal.
Mine are usually slightly better than that but I am definitely guilty of having very little regard for the quality of passwords that I use for sites which have no privileged information about me whatsoever. What's the worst that can happen here? Somebody might delete or graffiti my online CV? Not only do I have it in various other forms but I'm pretty sure I could reconstitute it from nothing with fairly limited effort. It's not particularly difficult to remember which university I went to and the list of my employers since then.
EDIT: hasty update on this, per the haveibeenpwned.com suggestion above, my LinkedIn password has leaked. So I guess I'll change it. But it's hard to feel a sense of urgency.
I think they're politely not referencing competitors, to avoid lurid misrepresentations, but what they're describing sounds exactly like optionals in Ceylon or Swift, also available (opt-in, in both cases) in Java 8 and via Boost in C++. If so then the semantics are fairly easy: any reference/pointer that may be null explicitly says so, and there's some sort of single-statement construct for dereferencing arbitrarily many in sequence, or else getting some other result if any is null (depending on the language, possibly also optionally allowing an exception if any is null).
E.g., in no language in particular: `result = dictionary[helper.getAdaptor().getProperty()].or(defaultValue)`
...without having to test whether dictionary exists, then whether helper exists, then whether getAdaptor returns non-null, then whether getProperty returns non-null.
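The pseudo-line above maps fairly directly onto Java 8's opt-in Optional, as mentioned. Helper and Adaptor are hypothetical types invented purely so the example compiles, and the dictionary itself is assumed non-null:

```java
import java.util.Map;
import java.util.Optional;

class OptionalChain {
    static class Adaptor { String getProperty() { return "key"; } }
    static class Helper  { Adaptor getAdaptor() { return new Adaptor(); } }

    static String lookup(Helper helper, Map<String, String> dictionary, String dflt) {
        return Optional.ofNullable(helper)
                .map(Helper::getAdaptor)     // short-circuits if helper was null
                .map(Adaptor::getProperty)   // ...or if getAdaptor() returned null
                .map(dictionary::get)        // ...or if the key isn't present
                .orElse(dflt);               // one fallback covers every case
    }
}
```

Each `map` step simply skips its function when the value so far is absent, which is exactly the "dereference arbitrarily many in sequence or fall back" semantics.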