Re: California Cool tossed by East Coast Partypoopers
You mean in contrast to his love of all West Coasters, like the employees of Microsoft and Google?
SimCity 3000 wouldn't have been so bad but for the forced parades — any time you're doing reasonably well the citizens decide to hold a parade in your honour. In a move of fantastic wisdom the user can't skip the parades; you're forced to sit around at the slowest time scale until they stop. That just kills the whole experience as suddenly you're incentivised not to do too well.
The BBC version even made it onto the Electron, where you have to jump through even more hoops than usual to stop the display eating about a third of your available RAM, and where the CPU ends up running more slowly due to memory contention. That's where I first played it. Both versions use the same jarring four-colour palette though, if memory serves.
The boring historical version is that PostScript was the standard for high-end printers, so several vendors built desktop platforms around PostScript as the description language for drawing on-screen rather than rolling their own versions of QuickDraw or GDI or whatever — Sun was one (with NeWS), NeXT was another. PostScript is a full programming language* and when adapting NeXTSTEP into OS X Apple looked at the licensing fees for the implementation NeXTSTEP had used and decided instead to keep the same primitive drawing semantics but do away with the language.
Separately, over at Adobe they designed PDF as a record of the output of a PostScript program (so, to spend storage in order to save on complexity, at least initially). So PDF also inherits the same primitive drawing semantics as PostScript.
That made it easy for Apple to add PDF rendering and print to PDF to its operating system and all applications just work. There's no translation layer whatsoever, the drawing operations are just serialised and stored or deserialised and performed. As iOS is a close relative of OS X, with exactly the same graphics operations and frameworks, the same stuff just naturally carried over.
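If it helps to picture it, here's a toy sketch of that idea in Python (my own illustration of a serialisable display list, not Apple's actual Quartz machinery):

    # Toy sketch: if every drawing call is just a record in a list, then
    # 'print to PDF' is nothing more than writing the list out, and
    # rendering is replaying it. Not Apple's real API, just the concept.
    import json

    class Canvas:
        def __init__(self):
            self.ops = []

        # Each drawing primitive appends a record rather than painting.
        def move_to(self, x, y):
            self.ops.append(("move_to", x, y))

        def line_to(self, x, y):
            self.ops.append(("line_to", x, y))

        def stroke(self):
            self.ops.append(("stroke",))

        def save(self, path):
            # Saving a document is just serialising the operations...
            with open(path, "w") as f:
                json.dump(self.ops, f)

        def replay(self, device):
            # ...and rendering is deserialising and performing them.
            for op, *args in self.ops:
                getattr(device, op)(*args)

    canvas = Canvas()
    canvas.move_to(0, 0)
    canvas.line_to(100, 100)
    canvas.stroke()
    canvas.save("drawing.json")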
The same is not true of Windows because, even once it was looking to do something beyond the GDI, Microsoft insisted on inventing its own document format in XPS and built WPF, its modern drawing framework, around that.
(*) trivia: the original LaserWriter — a key component in the early desktop publishing revolution — had a CPU 50% faster than the Mac it was meant to be attached to because it had to do all that high resolution rasterising.
I can believe they still sell iPod Shuffles but that's about it. It's cheap, it's barely bigger than a button and there are no moving parts or screen to scratch, so you can take it jogging or to the gym without it being much of a disaster if you drop it or lose it. It's more expensive than the competition but that doesn't negate the market segment it's aimed at. And the iPod Touch is good for the kids, letting them have all the latest apps without a mobile contract.
The Nano and the Classic don't seem to have much purpose though.
Microsoft has raised the price of Office for the Mac to be the same as Office for Windows. Mac users aren't paying a premium.
That was my impression too; I was writing in response to Silverburn though I realise I was slightly ambiguous so: it was a MacBook Pro (ie, a 'professional' model) but had no trial versions of anything preinstalled. Not Office, not iWork, not anything.
I tend to prefer Pages over Word because it fits so much better into the OS and hence so much better into normal workflows. Last time I used Word it had not just its own keybindings as referenced by Quxy but its own dictionaries and its own text rendering — which was very heavily hinted and not pair kerned, like Windows XP used to be, so stuck out like a sore thumb. Before Pages I was using IBM Lotus Symphony, which was OpenOffice under a different UI and is now discontinued.
There's nothing missing or wrong with iWork that 90% of users would ever spot. So in practical terms it's 90% as good.
Weirdly it wasn't preinstalled on the Mac I bought recently; I'm not sure if that's because it was refurbished (though iPhoto, GarageBand, etc, were there).
Either the iPhone was revolutionary or augmented-reality glasses aren't, since e.g. Vuzix will be on the market earlier and practical augmented reality itself is at least a decade old. Similarly, either the Mac was revolutionary or self-driving cars aren't, since e.g. Mercedes-Benz demonstrated one in the 1980s and even had one drive the normal autobahn from Bavaria to Copenhagen and back in the mid-90s.
I don't see Google attempting to imitate Apple in any sense beyond being in some of the same markets. I consider either both to be revolutionary and transformative or neither. You don't have to pick just one.
That's an advantage; I think the greater advantage if you asked most non-technical people is the user interface. To use an Oyster card or pay by an NFC-enabled debit card I just touch the thing against the sensor. In one fell swoop that identifies that I'm the person making the transaction and that I explicitly wish to proceed.
With anything that deliberately cuts out that need to put the one thing next to the other you have to start layering on apps and menus and so on. Then it stops being something that 90% of consumers would use themselves, let alone something they're happy about when the rush hour becomes even more congested as people stand around launching their applications to get into the tube.
And that's quite apart from the fact that a properly-implemented NFC solution could work regardless of whether the phone is charged.
You don't think "Actually I'm really good with numbers so I'm forced to assume that you're lying to me. Put Steve on the line." is the normal way to go?
£1800 is £1500 before VAT. Right now £1500 is US$2,330.25. The premium for buying in the UK is therefore only about 6%. It's not really worth applauding but this sort of story usually seems to attract misinformation.
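To show my working (a Python sketch; the US price of $2,199 is my assumption about what's being compared):

    # Checking the premium arithmetic. The $2,199 US price is my
    # assumption; the exchange rate is the one implied above.
    uk_inc_vat = 1800.00
    uk_ex_vat = uk_inc_vat / 1.20        # 20% VAT off: 1500.00
    usd_rate = 2330.25 / 1500.00         # the rate quoted: ~1.5535
    us_price = 2199.00                   # assumed US price, ex sales tax

    premium = uk_ex_vat * usd_rate / us_price - 1
    print(f"UK price in dollars: ${uk_ex_vat * usd_rate:.2f}")  # $2330.25
    print(f"Premium over the US: {premium:.1%}")                # ~6.0%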
Is it worth getting the machine at all? The extra pixels are actually fantastically useful because if you don't want to run at a pretend 1440x900 you can jack up the desktop resolution to a pretend 1920x1200 and due to the pixel density everything still looks perfectly sharp*. So you can finally fit a desktop worth of stuff onto a laptop screen.
It'd be nice if other manufacturers would follow that sort of lead but I guess we're going to have to wait for Microsoft to ditch the desktop completely (Boot Camp shows the desktop scaling very poorly, while Metro-as-was scales flawlessly) or for Google to make a Chrome move before we get anything usable.
(*) internally that's implemented as rendering the desktop at 3840x2400 and sampling down so the scaler is throwing away information rather than trying to guess it — always a much better position to be in.
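In numbers (a quick sketch of the arithmetic; the 2880x1800 panel on the 15-inch model is the only given here):

    # The 'pretend 1920x1200' arithmetic as I understand it. The native
    # 2880x1800 panel is the published spec; the rest follows from it.
    logical = (1920, 1200)    # the desktop size the user sees
    backing_scale = 2         # everything is rendered at 2x
    native = (2880, 1800)     # physical pixels on the panel

    backing = tuple(d * backing_scale for d in logical)   # (3840, 2400)
    ratio = backing[0] / native[0]                        # ~1.33
    print(f"Rendered at {backing}, scaled down {ratio:.2f}x to {native}")
    # Scaling down discards information rather than inventing it, which
    # is why the result still looks sharp.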
If Samsung suddenly switched its devices to Bada then:
(i) developers would abandon them because a weird variant of C++ (Bada-custom collections, two-stage constructors, etc) is hardly attractive;
(ii) subsequently users would abandon them for the lack of Temple Run or whatever it is next month.
People aren't buying Samsung phones just because they like the word 'Samsung', they're buying them because they like Android — they just don't know what Android is and, as long as the phones continue being high quality, probably don't care.
Intel's been working on EFI since 1998 if we really want to get into it.
It's Forth based, right? So we're probably talking about a stack overflow?
There's no central authority instructing the nodes to act; they discover whether it's safe to broadcast through their own local observation. The lack of a centralised actor, and the chaos that ostensibly results, leads to a more efficient overall system.
I'm not a libertarian but I can see there's a reasonable argument in there.
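For what it's worth, a minimal sketch of the idea in Python, essentially listen-before-talk with random backoff; every parameter is arbitrary:

    # Minimal sketch of decentralised medium access: each node decides
    # to broadcast purely from local observation plus a random backoff.
    # Illustrative only; the numbers here are arbitrary.
    import random

    def simulate(nodes=5, slots=20):
        backoff = {n: 0 for n in range(nodes)}
        for slot in range(slots):
            ready = [n for n in backoff if backoff[n] == 0]
            if len(ready) == 1:
                print(f"slot {slot}: node {ready[0]} broadcasts safely")
                backoff[ready[0]] = random.randint(1, nodes)
            elif len(ready) > 1:
                # Each node observes the collision locally and backs off
                # at random; no central authority is involved.
                for n in ready:
                    backoff[n] = random.randint(1, nodes)
            for n in backoff:
                backoff[n] = max(0, backoff[n] - 1)

    simulate()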
I think the poster may be confusing refresh rates and touch response times — the iPhone display has run with hardware acceleration and a 60Hz refresh rate since day one whereas the Android OS didn't mandate a GPU at first and versions prior to 3.0 did all drawing and updating, including scrolling, on the CPU.
There's definitely some response lag on iOS devices; I couldn't tell you exactly what it is but it's easy enough to discern if you try dragging or scrolling. Just watch exactly what you put your finger down on, then move it quickly and watch whatever is trying to track your finger always sit slightly behind it. It subjectively feels like a lot less than 100ms but is definitely more than a frame, and clearly more than the 1ms Microsoft has demonstrated in the lab. At 60Hz a frame is about 17ms, so a single frame of lag is the theoretical floor, and iOS definitely feels like it's doing worse than that.
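The back-of-envelope conversion, in case it's useful (the example latencies are guesses for comparison, not measurements):

    # How many 60Hz frames a given touch-to-display latency represents.
    FRAME_MS = 1000 / 60               # ~16.7ms per frame at 60Hz

    for latency_ms in (1, 17, 50, 100):
        frames = latency_ms / FRAME_MS
        print(f"{latency_ms:>4}ms = {frames:.1f} frames of lag")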
Then I guess the question is: is it a hoax in the same way that leaked government initiatives are sometimes hoaxes — i.e. the most cost effective way of floating an idea before investing any money in it?
But isn't the reason the current one is being withdrawn from the EU that some aspect of it needs to be redesigned for compliance? Though I'll wager it'll be just to seal off the fans, especially as G5-style liquid cooling probably isn't something they'd want to attempt again.
I'll go a step further and say I honestly don't see that Swartz would have approved of this move — what does a list of bankers have to do with JSTOR's remuneration of publishers rather than authors, with arbitrary and ridiculous sentencing limits, or with the failure of the law to differentiate hacking penalties based on motive?
Anonymous continue to act as a group of attention-hungry children with no philosophy or ideology beyond enjoying a bit of bullying. Sometimes they may pick targets you personally don't like but that hardly absolves them.
Here in the US at least, the original, To Play the King and The Final Cut are all carried; at twelve episodes in total they're a very entertaining way to spend a weekend.
I haven't watched the Spacey version yet but I guess there'll be some severe adjustments, as if you wanted to ascend to President without winning a national election then you'd need, Ford-style, to be appointed Vice President and then have the President resign or die. It's happened exactly once under exceptional circumstances — it's not at all like in the UK, where the PM only needs the support of the majority of his peers, giving us relatively frequent 'unelected' leaders like Callaghan, Major (at first) and Brown.
Apple hasn't said anything on the record, it's merely blocked some software with known security issues. You seem to be implying that to do so is criticism and that Apple should be allowed to criticise only if its own software is perfect, but if that's the standard then surely none of us can criticise Apple unless we've written only flawless software?
As noted above, and in its name, the relevant standard is international, emanating from an industry-recognised body based in Illinois. So the alternative position would have been "everyone in the industry uses this standard, but we know better because we're politicians". I suspect that position is more in disagreement with most people's political leanings than whatever you're accusing.
That shouldn't really be an application-level feature anyway — according to my understanding of the Windows API it should most naturally reside in the GDI (or whatever has supplanted it), as PDF documents are just another abstract canvas to paint to.
It's cheap to criticise; how would you define your anointed 'serious computing'? I can think of no distinction that doesn't either bar all computers more than about five years old (ie, based on processing capacity) or decree that only about 2% of the world takes part.
My feeling is that — even if you exclude leisure browsing — as tablets can do at least 90% of what people use computers for, they are computers. Just like an oven without a hob is still a kind of oven, a two-seater car is still a kind of car, a light aircraft is still a kind of aircraft, Heat is still a kind of magazine and Vin Diesel is still a kind-of actor.
But naturally you're not willing to tell us what that reason is or make any other arguments beyond a bare statement of your position?
It would seem to me that the use case Apple cite — AutoCAD — is quite real, even if rare; designs often need to be shown at sites and in meetings. Tablets (including but not limited to the iPad) have fully functional office suites for 95% of computer productivity tasks and have or are acquiring a bunch of the more specialist software, like Mathematica, DICOM viewers, first draft video editors, etc.
Tablets are serious computing devices, including the iPad.
I've seen some security researchers be quite concerned about the slow proliferation of Android updates too, and I'm pretty sure they're not motivated by iPhone fanaticism. The basic complaint is that the differences between versions of the published source code are authoritative documentation of the security problems that Google recognised in the previous version, so if a serious security problem is found there's no point telling everyone to update their OS because a large number of them can't do that thanks to HTC-or-whomever.
Other than that I think you're right about choice, though the "years behind the competition" stuff is obviously a stretch. See e.g. the web browser — a pretty fundamental component. You could copy and paste in Android's as of April 2009. You could copy and paste in the iPhone's as of June 2009.
Ars helpfully did a poll — http://arstechnica.com/apple/2012/12/poll-technica-whats-your-preferred-ios-mapping-app/ — 32% of iPhone users prefer Apple's Maps, 52% Google's and the rest are mainly on Waze (6%) or 'other' (4%), with Nokia and Bing both also managing to break the 1% barrier.
Summary then: Google has already saved the day, though a third of people weren't bothered anyway — and this is amongst technically minded folk who see the day-in, day-out headlines.
My understanding of the standard Oric conversation is that somebody has to point out that...
When the ULA scans a video byte it's either an instruction to change the current two-colour palette or to output pixels. The net effect is that you have to leave a gap anywhere you want to change output colours. Teletext did a similar thing but got away with it because words naturally have gaps between them. Video memory was more compact but it was very hard to write multicolour games when compared to the other micros of the day.
(and access to the sound chip was only through the versatile interface adaptor, which was a further pain)
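A simplified sketch of how serial attributes bite, in Python; this is the shape of the scheme rather than the exact Oric encoding:

    # Serial attributes, Oric-style: each video byte is EITHER a palette
    # change OR six pixels, so changing colour always costs a byte-wide
    # gap on screen. A simplification, not the exact Oric encoding.
    def decode_scanline(row_bytes, ink=7, paper=0):
        pixels = []
        for byte in row_bytes:
            if byte & 0x40:                  # pixel data: six pixels
                for bit in range(5, -1, -1):
                    pixels.append(ink if (byte >> bit) & 1 else paper)
            else:                            # attribute change
                command = byte & 0x3F
                if command < 8:
                    ink = command            # new foreground colour
                elif 16 <= command < 24:
                    paper = command - 16     # new background colour
                pixels.extend([paper] * 6)   # the gap you can't avoid
        return pixels

    # Pixels, then an ink change (which leaves a gap), then more pixels.
    print(decode_scanline([0x41, 0x02, 0x7F]))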
The GSX extension to CP/M shipped in 1982, offering a hardware-independent API for graphics. I think you're wading into ill-defined waters trying to talk about the first OS that supported bitmapped graphics; in the consumer market it's going to be one of a bunch of things that shipped with 8-bit micros. If you're looking for hardware independence then Acorn certainly have a shot: the drawing primitives were OS calls rather than something implemented in the BASIC interpreter (which was a separate ROM and a separate piece of software); they used a virtual resolution with subpixel precision (all drawing operations occurred at a conceptual 1280x1024 if memory serves, the available display modes being power-of-two divisors of that); and they were suitably hardware independent as to work across the BBC, Electron and Archimedes.
And you've definitely got the Macintosh preceding Windows, the Xerox Star preceding that, etc, etc.
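The virtual-to-physical mapping is simple division; a Python sketch, with the mode sizes below purely illustrative:

    # Sketch of Acorn-style virtual coordinates: drawing happens in a
    # conceptual 1280x1024 space and the OS divides down to whatever
    # the current mode offers. The mode sizes below are illustrative.
    VIRTUAL = (1280, 1024)

    def to_physical(x, y, mode_size):
        px = x * mode_size[0] // VIRTUAL[0]
        py = y * mode_size[1] // VIRTUAL[1]
        return px, py

    # The same MOVE/DRAW coordinates land proportionally in any mode.
    for mode in [(640, 256), (320, 256), (160, 256)]:
        print(mode, to_physical(640, 512, mode))   # centre of screen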
Track 41 was an option; others included deliberately malformed sectors (which couldn't be reproduced through the abstraction of a PC's floppy controller), oddly spaced sectors (which you'd time for after using a normal track for calibration), deliberately unformatted tracks and a host of other options.
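A sketch of how the timing variant works; the numbers are invented but the calibrate-then-measure shape is the point:

    # Timing-based protection check: calibrate against a normally
    # formatted track, then verify the deliberately odd sector spacing.
    # All timings below are invented for illustration.
    def check_protection(calibration_gaps, protected_gaps, expected):
        # Normalise against the calibration track so natural variation
        # in drive speed doesn't produce false negatives.
        unit = sum(calibration_gaps) / len(calibration_gaps)
        ratios = [g / unit for g in protected_gaps]
        return all(abs(r - e) < 0.1 for r, e in zip(ratios, expected))

    calibration = [1.0] * 10             # evenly spaced sectors
    protected = [0.5, 1.5, 0.5, 1.5]     # the odd spacing to verify
    print(check_protection(calibration, protected, [0.5, 1.5, 0.5, 1.5]))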
I don't think the return you imagine is necessarily going to happen.
It's uncontroversial to say that for some people a tablet is a better device than a desktop/laptop. Those people will migrate one way and then not migrate back the other. So I guess the disagreement is: how many people is that?
I'd argue that it's a big number, being a large proportion of those that use a computer primarily for accessing the Internet. I further think that the people that just want to access the Internet are the reason that laptops have made their way into shops like Tesco, and that the £300 Tesco-level laptop is responsible for large volumes because it's so easily available and cheap enough compared to its perceived value to be an impulse buy.
So while businesses — not just technological but anything that involves document preparation or significant digital editing or anything like that, being pretty much all of them — and enthusiasts aren't going to migrate permanently to a tablet, a huge chunk of people are.
The 33% drop in share price is because the product has stopped being fashionable, the product being 'shares in Apple'.
The P/E ratio is still low, sales are up and revenues are growing. Apart from the lack of profit growth as highlighted by El Reg, I think there's also the psychological problem that Apple shares are no longer a sure fire thing for an investor.
Our recollections obviously differ; I recall UIQ being a 'get the stylus out and prod at the scrollbar' experience just like Windows CE. It's not a technological step forward that Apple deserves any credit for inventing, but launching the iPhone OS only once it could assume a GPU by default was a massive gain for usability. It was immediately easy to run a 60 FPS user interface, removing another barrier between man and machine.
The UIQ machines, at least prior to the iPhone, were unaccelerated with the corresponding user interface lag.
In a lot of the press it's because the press releases make that comparison, and the press releases more often make that comparison because — as you imply — there are more people who want to make their camp look larger. It's also a much easier narrative.
The technical press probably do it entirely because firms that choose to support Android devices on their infrastructure really don't care whether there's TouchWiz or whatever on top, and people who make money through applications similarly either put resources into iOS or put them into Android. Writing an Android application for a Samsung phone is no different from writing one for an HTC phone.
I guess the main people who really want a firm-by-firm breakdown are investors, and they're the exception.
If I dare be contrary, the iPhone was judged as revolutionary because it was the first consumer device with a direct manipulation interface metaphor and because Apple cut sweetheart deals with the networks so that people who bought the iPhone got unlimited data where it generally wasn't available to anyone else for similar monthly rates.
So the difference was not technology but friendliness to the consumer — both in the interface and in the bill that came at the end of the month.
As I recall, quite a few mainstream reports correctly cited the original device's flaws: a slow network connection, no ability to install apps, a single day of battery life, no Exchange support, etc.
The media's willingness to report on the iPhone is also a good thing for everyone because it keeps Google on its toes and we're a free market economy. It's also nothing like an anomaly; if you compare the amount of press the iPhone gets to its installed base and then compare the amount of press Windows Phone gets to its installed base, you'll see that Apple's device is not the outlier.
It wasn't a vote; these are sales figures. The iPhone hasn't hung onto first place, it's reclaimed it. The driver of that appears to be the decision to keep the two previous generations around as cheaper models, from free on a contract.
Why not? Everyone loves Office 2007, Windows 8, GNOME 3, KDE 4, etc.
You can make money by entering the lottery but it's still not a healthy strategy for running a business. There is, as you say, quite a lot of leeway for redefining what your product is — if your product is radio licensing, touring and appearances then the recorded versions you give away for free are just viral advertisements — but it'd be disingenuous to argue a whole business model based on a tiny subset of available data points.
I don't think Google Play does generate all that much revenue — one of the notable differences between the Apple and Google ecosystems is that the latter tends to be more focussed on apps that are free at the point of delivery and then make money through in-app advertising or through selling additional content. And, of course, Google doesn't have any sort of requirement that they receive a cut of the latter.
Per the App Annie report that El Reg (and many others) wrote about in December, Google were seeing about 85% as many downloads as Apple but generating just a quarter of the revenue.
If you take whatever number that is, add Android advertising revenue and subtract development costs I can easily imagine the outcome being negative.
I think the problem is that you have to pick one company or another. The three biggest names in mobiles right now are probably Google, Samsung and Apple — if described at their worst, a personal information thief and wifi snooper, a convicted cartel member and a patent troll.
In those circumstances I think that someone with suitable technical skills buying a Google phone because it's most hackable and then going to the necessary extremes to remove the undesirable behaviour is understandable.
Specifically titles like Alone in the Dark, North and South, Hostages and Alpha Waves.
Not so much Stir Crazy Featuring Bobo, which I seem to remember acquiring only because Your Sinclair offered it as a freebie if you subscribed.
I think the poster's just saying that Ataris were pretty good regardless of the quality of Amigas.
The STE is also quite a bit better than the machine you're probably thinking of — it has a blitter and hardware PCM audio. Like Commodore, Atari released improved hardware as the years ticked by.
I passed the first Microsoft Store I've seen, in New York's Times Square, just the other day. I can't speak to sales totals but it was definitely packed. That said, I'd imagine it's difficult for anything next to Times Square not to be packed.
If Apple's Stores stopped selling anything? I don't think Apple would do anything because I think that'd be symptomatic of the end of Apple. The post-2000 Apple as a purely consumer company lives and dies on its ability to attract lifestyle purchasers. I'm not sure they'd be able to contract back to the design and technology-obsessed* niches they held last time things went bad.
(*) in that the PowerPC really was quite a bit faster than the Pentium for a long period in the late 90s and RISC snobbishness shouldn't be underestimated, and nowadays there's the 'it's also a fully certified UNIX' angle plus things like the retina display.
VP8 is going nowhere — that was true in 2010 and it's true now. There's no real incentive for hardware acceleration so there mostly isn't any hardware acceleration. You can't wrap VP8 in Flash (other than with an in-Flash software decoder) so there's no easy transitional compatibility. 99% of computer users already have a paid licence for H.264 that came with the OS (Windows, Mac, iOS) or the hardware (Android). Even if they didn't, it's currently free to implement for browsers and if VP8 were to make any headway then the MPEG-LA could just make it free for that use permanently. Given that it's also the Blu-ray, etc, standard, it'll probably always have better tools.
From the dirtier side of the business, the MPEG-LA has 18 companies that claim to have patents covering VP8, also available as insurance. That's probably just sabre rattling but you shouldn't bet your company on it.
If Google switched off H.264 on YouTube it'd just cut off most of the audience — such as anyone using Flash — and therefore most of Google's money.
Since Factortame there's been official recognition that certain statutes are of a constitutional nature, with the effect that they're not subject to implied repeal. The normal rule is that if one act says one thing and a later act says another then the later one wins, because an earlier Parliament can't bind a later one; however, if the earlier act is recognised by the courts as a constitutional statute then it overrides the later one unless the later explicitly says that the former doesn't apply.
Amongst the acts recognised as constitutional is the Human Rights Act. Since the ECHR, which the HRA incorporates, protects freedom of expression in Article 10, there are technically constitutional guarantees of freedom of expression even in the UK. They're explicitly subject to concerns about national security, public safety, etc, though, so a WBC-style organisation wouldn't be safe.
I don't even really agree with the doxing — as when The News of the World published lists of paedophiles, there's too much of a risk that an error will have identified the wrong person, or the message will get confused somewhere and someone not even identified by Anonymous will suffer. In general I don't support any similarly one-sided attempt to render justice; any system created by people is just too fallible.
What I am thoroughly in support of is the online petition mentioned in the article to get the WBC legally recognised as a hate group. Let's have any measures against this sort of disgusting activity administered by people that are accountable and subject to appeal.
There's really no ground on which you can give Nokia more credit for being amongst the first to ship a new kind of solid-state storage that they've started buying in than you can give Apple for being amongst the first to ship capacitive multitouch based on the technology of an entire company they bought and then funded for a few years.
Let's hear it for Micron and FingerWorks.
Then I guess the solution is to buy 'The ZX Spectrum ULA: How to Design a Microcomputer' (ISBN-10 0956507107; published in 2010 so still widely available) as the ULA is fully documented and imaged within. You could definitely build an entire new ZX Spectrum with that and even have the correct horizon on Aquaplane, the correct multicolour text on Uridium, etc.
There's no mark II; I've heard rumours of some sort of internal disagreement about fjords.