You're yawning inappropriately. I think what you meant to yawn about was another year, another stream of rumours months in advance about a bunch of potential incremental improvements, many of which probably aren't accurate anyway.
If your phone is locked then you can't join another network, though plenty exist. If your console is locked then you can't buy games from another source, of which none exist. It's a difference of cause and effect, but a real difference, at least when it comes to temporary, purpose-specific exceptions to existing laws.
Re: the PC model; I don't think the 3DO was the same thing at all. It was custom developed hardware with a single stationary target. So lack of subsidies was a real issue — compare and contrast with other consoles that typically cost quite a bit less than the hardware cost at launch, and with the PC that slowly gained steam as a gaming device over more than a decade.
I think the problem with the PC now is simply that you can buy a very expensive one, so some people do, and that moves the centre of gravity for games. PC gaming is likely to remain more expensive than console gaming because nobody can put their foot down and say 'there is no choice for anybody; all must stick with the older technology in order to reduce cost for new owners and to keep things simple'. For people that game on PCs, that's probably a good thing, as they retain the freedom to spend more on a better box if they want to.
The difference is as you say — with mobile phones they're usually subsidised but you explicitly pay the subsidy back over the course of your contract. There's also often no technical barrier to using them on another network.
Conversely, with consoles you get a subsidy and you can't point at the exact date and payment that paid off the subsidy. There are also no other commercial entities that could offer you an alternative service.
I guess the 3DO is an instructive example of why things work like that, being a console that wasn't subsidised, with a specification that anyone could implement and no licensing costs for games; net result: a $700 console that nobody bought. The PC model just didn't work. Consumers had a choice and preferred a subsidised console with more expensive games, embracing the PlayStation instead, which was also from a newcomer, so established businesses versus upstarts wasn't really a factor.
Based on the spec at https://wiki.ubuntu.com/MirSpec the objective is to have Mir sitting underneath Qt. So at least KDE should be in.
The implementation will need to be very good.
The central contradiction as I imagine it is that normally you move your eyes to look at something. If moving your eyes also moves the thing then you're getting very unnatural feedback.
If they solve that by having the scrolling occur only when you look at certain areas of the screen — e.g. look at the bottom to scroll down, return your eyes to the main portion to stop — then you have to judge when to stop based on peripheral vision.
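To sketch the dwell-zone idea in rough Python (the zone size, threshold and function names are all invented here, not taken from any real eye-tracking API), it might look something like this:

```python
# Hypothetical sketch of the 'look at the bottom to scroll' approach described
# above. Zone sizes, thresholds and names are invented for illustration only.

def scroll_step(gaze_y, screen_height, dwell_frames, dwell_threshold=10):
    """Return a scroll delta (in pixels) for one frame, given where the user
    is looking. Scrolling only starts after the gaze has dwelt in the bottom
    zone for a few frames, to avoid triggering on a stray glance."""
    in_bottom_zone = gaze_y > screen_height * 0.9   # bottom 10% of the screen
    if in_bottom_zone:
        dwell_frames += 1
    else:
        dwell_frames = 0                            # gaze back in the main area: stop
    delta = 4 if dwell_frames >= dwell_threshold else 0
    return delta, dwell_frames
```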
Supposing the people at Samsung have solved these problems then they deserve to be number one.
I can explain the theory. By reducing what the connector does to 'a serial bidirectional stream of data' you turn all possible external connectors into mere software extensions. Whatever you want to output must already be data within the device, so you export that data over the connector and let the cable worry about reformatting it.
In this case that appears to have put some sort of video codec into the loop between device and cable, leaving the cable to decompress video and run a framebuffer.
So what Apple has done, in contrast to the single-port Android phones, is made no concessions whatsoever to the two or three cables it's pretty obvious most people are going to use in real life right now. I guess the calculation was that the chips that have to go into the cables are going to be very cheap very soon and they need a connector that they can stick with for ten or more years in order to ensure accessory lock-in. There's probably also an argument that they've overreacted to the old connector having long disused pins for Firewire, still having what will very soon be obsolete pins for analogue video, etc, etc.
Are you sure you're not being a bit vague about the difference between civil and criminal law? And possibly about the difference between disclosure (ie, when a party becomes aware that evidence exists) and inspection (ie, when you hand over evidence that is already known to exist)?
Chad H and others above are correct. The point of the law is that people can't go to court and say 'I think The Ford Motor Company has been employing assassins to come round my house at night and move my bins; I demand they disclose appropriate paperwork on all contracts paid for in June so that I can check'. Such a system would quickly bring commerce to a halt as a million crackpots come out of the woodwork and insist on expansive document disclosure from every company they dislike.
Apple may be acting disingenuously — it's difficult to know without any idea of the cost of documentary disclosure (eg, maybe the engineers emailed each other about it on and off for months; who's going to trawl through and find that stuff, and redact anything not related to the topic that happened to be in the same thread of conversation and is commercially sensitive?) — but the principle is solid.
I guess it's a bit like the idea that if the police can't prove you performed a crime then you shouldn't go to jail (although softer, because that's criminal and this is civil); the point is to prevent harassment by people (or authorities) on a fishing trip.
Surely it depends how that SD card slot is wired up internally? It's normal for them to reside on the USB bus (just like the keyboard and trackpad, usually) so if a system is USB 2.0 only then the SD card slot is likely limited to that bandwidth.
In this case Sim City 2000 remains on sale, via gog.com, for a grand total of $6. So regardless of the illegality of abandonware, ripping this game off has no moral justification either.
More likely Samsung see the value in the idea and therefore think it's a useful* thing to add to their handsets. Because all it does is reproduce the well-established historical look of tickets on a digital screen they don't see any IP hurdle to including it.
I don't like the copying debate because it sort of endorses the idea that there's necessarily always something wrong in copying — that if any feature of your product has any close antecedent then you've no right to claim creativity. In this case it's clear that the Apple feature has inspired the Samsung one, but I don't see that there's anything wrong with that, especially if Samsung end up doing it better. On a technical level, Apple's implementation is very lacking, as every app that wants to put something into Passbook has to push it there, which in most apps means making your booking and then digging through submenus to find the 'put into Passbook' option. If I've booked something in a compatible app, the pass should just be there in Passbook; I shouldn't have to think about whether it's pushed or pulled and I definitely shouldn't have to do anything manually.
* Freudian typo: sueful.
So now I can do it for him.
Firefox already has an established developer community so I don't think there's too much risk of early abandonment. Mozilla has shown no desire to sell parts of their software, so I assume the licensing will be exactly like the browser — install the free package and there are no trade mark issues. That's unlike Android, where you can't provide some components or use the name without paying a fee.
I therefore think Firefox OS could make a play for the very low end: the area where unlicensed Android currently plays, but with the advantage that, if nothing underhand is going on, there's an opportunity for more legitimate companies to give it a more visible push.
It's not much of a chance but then I wouldn't have given Mozilla much of a chance in the early 2000s so I think it'd be foolish to write the thing off.
That wasn't a vote, it was a write-in campaign, and I think it was pretty harmless considering the first USS Enterprise dates from 1775 and took part in the War of Independence.
AmigaOS didn't even get a standard widget set until 1990 and never had protected memory. One therefore has to question the designation of 'best in class' when e.g. OS/2 supplied both in 1988.
Android, iOS and other handset OSes are designed so that the user never explicitly closes programs. That's why it isn't particularly intuitive on either of them — whether it's iOS's long press or Android's digging through the system settings. Thinking that you need to close programs is akin to a superstitious belief.
It is sad that people didn't want WebOS but it was pushed very heavily, with TV commercials featuring U2 and deliberate public spats with Apple (ending up in the USB Implementers Forum if memory serves) to get the bloggers on side. People simply didn't want it, because 'proper multitasking' actually isn't a feature they care about. The availability of Angry Birds, Temple Run, etc, is more interesting. Consumers are more interested in the total sum of what they can do with a device than with the technical way in which it is done.
I think you're being seriously paranoid — proprietary toolkits are no more "designed to lock you into a particular platform" than cars are designed to pollute the air or houses are designed to reduce the amount of public space.
I think it's the combination of having $137bn in reserves and shares that are about a third down from their 52-week high that is probably bringing people out of the woodwork. They probably feel either that they're owed a payout for loyalty or that it's worth chancing it anyway.
You mean in contrast to his love of all West Coasters, like the employees of Microsoft and Google?
SimCity 3000 wouldn't have been so bad but for the forced parades — any time you're doing reasonably well the citizens decide to hold a parade in your honour. In a move of fantastic wisdom the user can't skip the parades; you're forced to sit around at the slowest time scale until they stop. That just kills the whole experience as suddenly you're incentivised not to do too well.
The BBC version even made it onto the Electron, where you have to jump through even more hoops than usual to stop the display eating about a third of your available space, and where the CPU ends up running more slowly due to memory contention. That's where I first played it. Both versions use the same jarring four-colour palette though, if memory serves.
The boring historical version is that PostScript was the standard for high end printers, so several vendors built desktop platforms around PostScript as the description language for drawing on-screen rather than rolling their own versions of QuickDraw or GDI or whatever — Sun was one (with NeWS), NeXT was another. PostScript is a full programming language* and when adapting NeXTSTEP into OS X Apple looked at the licensing fees for the implementation NeXTSTEP had used and decided instead to keep the same primitive drawing semantics but do away with the language.
Separately, over at Adobe they designed PDF as a record of the output of a PostScript program (so, to spend storage in order to save on complexity, at least initially). So PDF also inherits the same primitive drawing semantics as PostScript.
That made it easy for Apple to add PDF rendering and print-to-PDF to its operating system, and all applications just work. There's no translation layer whatsoever; the drawing operations are just serialised and stored, or deserialised and performed. As iOS is a close relative of OS X, with exactly the same graphics operations and frameworks, the same stuff just naturally carried over.
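To illustrate what I mean by 'no translation layer' (this is just the concept in rough Python, not the actual Core Graphics / Quartz API): the same primitive drawing calls can either be performed immediately or recorded to a document and replayed later.

```python
# Conceptual sketch only: an illustration of drawing commands that can either
# be performed immediately or serialised to a document and replayed, which is
# essentially what 'PDF as a record of drawing operations' means.

import json

class RecordingCanvas:
    """Records drawing operations instead of performing them (the
    'print to document' path)."""
    def __init__(self):
        self.ops = []
    def move_to(self, x, y):  self.ops.append(("move_to", x, y))
    def line_to(self, x, y):  self.ops.append(("line_to", x, y))
    def fill(self, colour):   self.ops.append(("fill", colour))
    def save(self, path):
        with open(path, "w") as f:
            json.dump(self.ops, f)

def replay(path, target):
    """Deserialise the stored operations and perform them on any object with
    the same drawing methods (the 'render the document' path)."""
    with open(path) as f:
        for op, *args in json.load(f):
            getattr(target, op)(*args)
```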
The same is not true of Windows because, even once it was looking to do something beyond GDI, Microsoft insisted on inventing its own document format in XPS and built WPF, its modern drawing framework, around that.
(*) trivia: the original LaserWriter — a key component in the early desktop publishing revolution — had a CPU 50% faster than the Mac it was meant to be attached to because it had to do all that high resolution rasterising.
I can believe they still sell iPod Shuffles but that's about it. It's cheap, it's barely bigger than a button and there are no moving parts or screen to scratch, so you can take it jogging or to the gym without it being much of a disaster if you drop it or lose it. It's more expensive than the competition but that doesn't negate the market segment it's aimed at. And the iPod Touch is good for the kids, letting them have all the latest apps without a mobile contract.
The Nano and the Classic don't seem to have much purpose though.
Microsoft has raised the price of Office for the Mac to be the same as Office for Windows. Mac users aren't paying a premium.
That was my impression too; I was writing in response to Silverburn though I realise I was slightly ambiguous so: it was a MacBook Pro (ie, a 'professional' model) but had no trial versions of anything preinstalled. Not Office, not iWork, not anything.
I tend to prefer Pages over Word because it fits so much better into the OS and hence so much better into normal workflows. Last time I used Word it had not just its own keybindings as referenced by Quxy but its own dictionaries and its own text rendering — which was very heavily hinted and not pair kerned, like Windows XP used to be, so stuck out like a sore thumb. Before Pages I was using IBM Lotus Symphony, which was OpenOffice under a different UI and is now discontinued.
There's nothing missing or wrong with iWork that 90% of users would ever spot. So in practical terms it's 90% as good.
Weirdly it wasn't preinstalled on the Mac I bought recently; I'm not sure if that's because it was refurbished (though iPhoto, GarageBand, etc, were there).
Either the iPhone was revolutionary or augmented reality glasses aren't, as e.g. Vuzix will be on the market earlier and practical augmented reality itself is at least a decade old. Similarly, either the Mac was revolutionary or self-driving cars aren't, as e.g. Mercedes-Benz demonstrated one in the 1980s and even had one drive the normal autobahn from Bavaria to Copenhagen and back in the mid-90s.
I don't see Google attempting to imitate Apple in any sense beyond being in some of the same markets. Either both are revolutionary and transformative or neither is; you don't have to pick just one.
That's an advantage; I think the greater advantage if you asked most non-technical people is the user interface. To use an Oyster card or pay by an NFC-enabled debit card I just touch the thing against the sensor. In one fell swoop that identifies that I'm the person making the transaction and that I explicitly wish to proceed.
With anything that deliberately cuts out that need to put the one thing next to the other you have to start layering on apps and menus and so on. Then it stops being something that 90% of consumers would use themselves, let alone something they're happy about when the rush hour becomes even more congested as people stand around launching their applications to get into the tube.
And that's quite apart from the fact that a properly-implemented NFC solution could work regardless of whether the phone is charged.
You don't think "Actually I'm really good with numbers so I'm forced to assume that you're lying to me. Put Steve on the line." is the normal way to go?
£1800 is £1500 before VAT. Right now £1500 is US$2,330.25. The premium for buying in the UK is therefore only about 6%. It's not really worth applauding but these sort of stories usually seem to attract misinformation.
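For anyone who wants to check the arithmetic (the US$2,199 list price here is my assumption for illustration, and I've used the exchange rate implied by the figures above):

```python
# Rough check of the arithmetic, assuming 20% UK VAT and the quoted exchange
# rate. The US list price is an assumption for illustration only.

uk_inc_vat = 1800
uk_ex_vat = uk_inc_vat / 1.20          # ~1500 GBP before VAT
usd_equivalent = uk_ex_vat * 1.5535    # ~2330 USD at the rate quoted above
us_price = 2199                        # assumed US list price, ex sales tax
premium = usd_equivalent / us_price - 1
print(f"UK premium over US price: {premium:.1%}")   # roughly 6%
```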
Is it worth getting the machine at all? The extra pixels are actually fantastically useful because if you don't want to run at a pretend 1440x900 you can jack up the desktop resolution to a pretend 1920x1200 and due to the pixel density everything still looks perfectly sharp*. So you can finally fit a desktop worth of stuff onto a laptop screen.
It'd be nice if other manufacturers would follow that sort of lead but I guess we're going to have to wait for Microsoft to ditch the desktop completely (as Boot Camp shows it to scale very poorly but Metro-as-was to scale flawlessly) or for Google to make a Chrome move before we get anything usable.
(*) internally that's implemented as rendering the desktop at 3840x2400 and sampling down so the scaler is throwing away information rather than trying to guess it — always a much better position to be in.
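A rough sketch of that scaling arithmetic, for the 2880x1800 panel and the two 'pretend' modes mentioned above (illustrative only):

```python
# The desktop is rendered at twice the chosen 'pretend' resolution in each
# dimension and then sampled down to the physical panel.

PANEL = (2880, 1800)

def backing_store(pretend):
    w, h = pretend
    return (w * 2, h * 2)              # rendered at 2x in each dimension

for pretend in [(1440, 900), (1920, 1200)]:
    rendered = backing_store(pretend)
    scale = rendered[0] / PANEL[0]     # how much the scaler throws away
    print(pretend, "->", rendered, f"downsampled by {scale:.2f}x to", PANEL)
```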
If Samsung suddenly switched its devices to Bada then:
(i) developers would abandon them because a weird variant of C++ (Bada-custom collections, two-stage constructors, etc) is hardly attractive;
(ii) subsequently users would abandon them for the lack of Temple Run or whatever it is next month.
People aren't buying Samsung phones just because they like the word 'Samsung', they're buying them because they like Android — they just don't know what Android is and, as long as the phones continue being high quality, probably don't care.
Intel's been working on EFI since 1998 if we really want to get into it.
It's Forth based, right? So we're probably talking about a stack overflow?
There's no central authority instructing the nodes to act; they discover whether it's safe to broadcast through their own local observation. The lack of a centralised actor and the ostensible resulting chaos leads to a more efficient overall system.
I'm not a libertarian but I can see there's a reasonable argument in there.
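For what it's worth, the 'listen before you broadcast' idea is easy to sketch; this is purely illustrative and not any real protocol implementation:

```python
# Illustrative sketch of decentralised channel access: each node decides for
# itself, from local observation plus a random back-off, whether it is safe
# to transmit. No central authority involved.

import random

def try_to_send(channel_busy, backoff_slots=8):
    """One node's local decision: sense the channel, and if it's busy wait a
    randomly chosen number of slots before sensing again."""
    if not channel_busy():
        return "transmit now"
    wait = random.randrange(backoff_slots)   # random back-off stops every node
    return f"channel busy, retry after {wait} slots"  # retrying in lock-step
```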
I think the poster may be confusing refresh times and touch response times — the iPhone display has run with hardware acceleration and a 60Hz response rate since day one whereas the Android OS didn't mandate a GPU at first and versions prior to 3.0 did all drawing and updating, including scrolling, on the CPU.
There's definitely some response lag on iOS devices; I couldn't tell you exactly what it is but it's easy enough to discern if you try dragging or scrolling. Just watch exactly what you put your finger down on, then move it quickly and watch whatever is trying to track your finger always be a few milliseconds behind. It subjectively feels like a lot less than 100ms but is definitely more than a frame, and clearly more than the 1ms Microsoft has demonstrated in the lab. At 60Hz a single frame is about 16.7ms of lag, and it definitely feels like iOS is doing worse than that.
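The arithmetic behind 'more than a frame', for anyone who wants it:

```python
# At a 60Hz refresh one frame is about 16.7ms, so a pipeline that is a few
# frames deep adds several tens of milliseconds between touch and response.

for frames_of_lag in (1, 2, 3):
    latency_ms = frames_of_lag * 1000 / 60
    print(f"{frames_of_lag} frame(s) at 60Hz = {latency_ms:.1f} ms")
```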
Then I guess the question is: is it a hoax in the same way that leaked government initiatives are sometimes hoaxes — i.e. the most cost effective way of floating an idea before investing any money in it?
But isn't the reason the current one is being withdrawn from the EU that some aspect of it needs to be redesigned for compliance? Though I'll wager it'll be just to seal off the fans, especially as G5-style liquid cooling probably isn't something they'd want to attempt again.
I'll go a step farther and say I honestly don't see that Swartz would have approved of this move — what does a list of bankers have to do with JSTOR's remuneration to publishers rather than authors, to arbitrary and ridiculous sentencing limits or to the failure of the law to differentiate hacking penalties based on motive?
Anonymous continue to act as a group of attention-hungry children with no philosophy or ideology beyond enjoying a bit of bullying. Sometimes they may pick targets you personally don't like but that hardly absolves them.
Here in the US at least, the original, To Play the King and The Final Cut are all carried; at twelve episodes in total they're a very entertaining way to spend a weekend.
I haven't watched the Spacey version yet but I guess there'll be some severe adjustments, as if you wanted to ascend to President without winning a national election then you'd need, Ford-style, to be appointed Vice President and then have the President resign or die. It's happened exactly once, under exceptional circumstances — it's not at all like in the UK, where the PM only needs the support of the majority of his peers, giving us relatively frequent 'unelected' leaders like Callaghan, Major (at first) and Brown.
Apple hasn't said anything on the record, it's merely blocked some software with known security issues. You seem to be implying that to do so is criticism and that Apple should be allowed to criticise only if its own software is perfect, but if that's the standard then surely none of us can criticise Apple unless we've written only flawless software?
As noted above, and in its name, the relevant standard is international, emanating from an industry-recognised body based in Illinois. So the alternative position would have been "everyone in the industry uses this standard, but we know better because we're politicians". I suspect that position is more in disagreement with most people's political leanings than whatever you're accusing.
That shouldn't really be an application-level feature anyway — it should most naturally reside in the GDI (or whatever has supplanted it) according to my understanding of the Windows API as PDF documents are just another abstract canvas to paint to.
It's cheap to criticise; how would you define your anointed 'serious computing'? I can think of no distinction that doesn't either bar all computers more than about five years old (ie, based on processing capacity) or decree that only about 2% of the world takes part.
My feeling is that — even if you exclude leisure browsing — as tablets can do at least 90% of what people use computers for, they are computers. Just like an oven without a hob is still a kind of oven, a two-seater car is still a kind of car, a light aircraft is still a kind of aircraft, Heat is still a kind of magazine and Vin Diesel is still a kind-of actor.
But naturally you're not willing to tell us what that reason is or make any other arguments beyond a bare statement of your position?
It would seem to me that the use case Apple cite — AutoCAD — is quite real, even if rare; designs often need to be shown at sites and in meetings. Tablets (including but not limited to the iPad) have fully functional office suites for 95% of computer productivity tasks and have or are acquiring a bunch of the more specialist software, like Mathematica, DICOM viewers, first draft video editors, etc.
Tablets are serious computing devices, including the iPad.
I've seen some security researchers be quite concerned about the slow proliferation of Android updates too, and I'm pretty sure they're not motivated by iPhone fanaticism. The basic complaint is that differences between versions of the published source code are an authoritative documentation of security problems that Google recognised in the previous version, and if a serious security problem is found there's no point telling everyone to update their OS because a large number of them can't do that thanks to HTC-or-whomever.
Other than that I think you're right about choice, though the "years behind the competition" stuff is obviously a stretch. See e.g. the web browser — a pretty fundamental component. You could copy and paste in Android's as of April 2009. You could copy and paste in the iPhone's as of June 2009.
Ars helpfully did a poll — http://arstechnica.com/apple/2012/12/poll-technica-whats-your-preferred-ios-mapping-app/ — 32% of iPhone users prefer Apple's Maps, 52% Google's and the rest are mainly on Waze (6%) or 'other' (4%), with Nokia and Bing both also managing to break the 1% barrier.
Summary then: Google has already saved the day, though a third of people weren't bothered anyway — and this is amongst technically minded folk that see the day-in, day-out headlines.
My understanding of the standard Oric conversation is that somebody has to point out that...
When the ULA scans a video byte it's either an instruction to change the current two-colour palette or to output pixels. The net effect is that you have to leave a gap anywhere you want to change output colours. Teletext did a similar thing but got away with it because words naturally have gaps between them. Video memory was more compact but it was very hard to write multicolour games when compared to the other micros of the day.
(and access to the sound chip was only through the versatile interface adaptor, which was a further pain)
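If it helps, here's a deliberately simplified sketch of the serial-attribute idea; the encoding below is invented for clarity and is not the Oric's actual byte format, but it shows why every colour change costs a cell of screen space:

```python
# Simplified illustration of serial attributes: walking a display line left to
# right, an attribute cell changes the current two-colour palette but draws
# nothing, while a pixel cell draws in whatever palette is current.

def render_line(cells, ink="white", paper="black"):
    out = []
    for cell in cells:
        if cell["kind"] == "attribute":          # costs a visible blank cell
            ink = cell.get("ink", ink)
            paper = cell.get("paper", paper)
            out.append(("      ", ink, paper))   # the forced gap
        else:
            out.append((cell["pixels"], ink, paper))
    return out

# A colour change mid-line forces a blank cell between the two words:
line = [{"kind": "pixels", "pixels": "Oric-1"},
        {"kind": "attribute", "ink": "red"},
        {"kind": "pixels", "pixels": "Hello!"}]
print(render_line(line))
```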
The GSX extension to CP/M shipped in 1982, offering a hardware independent API for graphics. I think you're wading into ill-defined waters trying to talk about the first OS that supported bitmapped graphics; in the consumer market it's going to be one of a bunch of things that shipped with 8-bit micros. If you're looking for hardware independence then Acorn certainly have a shot, with the drawing primitives being OS calls rather than something implemented in the BASIC interpreter (which was a separate ROM and a separate piece of software), using a virtual resolution with subpixel precision (in that all drawing operations occurred at a conceptual 1280x1024 if memory serves, the available display modes being power-of-two divisors of that) and being suitably hardware independent as to work across the BBC, Electron and Archimedes.
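The virtual-resolution idea is simple enough to sketch; this is illustrative only, not the real Acorn MOS/VDU call interface:

```python
# Drawing calls are issued in a fixed 1280x1024 coordinate space and the OS
# maps them onto whatever the current screen mode actually provides.

VIRTUAL = (1280, 1024)

def to_physical(x, y, mode_resolution):
    """Map virtual coordinates to physical pixels for the current mode."""
    px = x * mode_resolution[0] // VIRTUAL[0]
    py = y * mode_resolution[1] // VIRTUAL[1]
    return px, py

# The same plot in virtual coordinates lands on different pixels per mode:
for mode in [(640, 256), (320, 256), (160, 256)]:
    print(mode, to_physical(640, 512, mode))
```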
And you've definitely got the Macintosh preceding Windows, the Xerox Star preceding that, etc, etc.
Track 41 was an option; others included deliberately malformed sectors (which couldn't be reproduced through the abstraction of a PC's floppy controller), oddly spaced sectors (which you'd time for after using a normal track for calibration), deliberately unformatted tracks and a host of other options.
I don't think the return you imagine is necessarily going to happen.
It's uncontroversial to say that for some people a tablet is a better device than a desktop/laptop. Those people will migrate one way and then not migrate back the other. So I guess the disagreement is: how many people is that?
I'd argue that it's a big number, being a large proportion of those that use a computer primarily for accessing the Internet. I further think that the people that just want to access the Internet are the reason that laptops have made their way into shops like Tesco, and that the £300 Tesco-level laptop is responsible for large volumes because it's so easily available and cheap enough compared to its perceived value to be an impulse buy.
So while businesses — not just technological but anything that involves document preparation or significant digital editing or anything like that, being pretty much all of them — and enthusiasts aren't going to migrate permanently to a tablet, a huge chunk of people are.
The 33% drop in share price is because the product has stopped being fashionable, the product being 'shares in Apple'.
The P/E ratio is still low, sales are up and revenues are growing. Apart from the lack of profit growth as highlighted by El Reg, I think there's also the psychological problem that Apple shares are no longer a sure fire thing for an investor.