Re: @Flawless (@Sisk)
In this case Sim City 2000 remains on sale, via gog.com, for a grand total of $6. So regardless of the illegality of abandonware, ripping this game off has no moral justification either.
More likely Samsung see the value in the idea and therefore think it's a useful* thing to add to their handsets. Because all it does is reproduce the well-established historical look of tickets on a digital screen they don't see any IP hurdle to including it.
I don't like the copying debate because it sort of endorses the idea that there's necessarily always something wrong in copying — that if any feature of your product has any close antecedent then you've no right to claim creativity. In this case it's clear that the Apple feature has inspired the Samsung but I don't see that there's anything wrong with that, especially if Samsung end up doing it better. On a technical level, Apple's implementation is very lacking as every app that wants to put something into the passbook has to push it there, which in most apps means making your booking then digging through submenus to find the 'put into Passbook' option. If I've booked something in a compatible app, the pass should just be there in the Passbook; I shouldn't have to think about whether it's pushed or pulled and I definitely shouldn't have to do anything manually.
* Freudian typo: sueful.
So now I can do it for him.
Firefox already has an established developer community so I don't think there's too much risk of early abandonment. Mozilla has shown no desire to sell parts of its software so I assume the licensing will be exactly like the browser's: install the free package and there are no trade mark issues. That's unlike Android, where you can't ship some components or use the name without paying a fee.
I therefore think the Firefox OS could make a play for the very low end; the area where unlicensed Android currently plays but with the advantage that if nothing underhand is going on then there's an opportunity for more legitimate companies to supply a more visible push.
It's not much of a chance but then I wouldn't have given Mozilla much of a chance in the early 2000s so I think it'd be foolish to write the thing off.
That wasn't a vote, it was a write-in campaign, and I think it was pretty harmless considering the first USS Enterprise dates from 1775 and took part in the War of Independence.
AmigaOS didn't even get a standard widget set until 1990 and never had protected memory. One therefore has to question the designation of 'best in class' when e.g. OS/2 supplied both in 1988.
Android, iOS and other handset OSes are designed so that the user never explicitly closes programs. That's why it isn't particularly intuitive on either of them — whether it's iOS's long press or Android's digging through the system settings. Thinking that you need to close programs is akin to a superstitious belief.
It is sad that people didn't want WebOS but it was pushed very heavily, with TV commercials featuring U2 and deliberate public spats with Apple (ending up in the USB Implementers Forum if memory serves) to get the bloggers on side. People simply didn't want it, because 'proper multitasking' actually isn't a feature they care about. The availability of Angry Birds, Temple Run, etc, is more interesting. Consumers are more interested in the total sum of what they can do with a device than with the technical way in which it is done.
I think you're being seriously paranoid — proprietary toolkits are no more "designed to lock you into a particular platform" than cars are designed to pollute the air or houses are designed to reduce the amount of public space.
I think it's the combination of having $137bn in reserves and shares that are about a third down from their 52-week high that is probably bringing people out of the woodwork. They probably feel either like they're owed a payout for loyalty or that it's worth chancing it anyway.
You mean in contrast to his love of all West Coasters, like the employees of Microsoft and Google?
SimCity 3000 wouldn't have been so bad but for the forced parades — any time you're doing reasonably well the citizens decide to hold a parade in your honour. In a move of fantastic wisdom the user can't skip the parades; you're forced to sit around at the slowest time scale until they stop. That just kills the whole experience as suddenly you're incentivised not to do too well.
The BBC version even made it onto the Electron, where you have to jump through even more hoops than usual to stop the display eating about a third of your available space, and the CPU runs more slowly due to memory contention. That's where I first played it. Both versions use the same jarring four-colour palette though, if memory serves.
The boring historical version is that PostScript was the standard for high-end printers, so several vendors built desktop platforms around PostScript as the description language for drawing on-screen rather than rolling their own versions of QuickDraw or GDI or whatever — Sun was one (with NeWS), NeXT was another. PostScript is a full programming language* and, when adapting NeXTSTEP into OS X, Apple looked at the licensing fees for the Display PostScript implementation NeXTSTEP had used and decided instead to keep the same primitive drawing semantics but do away with the language.
Separately, over at Adobe they designed PDF as a record of the output of a PostScript program (so, to spend storage in order to save on complexity, at least initially). So PDF also inherits the same primitive drawing semantics as PostScript.
That made it easy for Apple to add PDF rendering and print to PDF to its operating system and all applications just work. There's no translation layer whatsoever, the drawing operations are just serialised and stored or deserialised and performed. As iOS is a close relative of OS X, with exactly the same graphics operations and frameworks, the same stuff just naturally carried over.
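As a toy illustration of that (not real PDF syntax or any Apple API, though the operator names loosely echo PDF's): the same primitive drawing calls can either be performed immediately or serialised as an operator stream, and a PDF content stream is essentially the latter, a record of drawing output rather than a translation.

```python
# Toy sketch: a canvas that records primitive drawing operations as a
# flat operator stream, which is roughly what a PDF content stream is.
# The operator mnemonics (m, l, S) echo PDF's but this isn't real PDF.

class RecordingCanvas:
    """Captures primitive drawing operations as a flat operator list."""
    def __init__(self):
        self.stream = []

    def move_to(self, x, y):
        self.stream.append(f"{x} {y} m")   # 'moveto'-style operator

    def line_to(self, x, y):
        self.stream.append(f"{x} {y} l")   # 'lineto'

    def stroke(self):
        self.stream.append("S")            # stroke the current path

    def serialise(self):
        # The 'document' is just the operations, replayable by any
        # renderer with the same drawing semantics.
        return "\n".join(self.stream)


canvas = RecordingCanvas()
canvas.move_to(0, 0)
canvas.line_to(100, 100)
canvas.stroke()
print(canvas.serialise())
```

With semantics this simple, "print to PDF" needs no translation layer at all; you just point the application's ordinary drawing calls at a recording target.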
The same is not true of Windows because, even once it was looking to move beyond GDI, Microsoft insisted on inventing its own document format, XPS, and tied WPF, its modern drawing framework, to that.
(*) trivia: the original LaserWriter — a key component in the early desktop publishing revolution — had a CPU 50% faster than the Mac it was meant to be attached to because it had to do all that high resolution rasterising.
I can believe they still sell iPod Shuffles but that's about it. It's cheap, it's barely bigger than a button and there's no moving parts or screen to scratch so you can take it jogging or to the gym without it being much of a disaster if you drop it or lose it. It's more expensive than the competition but that doesn't negate the market segment it's aimed at. And the iPod Touch is good for the kids, letting them have all the latest apps without a mobile contract.
The Nano and the Classic don't seem to have much purpose though.
Microsoft has raised the price of Office for the Mac to be the same as Office for Windows. Mac users aren't paying a premium.
That was my impression too; I was writing in response to Silverburn though I realise I was slightly ambiguous so: it was a MacBook Pro (ie, a 'professional' model) but had no trial versions of anything preinstalled. Not Office, not iWork, not anything.
I tend to prefer Pages over Word because it fits so much better into the OS and hence so much better into normal workflows. Last time I used Word it had not just its own keybindings as referenced by Quxy but its own dictionaries and its own text rendering — which was very heavily hinted and not pair kerned, like Windows XP used to be, so stuck out like a sore thumb. Before Pages I was using IBM Lotus Symphony, which was OpenOffice under a different UI and is now discontinued.
There's nothing missing or wrong with iWork that 90% of users would ever spot. So in practical terms it's 90% as good.
Weirdly it wasn't preinstalled on the Mac I bought recently; I'm not sure if that's because it was refurbished (though iPhoto, GarageBand, etc, were there).
Either the iPhone was revolutionary or augmented reality glasses aren't, as e.g. Vuzix will be on the market earlier and practical augmented reality is itself at least a decade old. Similarly, either the Mac was revolutionary or self-driving cars aren't, as e.g. Mercedes-Benz demonstrated one in the 1980s and even had one drive the normal autobahn from Bavaria to Copenhagen and back in the mid-90s.
I don't see Google attempting to imitate Apple in any sense beyond being in some of the same markets. I consider either both to be revolutionary and transformative or neither. You don't have to pick just one.
That's an advantage; I think the greater advantage if you asked most non-technical people is the user interface. To use an Oyster card or pay by an NFC-enabled debit card I just touch the thing against the sensor. In one fell swoop that identifies that I'm the person making the transaction and that I explicitly wish to proceed.
With anything that deliberately cuts out that need to put the one thing next to the other you have to start layering on apps and menus and so on. Then it stops being something that 90% of consumers would use themselves, let alone something they're happy about when the rush hour becomes even more congested as people stand around launching their applications to get into the tube.
And that's quite apart from the fact that a properly-implemented NFC solution could work regardless of whether the phone is charged.
You don't think "Actually I'm really good with numbers so I'm forced to assume that you're lying to me. Put Steve on the line." is the normal way to go?
£1800 is £1500 before VAT. Right now £1500 is US$2,330.25. The premium for buying in the UK is therefore only about 6%. It's not really worth applauding but this sort of story usually seems to attract misinformation.
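For anyone who wants to check the sums, here's the arithmetic; the US list price of $2,199 is my assumption for illustration and isn't one of the figures above:

```python
# Rough sketch of the arithmetic. The US list price of $2,199 is an
# ASSUMPTION for illustration; the other figures come from the post.
uk_price_inc_vat = 1800.0            # GBP, including 20% VAT
uk_price_ex_vat = uk_price_inc_vat / 1.20
assert round(uk_price_ex_vat) == 1500

usd_per_gbp = 2330.25 / 1500.0       # exchange rate implied by the post
uk_price_usd = uk_price_ex_vat * usd_per_gbp

us_list_price = 2199.0               # assumed US price, ex sales tax
premium = uk_price_usd / us_list_price - 1
print(f"UK premium over US: {premium:.1%}")   # UK premium over US: 6.0%
```

Note that the like-for-like comparison strips VAT from the UK price because the US figure excludes sales tax, which varies by state.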
Is it worth getting the machine at all? The extra pixels are actually fantastically useful because if you don't want to run at a pretend 1440x900 you can jack up the desktop resolution to a pretend 1920x1200 and due to the pixel density everything still looks perfectly sharp*. So you can finally fit a desktop worth of stuff onto a laptop screen.
It'd be nice if other manufacturers would follow that sort of lead but I guess we're going to have to wait for Microsoft to ditch the desktop completely (as Boot Camp shows it to scale very poorly but Metro-as-was to scale flawlessly) or for Google to make a Chrome move before we get anything usable.
(*) internally that's implemented as rendering the desktop at 3840x2400 and sampling down so the scaler is throwing away information rather than trying to guess it — always a much better position to be in.
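For the curious, the arithmetic works out like this (assuming the panel's native 2880x1800, which isn't stated above):

```python
# Sketch of the scaled-mode arithmetic described above, assuming a
# 2880x1800 native panel running a 'pretend' 1920x1200 desktop.
panel = (2880, 1800)
logical = (1920, 1200)

# The desktop is rendered at 2x the logical resolution in each axis...
backing = (logical[0] * 2, logical[1] * 2)
assert backing == (3840, 2400)

# ...then sampled *down* to the physical panel, so the scaler discards
# real information rather than having to invent it.
scale = backing[0] / panel[0]
print(f"render at {backing}, downsample by {scale:.2f}x to {panel}")
```

The downsample ratio is a constant 4:3, which is why everything stays sharp; the scaler never has to guess at detail it was never given.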
If Samsung suddenly switched its devices to Bada then:
(i) developers would abandon them because a weird variant of C++ (Bada-custom collections, two-stage constructors, etc) is hardly attractive;
(ii) subsequently users would abandon them for the lack of Temple Run or whatever it is next month.
People aren't buying Samsung phones just because they like the word 'Samsung', they're buying them because they like Android — they just don't know what Android is and, as long as the phones continue being high quality, probably don't care.
Intel's been working on EFI since 1998 if we really want to get into it.
It's Forth based, right? So we're probably talking about a stack overflow?
There's no central authority instructing the nodes to act; they discover whether it's safe to broadcast through their own local observation. The lack of a centralised actor, and the chaos that ostensibly results, leads to a more efficient overall system.
I'm not a libertarian but I can see there's a reasonable argument in there.
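As an illustrative toy (not any real MAC protocol), the listen-before-talk idea looks something like this: every node consults only its own view of the channel and backs off randomly when it hears someone else, with no scheduler anywhere.

```python
import random

# Minimal sketch of the decentralised scheme described above: each node
# decides for itself, from local observation only, whether the channel
# is free. No central authority schedules anyone. Illustrative toy,
# not a faithful model of any real protocol.

def try_broadcast(node_id, channel_busy, rng):
    if channel_busy:
        # Local observation says wait; back off a random interval so
        # competing nodes don't all retry in lockstep.
        return ("defer", rng.randint(1, 8))
    return ("send", 0)

rng = random.Random(42)
for node in range(3):
    action, backoff = try_broadcast(node, channel_busy=(node == 1), rng=rng)
    print(node, action, backoff)
```

The randomised backoff is the interesting bit: it's what lets the system avoid repeated collisions without anyone coordinating, which is the efficiency argument in a nutshell.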
I think the poster may be confusing refresh rates and touch response times — the iPhone display has run with hardware acceleration at a 60Hz refresh rate since day one, whereas Android didn't mandate a GPU at first and versions prior to 3.0 did all drawing and updating, including scrolling, on the CPU.
There's definitely some response lag on iOS devices; I couldn't tell you exactly what it is but it's easy enough to discern if you try dragging or scrolling. Just watch exactly what you put your finger down on, then move it quickly and watch whatever is trying to track your finger trail visibly behind. It subjectively feels like a lot less than 100ms but is definitely more than a frame, and clearly more than the 1ms Microsoft has demonstrated in the lab. At 60Hz a single frame is about 16.7ms of lag, and iOS definitely feels like it's doing worse than one frame.
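The frame arithmetic, for anyone who wants it:

```python
# Frame-time arithmetic for touch lag at a 60Hz refresh: one frame is
# roughly 16.7ms, so lag that is 'more than a frame but well under
# 100ms' sits somewhere in the low single digits of frames.
refresh_hz = 60
frame_ms = 1000 / refresh_hz
print(f"one frame at {refresh_hz}Hz = {frame_ms:.1f}ms")

for frames in range(1, 6):
    print(f"{frames} frame(s) of lag = {frames * frame_ms:.1f}ms")
```

So an eyeballed figure of, say, three or four frames behind your finger would put the lag around 50-65ms, which squares with "well under 100ms but obviously more than one frame".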
Then I guess the question is: is it a hoax in the same way that leaked government initiatives are sometimes hoaxes — i.e. the most cost effective way of floating an idea before investing any money in it?
But isn't the reason the current one is being withdrawn from the EU that some aspect of it needs to be redesigned for compliance? Though I'll wager it'll be just to seal off the fans, especially as G5-style liquid cooling probably isn't something they'd want to attempt again.
I'll go a step farther and say I honestly don't see that Swartz would have approved of this move — what does a list of bankers have to do with JSTOR's remuneration to publishers rather than authors, to arbitrary and ridiculous sentencing limits or to the failure of the law to differentiate hacking penalties based on motive?
Anonymous continue to act as a group of attention-hungry children with no philosophy or ideology beyond enjoying a bit of bullying. Sometimes they may pick targets you personally don't like but that hardly absolves them.
Here in the US at least, the original, To Play the King and The Final Cut are all carried; at twelve episodes in total they're a very entertaining way to spend a weekend.
I haven't watched the Spacey version yet but I guess there'll be some severe adjustments, as if you wanted to ascend to President without winning a national election then you'd need, Ford-style, to be in the line of succession and for the President and Vice President to resign or die. It's happened exactly once, under exceptional circumstances — it's not at all like in the UK where the PM only needs the support of the majority of his peers, giving us relatively frequent 'unelected' leaders like Callaghan, Major (at first) and Brown.
Apple hasn't said anything on the record, it's merely blocked some software with known security issues. You seem to be implying that to do so is criticism and that Apple should be allowed to criticise only if its own software is perfect, but if that's the standard then surely none of us can criticise Apple unless we've written only flawless software?
As noted above, and in its name, the relevant standard is international, emanating from an industry-recognised body based in Illinois. So the alternative position would have been "everyone in the industry uses this standard, but we know better because we're politicians". I suspect that position is more in disagreement with most people's political leanings than whatever you're accusing.
That shouldn't really be an application-level feature anyway — it should most naturally reside in the GDI (or whatever has supplanted it) according to my understanding of the Windows API as PDF documents are just another abstract canvas to paint to.
It's cheap to criticise; how would you define your anointed 'serious computing'? I can think of no distinction that doesn't either bar all computers more than about five years old (ie, based on processing capacity) or decree that only about 2% of the world takes part.
My feeling is that — even if you exclude leisure browsing — as tablets can do at least 90% of what people use computers for, they are computers. Just like an oven without a hob is still a kind of oven, a two-seater car is still a kind of car, a light aircraft is still a kind of aircraft, Heat is still a kind of magazine and Vin Diesel is still a kind-of actor.
But naturally you're not willing to tell us what that reason is or make any other arguments beyond a bare statement of your position?
It would seem to me that the use case Apple cite — AutoCAD — is quite real, even if rare; designs often need to be shown at sites and in meetings. Tablets (including but not limited to the iPad) have fully functional office suites for 95% of computer productivity tasks and have or are acquiring a bunch of the more specialist software, like Mathematica, DICOM viewers, first draft video editors, etc.
Tablets are serious computing devices, including the iPad.
I've seen some security researchers be quite concerned about the slow proliferation of Android updates too, and I'm pretty sure they're not motivated by iPhone fanaticism. The basic complaint is that the differences between versions of the published source code are authoritative documentation of the security problems that Google recognised in the previous version, and if a serious security problem is found there's no point telling everyone to update their OS because a large number of them can't do that thanks to HTC-or-whomever.
Other than that I think you're right about choice, though the "years behind the competition" stuff is obviously a stretch. See e.g. the web browser — a pretty fundamental component. You could copy and paste in Android's as of April 2009. You could copy and paste in the iPhone's as of June 2009.
Ars helpfully did a poll — http://arstechnica.com/apple/2012/12/poll-technica-whats-your-preferred-ios-mapping-app/ — 32% of iPhone users prefer Apple's Maps, 52% Google's and the rest are mainly on Waze (6%) or 'other' (4%), with Nokia and Bing both also managing to break the 1% barrier.
Summary then: Google has already saved the day, though a third of people weren't bothered anyway — and this is amongst technically minded folk that see the day-in, day-out headlines.
My understanding of the standard Oric conversation is that somebody has to point out that...
When the ULA scans a video byte it's either an instruction to change the current two-colour palette or to output pixels. The net effect is that you have to leave a gap anywhere you want to change output colours. Teletext did a similar thing but got away with it because words naturally have gaps between them. Video memory was more compact but it was very hard to write multicolour games when compared to the other micros of the day.
(and access to the sound chip was only through the versatile interface adaptor, which was a further pain)
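For flavour, here's a loose sketch in Python of how a serial-attribute scheme like that works; the byte values are illustrative rather than the Oric's exact encoding:

```python
# Loose sketch (NOT the exact Oric byte layout) of serial attributes:
# each video byte is either an attribute that changes the current
# two-colour palette or six pixels drawn in that palette, so a colour
# change always costs a byte-wide gap on screen.

def render_scanline(video_bytes):
    ink, paper = 7, 0          # current two-colour palette
    pixels = []
    for byte in video_bytes:
        if byte & 0x40:        # pixel byte: six pixels, MSB first
            for bit in range(5, -1, -1):
                pixels.append(ink if (byte >> bit) & 1 else paper)
        else:                  # attribute byte: repaint the palette,
            if byte & 0x20:    # but it displays as six paper pixels,
                paper = byte & 0x07   # hence the forced gap
            else:
                ink = byte & 0x07
            pixels.extend([paper] * 6)
    return pixels

# A palette change mid-line forces six blank pixels before the new ink:
line = render_scanline([0x7F, 0x02, 0x7F])   # pixels, 'ink=2', pixels
print(line)
```

The giveaway in the output is the run of paper-coloured pixels where the attribute byte sat, which is exactly the gap you have to design your graphics around.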
The GSX extension to CP/M shipped in 1982, offering a hardware independent API for graphics. I think you're wading into ill-defined waters trying to talk about the first OS that supported bitmapped graphics; in the consumer market it's going to be one of a bunch of things that shipped with 8-bit micros. If you're looking for hardware independence then Acorn certainly have a shot, with the drawing primitives being OS calls rather than something implemented in the BASIC interpreter (which was a separate ROM and a separate piece of software), using a virtual resolution with subpixel precision (in that all drawing operations occurred at a conceptual 1280x1024 if memory serves, the available display modes being power-of-two divisors of that) and being suitably hardware independent as to work across the BBC, Electron and Archimedes.
And you've definitely got the Macintosh preceding Windows, the Xerox Star preceding that, etc, etc.
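A sketch of the virtual-coordinate idea, with the listed mode sizes as illustrative examples rather than a complete catalogue:

```python
# Sketch of the hardware-independent coordinate scheme described above:
# drawing calls use a fixed virtual space (1280x1024, per the post) and
# the OS maps them onto whatever pixels the current mode provides.

VIRTUAL = (1280, 1024)

def to_device(x, y, mode_size):
    """Map a virtual-resolution point to pixel coordinates in a mode."""
    mw, mh = mode_size
    return (x * mw // VIRTUAL[0], y * mh // VIRTUAL[1])

# The same MOVE/DRAW coordinates land proportionally in any mode:
for mode in [(640, 256), (320, 256), (160, 256)]:
    print(mode, to_device(640, 512, mode))
```

The payoff is that a program plotting at the virtual centre point stays centred whether the mode has 640 pixels across or 160, which is what lets the same drawing code survive three generations of hardware.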
Track 41 was an option; others included deliberately malformed sectors (which couldn't be reproduced through the abstraction of a PC's floppy controller), oddly spaced sectors (which you'd time for after using a normal track for calibration), deliberately unformatted tracks and a host of other options.
I don't think the return you imagine is necessarily going to happen.
It's uncontroversial to say that for some people a tablet is a better device than a desktop/laptop. Those people will migrate one way and then not migrate back the other. So I guess the disagreement is: how many people is that?
I'd argue that it's a big number, being a large proportion of those that use a computer primarily for accessing the Internet. I further think that the people that just want to access the Internet are the reason that laptops have made their way into shops like Tesco, and that the £300 Tesco-level laptop is responsible for large volumes because it's so easily available and cheap enough compared to its perceived value to be an impulse buy.
So while businesses — not just technological but anything that involves document preparation or significant digital editing or anything like that, being pretty much all of them — and enthusiasts aren't going to migrate permanently to a tablet, a huge chunk of people are.
The 33% drop in share price is because the product has stopped being fashionable, the product being 'shares in Apple'.
The P/E ratio is still low, sales are up and revenues are growing. Apart from the lack of profit growth as highlighted by El Reg, I think there's also the psychological problem that Apple shares are no longer a sure fire thing for an investor.
Our recollections obviously differ; I recall UIQ being a 'get the stylus out and prod at the scrollbar' experience just like Windows CE. It's not a technological step forward that Apple deserves any credit for but launching the iPhone OS only once it could assume a GPU by default was a massive gain for usability. It was immediately easy to run a 60 FPS user interface, removing another barrier between man and machine.
The UIQ machines, at least prior to the iPhone, were unaccelerated with the corresponding user interface lag.
In a lot of press it's because the press releases make that comparison, and the press releases more often make that comparison because — as you imply — there's more people that want to make their camp look larger. It's also a much easier narrative.
The technical press probably do it entirely because firms that chose to support Android devices on their infrastructure really don't care whether there's TouchWiz or whatever on top or not, and people who make money through applications similarly either put resources into iOS or put them into Android. Writing an Android application for a Samsung phone is no different from writing one for an HTC phone.
I guess the main people that really want a firm-by-firm breakdown are investors, which are the exception.
If I dare be contrary, the iPhone was judged as revolutionary because it was the first consumer device with a direct manipulation interface metaphor and because Apple cut sweetheart deals with the networks so that people who bought the iPhone got unlimited data where it generally wasn't available to anyone else for similar monthly rates.
So the difference was not technology but friendliness to the consumer — both in the interface and in the bill that came at the end of the month.
As I recall, quite a few mainstream reports correctly cited the original device's flaws: a slow network connection, no ability to install apps, a single day of battery life, no Exchange support, etc.
The media's willingness to report on the iPhone is also a good thing for everyone because it keeps Google on its toes and we're a free market economy. It's also nothing like an anomaly; if you compare the amount of press the iPhone gets to its installation base then compare the amount of press Windows Phone gets to its installation base you'll see that Apple's device is not the outlier.
It wasn't a vote, those are sales figures. The iPhone hasn't hung onto first place, it's reclaimed it. The driver of that appears to be the decision to keep the two previous generations around as cheaper models, from free on a contract.
Why not? Everyone loves Office 2007, Windows 8, GNOME 3, KDE 4, etc.
You can make money by entering the lottery but it's still not a healthy strategy for running a business. There is, as you say, quite a lot of leeway for redefining what your product is — if your product is radio licensing, touring and appearances then the recorded versions you give away for free are just viral advertisements — but it'd be disingenuous to argue a whole business model based on a tiny subset of available data points.
I don't think Google Play does generate all that much revenue — one of the notable differences between the Apple and Google ecosystems is that the latter tends to be more focussed on apps that are free at the point of delivery and then make money through in-app advertising or through selling additional content. And, of course, Google doesn't have any sort of requirement that they receive a cut of the latter.
Per the App Annie report that El Reg (and many others) wrote about in December, Google were seeing about 85% as many downloads as Apple but generating just a quarter of the revenue.
If you take whatever number that is, add Android advertising revenue and subtract development costs I can easily imagine the outcome being negative.
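Making the back-of-envelope sums explicit, with the App Annie figures normalised against Apple's:

```python
# The comparison above made explicit, using the quoted App Annie
# figures: ~85% as many downloads on Google Play as on Apple's App
# Store, but only a quarter of the revenue. Values are normalised
# against Apple's, so these aren't absolute dollar amounts.
apple_downloads, apple_revenue = 1.00, 1.00
play_downloads, play_revenue = 0.85, 0.25

revenue_per_download_ratio = (
    (play_revenue / play_downloads) / (apple_revenue / apple_downloads)
)
print(f"Play earns {revenue_per_download_ratio:.0%} as much per download")
```

So on those figures each Google Play download is worth a bit under a third of an App Store download, before you've even subtracted development costs.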
I think the problem is that you have to pick one company or another. The three biggest names in mobiles right now are probably Google, Samsung and Apple — if described at their worst, a personal information thief and wifi snooper, a convicted cartel member and a patent troll.
In those circumstances I think that someone with suitable technical skills buying a Google phone because it's most hackable and then going to the necessary extremes to remove the undesirable behaviour is understandable.
Specifically titles like Alone in the Dark, North and South, Hostages and Alpha Waves.
Not so much Stir Crazy Featuring Bobo, which I seem to remember acquiring only because Your Sinclair offered it as a freebie if you subscribed.