It's the same on iOS, with the extra caveat that they've disabled pinch zoom so you can't even see the whole screen.
On the plus side, they've stopped reticking that 'Reopen windows when logging back in' button.
That's because the AC didn't agree with you. He said something positive about Tim Cook and his management of the supply chain. Your main point appears to have been a lazy jab at Steve Jobs. The AC's comments can't be read as the same thing with a different phrasing because Tim Cook and Steve Jobs worked together for more than a decade.
I'm pretty sure Acorn systems were sold in the US via Olivetti. I think the main reason that RISC OS never gained the staying power of Mac OS is that Apple did the graphical desktop four years before Acorn and so managed to grab niches in publishing and design that sustained them when Microsoft came along and did the GUI for everyone else. Acorn's educational niche wasn't sustainable because, as noted above, there's a lot of political meddling in education and it's easy to score points with 'business picked Microsoft, we should be training them on Windows'.
I guess it's a shame but the triumph of ARM makes it difficult to be very upset.
I was just about to post a message to the effect that going back to battery life as a boasting feature is the main advancement I'm looking for in mobile phones... but then I realised that eye tracking while the screen is on in order to make a value judgment as to whether you're still looking is a feature I'd actually really like. Especially when it's late at night and I've set my phone brightness to the absolute minimum (such that the about to go to sleep screen dimming isn't visible), I'm forever being annoyed by the phone just suddenly going to standby. Even with the small amount of light it's shining out at me in those circumstances it could probably still spot my eyes if it were trying.
I'm probably describing the experience of quite a lot of people on this site when I say that things worked the other way around for me: having access to computers from a young age is probably what put me into the top set for maths at school.
As a Sam owner and someone who has written software for it (no, nothing notable; my best effort) I think its problems were more about the spec than the launch.
There's no hardware scrolling and if you wanted to scroll the high quality display in its entirety it'd take four frames. So most of the then-current style of action and platform games are straight out unless you want to render them in the Spectrum graphics mode or the Timex-style Spectrum graphics mode but with separate attributes for each 8x1 block. Cue a slew of puzzle games.
With respect to the expected sales point re: the Spectrum, the paging scheme is entirely different from the 128k Spectrum so there's no way to run 128k games at all. That's in addition to the timing differences that make many Spectrum games fail to load (the Spectrum tape interface being essentially a 1-bit ADC that the CPU polls in carefully timed loops); and they declined to licence or otherwise replicate the Sinclair ROM so you're not getting even the Spectrum compatibility it can do out of the box.
Within a year of launch, prices were something like £200 for the Sam, £300 for the Atari ST. So at that point you're not even looking at good value for money, especially once software catalogues are factored in.
MGT were hobbled from the start by development budgets, I think. If you compare and contrast to the Atari Lynx of the same year, that had a quarter of the RAM but a faster CPU, a scaling blitter, a dedicated fixed point maths unit and a built-in LCD screen, for only about £130 — and that was before console manufacturers were in the habit of subsidising the hardware with future software sales.
Subjectively speaking though? I loved the little thing, and used it through to at least 1995. Both it and another I bought are likely still where I left them when I eventually went to university.
Supposing you're complaining primarily about tense, would you accept 'was designed by Sophie Wilson'? Citing people by their current names is quite normal, e.g. 'Elton John was born in 1947'.
Chaos? Xeno? Bruce Lee? Stop the Express? Nebulus? Driller? Exolon? Target: Renegade? Thrust? Splat? Turrican? Dan Dare? Wizball?
I'm confident those were all very good.
On the other hand, lots of places — such as El Reg — seem to consider it worth reporting without being particularly invested in either side, so you shouldn't just write off the factual content of the story on account of the person that broke it.
Further to what King Jack says, the Carl Zeiss lenses that Nokia have always pushed as a huge selling point have always been a bit of a joke as far as I'm concerned. Nobody who knows and cares about Carl Zeiss lenses is going to think that a fully automatic device with a point-and-shoot sized sensor is significantly constrained by lens quality, or that at the price points the devices are sold at this is anything more than a vanity labelling exercise; and people who just want to point and shoot aren't likely to have heard of Carl Zeiss.
Kodak is sad but was hardly inevitable — compare and contrast with the trajectory of Fujifilm, currently riding high on the well-received X series of enthusiast cameras.
Surely this'll end up looking really bad for Microsoft? As in: turn your company into a Microsoft shop and expect bankruptcy; try to sell WP7 and expect consumers to walk away.
Most of your down votes will be the trolling attempt at tying industry-wide problems uniquely to Apple, I think.
I would have been more scathing than merely not describing them as innovative. Ditto for the whole game. It was well crafted and fun, but far from innovative in any sense.
You'll all be thinking of GEOS rather than GEM, surely? GEM was the one that came with the Atari ST and was also available for all of the other 16-or-better bit machines from Digital Research, GEOS was — I think — the Commodore 64 one with the surprisingly complete set of applications. I've seen other 8 bit GUIs but never anything that went significantly beyond a basic proof of concept, with a calculator, text editor and nothing much else of production use.
It's a Silicon Valley bias thing; Tramiel dared to be over on the east coast, Sinclair, Sugar and Curry weren't even in the same country.
Tramiel was as important to the business as any of them. His cutthroat approach to price cutting can just as easily be cast as striving to include more features at the same price, hence pushing the industry forward.
Is that first manufacturer not Amazon? Albeit that they're frustratingly restricting their product to the US only for the time being. And that £200, exactly-iPad-sized Android tablet you can get through Argos that was reviewed here recently looked pretty good for us Brit types.
I have both devices in front of me right now. Prerendered content looks identical, as do apps that have yet to be adapted for the retina display.
If there's a complaint at all, it's that you quickly mentally adjust to the decent text rendering offered by [almost] all apps that render their text live, making the prerendered stuff look worse by comparison. It then looks equally bad on both devices.
Games that haven't been updated, like Angry Birds, look the same on both devices but even then my brain doesn't really notice anything particularly odd. It's really just the text where most people will be conscious of the difference.
Not on purpose; I just wrote an Elite clone and needed a scripting language, so I threw in a z80 emulator I'd already written, being one of the 300,000 people to have written a Spectrum emulator at some point. I'm aware this was an absurd way to write such a thing, but it was just a personal hobby for fun.
Anyway, the way I had things set up left every individual world entity with its own little 64kb address space and a personal z80. I then had some fun scripting them myself, then got bored and put it all away, being aware that games in which you program things are ten a penny, Elite clones aren't exactly rare and there was no reason anyone should care about yet another.
I'm sure Notch's effort will be top drawer though, and should be fun because it'll attract a whole bunch of other talented people.
As part of their desperate attempts to become relevant again c.1999, Apple built Java directly into OS X and made it an on-the-box feature. The OS hence not only could run standard Java apps exactly as if they were native but included a rich set of bindings so that you could write fully native apps directly with the native frameworks but in Java. Per its designers, Java descends more from Objective-C than from C++ so I guess Apple were positioning themselves to be able to go fully Java if the market embraced it, hence they needed direct control over the thing.
In the end the market chose Objective-C (though revisionists don't seem to remember it this way), Apple worked on advancing that and deprecated the native Java bindings after only a few versions and dumped the default inclusion of the Java runtime at all as of the current version. Cyberduck is the only big OS X app I'm aware of with a Java core, Neooffice/J having once also been quite popular but probably not so much since Open/LibreOffice went native.
It was quite stupid that Apple were still maintaining Java separately and more slowly, and this is exactly the sort of flaw that doing so has exposed. So it's good that they don't do that any more, though it's far from being Apple's only security problem.
There's actually an allusion to what is effectively ClearType in the Atari Lynx system manual, presumably because some marketing person wanted an excuse to claim three times the horizontal resolution. So it's an idea that was definitely out there in the ether long before Microsoft actually did something useful with it and during the RISC OS period.
You're right though — I don't think RISC OS actually used anything like that technology. I bet almost no-one ever even connected an Archimedes to a colour LCD screen during its production lifetime.
I think you may be behind the times — Android has been the most popular platform for a couple of years; that it now also has a simple majority of the market doesn't really change that, especially since the losses of RIM and Nokia are being sucked up by both the Android manufacturers and Apple. So far Apple hasn't lost any market share, it's just that Android phones have acquired it much more quickly.
I guess it's a comedown for the Apple-or-nothing set from the iPod experience, but they can just switch their attention to tablets for a couple of years.
They're turning their phones into Holgas without paying about £25 for a camera that was originally specifically designed to be profitable at something like 50p. And then they're saving money on not having to find somewhere to develop all that lovely medium format film.
Why do they want a Holga in the first place? Just for fun, I imagine.
A friend of mine once had a phone that advertised on the box the fact that the screen could be used as a compact mirror while switched off. I almost bought one myself, just to reward that level of gall!
I think you're conflating vibrancy and accuracy. I will say for the benefit of fairness that Apple's screens are much better than the industry average for colour reproduction, but the shiny coating seems to be a consumer-oriented attempt to give the colours extra pop rather than an attempt better to please people who concern themselves with colour spaces.
I know professional photographers who work directly on Apple screens for their entire production line but when I was in publishing it was more common for companies with a strong interest in colour correctness to buy monitors worth at least two or three times the cost of a Mac, to connect to their Macs.
This is one of those areas where I keep hoping consumer-priced machines will make progress but it seems consumers don't care about gamuts so there's no real reason for manufacturers to expend the effort.
I should expect so too, since it's just incompetence on Facebook's side. On iOS the keychain exists exactly so that developers can store information securely without having to know anything about the topic themselves, and I'd be extraordinarily surprised if there's no similar API in Android.
Facebook's developers have simply been lazy.
(1) you don't need to click to bring up the menu bar, just mouse up to where the menu bar normally is;
(2) the dock does appear available on auto-hide, just mouse down to where the dock normally is.
• exactly as on every other desktop in the world, not every app can go full screen. I wouldn't agree that having a flag to indicate whether an app can go fullscreen, with a default value of 'off' given that fullscreen wasn't previously available, is "the stupidest implementation"; I'd rather say it was exactly the correct implementation.
• part of your argument appears to be that the implementation is broken because it took seven iterations to appear. I'm not sure that stands up to logical inspection, though if it helps then it actually took almost thirty years to appear since the classic OS didn't have a full-screen option either. Which presumably means that the implementation that did appear is even worse?
Naturally I appreciate you'll get upvotes and I'll get downvotes because the audience here is anti-establishment and I'm defending a hugely profitable and hugely arrogant company that is often harmful to the industry.
What can you possibly have against the Mac's implementation of full-screen apps? You press the relevant button, the app goes full screen. Individual apps get individual virtual screens so you can three-finger swipe between them (or use control + cursors if you're a keyboard person). Care to enlighten us on the flaws in that?
As for MSVC 2011, I don't really see what the uproar is about. I've had no problems finding any of the supplied tools (easily, without extended hunting) and if anything the fact that colour is now reserved for content I'm actually working on has made the overall display much clearer and easier to work with.
I suggest you reread my post and save your spleen for the many instances where people actually complain about price differences.
In the US the price is $139 (the $99 is the ad-supported version). $139 is £87.13. £87.13 + VAT is £104.56. So Amazon are charging pretty much exactly the same price in the UK as in the US.
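For anyone who wants to check the sums, here's the arithmetic as a quick Python sketch. The exchange rate isn't from anywhere official; it's just the rate implied by the two figures above:

```python
# Kindle pricing: US price converted at the implied rate, then UK VAT added.
usd_price = 139.00
gbp_per_usd = 87.13 / 139.00       # implied exchange rate, not an official one
vat_rate = 0.20                    # UK VAT at time of writing

ex_vat = usd_price * gbp_per_usd   # £87.13
inc_vat = ex_vat * (1 + vat_rate)  # ≈ £104.56

print(round(ex_vat, 2), round(inc_vat, 2))
```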
Besides the obvious points about taking the iPhone rumours with a pinch of salt, and the fact that smaller internal components mean bigger batteries, larger screens only mean a larger top surface area. The bragging game of who can shave a further 0.2mm off the depth seems to continue apace. So there's still quite a lot of effort to get them smaller by volume.
Per standard money gouging practice, the iPhone and iPad versions are different builds and therefore probably have different Game Centre entries. Assuming that's correct you're probably looking at between 2 and 3 million on the iOS side, possibly even more.
It's also got a front-page banner and is number two in the charts on the Mac App Store, which probably makes for a sizeable part of another million or so.
I'm going to guess that Android contributes about the same amount as the iOS ports or slightly more because there are a lot more Android phones than iPhones and the app is free, but on the other hand you've got the iPad and the iPod Touch working against the prima facie numerical advantage of the platform.
That leaves a few million for the PC, which I find difficult to believe because of the limited distribution channels (they seem not to be in Steam, for example) but not hugely unlikely given that everything above is just a massive guess.
You've got to love straw man logic;
"OMG they pointed out that the plucked-out-of-the-air £400 was a misrepresentation; they must be saying that money doesn't matter!"
Takeaway conclusions: the Viewsonic looks lovely, the iPad isn't as much more expensive as some people seem to think even though it is much more expensive in relative terms, transparent attack hounds make Internet comment boards boring.
If I dare stick my head above the parapet, I reckon more El Reg readers are likely to be interested in mobile games than in console or PC games. My reasoning being that the percentage of readers who don't own a smartphone of some variety is going to be significantly smaller than the percentage that use Linux or a Mac, and mobile games are almost always multi-format whereas PC games are almost always exclusively for Windows (albeit sometimes with a Mac port a little later).
Conversely, if El Reg are listing the ten games worth playing but most easily overlooked then I'd actually have expected the minority platforms to be better represented. You're more likely to miss a game on a lesser used platform.
It's a bit of a fiction; the point is that the nominal 144 DPI graphic is double the nominal 72 DPI that the existing graphics declare. The Mac I'm typing this on now has a widescreen 1440x900 15.4" screen, for a lower-than-2002 110 DPI, but still significantly more than 72. I also make it 220 DPI on the 15.4" in the article (approximately 3396 pixels along the diagonal of 15.4") but whatever.
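If anyone wants to redo the density sums, it's just Pythagoras over the pixel grid; a quick Python sketch of the two panels mentioned above:

```python
import math

# Pixels per inch: diagonal pixel count divided by diagonal length in inches.
def dpi(width_px, height_px, diagonal_inches):
    return math.hypot(width_px, height_px) / diagonal_inches

print(round(dpi(1440, 900, 15.4), 1))   # the 1440x900 panel: about 110 DPI
print(round(dpi(2880, 1800, 15.4), 1))  # doubled in each direction: about 220 DPI
```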
I think this is why Apple have gone to the other extreme and embraced Twitter, building its APIs directly into the current iOS and the next OS X and putting the effort in to make them sit naturally amongst Apple's own.
Ping is not only a laughable failure, but I don't even understand the logic behind it in the first place. They thought it'd be a good idea to shoehorn yet more functionality into iTunes so that we could sort of tweet, but not to very many people and only about a small subset of things?
Analogue signals are present on the dock connector but so is a full USB connection and all iOS hardware supports USB audio hardware (or, at least, did when the iPad 1 came out; I haven't necessarily kept up). I think there's also a way to get audio out without presenting yourself as USB audio hardware but it may rely on licensing IP from Apple, to supply the correct unlock code. Failing all of those options, you could just strip digital audio from the HDMI output.
As a rule, cheap docks just use the analogue audio out, expensive ones substitute their own DAC. As you can imagine, the tiny thing built into the iPhone isn't of audiophile quality.
While I agree that open source is only free if time is free, the story is about open source being _cheaper_, not being free. I can easily see how it would be cheaper.
Firstly, there's an open market for supporting software like Linux because it doesn't come from a single source. You can shop around to find the best support deal for your organisation.
Secondly, most system support in most organisations is provided internally by a department hired and trained for whatever software stack happens to be in use. It's relatively rare that you kick a problem back to the supplier, and very unlikely if you restrict what you use to the big-name projects that have seen wide deployment: the Apaches and OpenOffices of the world. So in-house support costs probably remain the same.
That all being considered, you hopefully end up spending slightly less on support in total and nothing whatsoever on licences.
On the assumption that someone reasonably intelligent has set up the computers and locked them down the way most corporate machines are locked down (e.g. for a desk staffer it'd boot to a GUI desktop with a browser, a Word-like word processor, etc, with all customisation and package management disabled), I also don't imagine you're looking at any real extra training costs; in any case, training costs are a one-off. You'd budget for maybe a month of slowed productivity as a switchover cost, which would probably pay for itself within a year.
It's technically not tethered to iTunes any more, thank goodness, though the disconnection isn't immediately as helpful as it could be.
Apps, music, movies, etc can all be bought and consumed directly on the device. The free part of Apple's iCloud service keeps your device, apps and app settings backed up and synchronises them across multiple devices if you want.
Where they haven't eliminated iTunes is in importing music (and, to a lesser extent, movies) from anywhere other than the iTunes Store. If you want to buy MP3s elsewhere you'll need to put them into iTunes at some point. If you opt to subscribe to iTunes Match (US$25/year, I think) then they can sync to your device over the Internet so there's no physical tether — and I guess you needn't technically keep the original file if you don't want to — but that's a relatively minor sop.
I say movies are a problem to a lesser extent because I don't think anybody sells them DRM free, so they're less of an issue in practice; in any case you can import them straight from SD card or USB stick with the camera connector if you want. You have to copy them from the card/stick to the device before you can play them, so this explicitly isn't a satisfying way around Apple's desire that all storage be internal.
Photos similarly can go in via USB or SD card, and most people with photos they want to import probably start from having an SD card so I'm willing to give Apple full credit there.
Me? I've paid for iTunes Match because I have a Mac, and therefore a copy of iTunes to hand that doesn't gunk my whole system up, and like you I don't really want my tablet to be dependent on a tether.
The core is written in C++. That's well known because it uses the open source Box2D framework, slightly controversially without giving any credit (short version: the licence doesn't require it; people think they should anyway as a courtesy).
Windows Phone 7 uses the Microsoft-invented C# and shuns any language not invented by Microsoft. For security reasons, we're told. All the other major and minor platforms can be targeted with C++ (including iOS, Android, Windows, Mac, Bada...) so there's a significant extra cost in supporting them for a multi-platform title.
Quite probably the original port was subsidised and Rovio thought it worth the punt. As El Reg imply, they're probably otherwise aware that the time they have to milk the franchise is likely to be short and that decisions need to focus on the facts right now.
I guess that the existing 9cm screen (yes, it's all metric really) at an aspect ratio of 3:2 gives a width of close enough to 5cm and a height of about 7.5cm.
If you were to keep that width but extend the diagonal to 4.6", which I'm going to take as 11.7cm then you'd get a height of about 10.6cm and an aspect ratio close to 2:1. So the screen would fit on the front of the current sized iPhone (quoted by Apple as 11.52cm) with almost a centimetre to spare for a home button, speaker grille and so on.
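The geometry above, as a Python sketch for anyone checking my working (the 11.7cm diagonal is just my rounding of 4.6 inches):

```python
import math

# Current panel: 3:2 aspect ratio on a 9cm diagonal.
unit = 9.0 / math.hypot(3, 2)      # cm per aspect-ratio unit
width = 2 * unit                   # ≈ 5.0 cm
height = 3 * unit                  # ≈ 7.5 cm

# Stretched panel: keep the width, extend the diagonal to ≈11.7 cm (4.6").
new_height = math.sqrt(11.7**2 - width**2)   # ≈ 10.6 cm

print(round(width, 2), round(height, 2))
print(round(new_height, 2), round(new_height / width, 2))  # aspect close to 2:1
```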
Furthermore, all existing apps could be displayed identically, in letterbox.
That said, like you I remain sceptical just because of Apple's regard for the ecosystem. Fine, the autoresizing masks on UIViews mean that a large number of apps could be made to work just by ensuring the correct boxes were ticked but it definitely wouldn't be that easy for everyone.
Although I'm still awaiting publication of the follow-up to Commodore: A Company on the Edge, my feeling was that the Amiga was killed because Commodore decided to market it entirely in the computers-that-connect-to-the-television category so it became known as a high-end games machine and then the consoles became an easier way to play high-end games.
It's not that a properly supported Amiga would inevitably lose to Windows on the desktop so much as that Commodore never let it compete.
Quite correct — and the Mac version is £10.49, which makes the iPad version look even better.
If you're just collecting photos, cropping and possibly adjusting colour balance, don't you probably have Picasa or iPhoto, or you can probably do it directly on Flickr? The number of people that want to do only basic editing and also want manually to manage storage has to be vanishingly small.
I may give Photoshop a spin out of curiosity but Pixelmator matches my slender needs — though pretty much anything with layers and a clone brush would. Including IrfanView.
I think I was overly negative before; to pick a favourite set I'd go Baker + Sarah Jane + Harry. I seriously considered Troughton + Jamie but couldn't pick a favourite third (or third and fourth).
That being said, another thing I can't be that negative about is the current cast, as I think they're all excellent. I'm not willing to say the same about the stories. I'm finding one or two episodes a year to be really good television and most of the rest to be, well, like they want to be Lost but without any discipline. If you're going to invest in story arcs then there are only so many times you can cheat the audience by answering a question with a question, or by inserting some get-out-of-jail-free nuance into established events, before it just becomes impossible to suspend disbelief.
Are you kidding? Even if you restrict yourself to the Pertwees then I'd still pick either of the others above Jo Grant. And I wouldn't even go Pertwee if given free choice — his stuff always appears to be self consciously trite.
Sadly for us, it doesn't appear that having strong opinions in this area is helpful in securing the job.
Given that they've sold three million of them in the first three days, and judging by the full-year sales figures for the previous two devices, one assumes iPads are being bought by more than just Apple's hardcore customer niche. Compare to the AppleTV to see how well an Apple product does when people on the whole just aren't all that interested.
I make it almost 47 celsius (don't you subtract 32 and multiply by five ninths?), so it seems to be the difference between the iPad 2 topping out at almost exactly healthy body temperature and the iPad 3 pushing on through to about 10% above healthy body temperature. So I can understand why people are noticing even though I agree with your executive summary.
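For the record, here's the conversion I used; the 116°F input is my assumption about the figure being reported, not something quoted above:

```python
# Fahrenheit to Celsius: subtract 32, multiply by five ninths.
def f_to_c(fahrenheit):
    return (fahrenheit - 32) * 5 / 9

print(round(f_to_c(116), 1))   # ≈ 46.7, i.e. almost 47 Celsius
print(round(f_to_c(98.6), 1))  # healthy body temperature: 37.0
```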
It wouldn't have been too much of a problem; these little vanity channels don't sound like they'll have much of a lifespan to me. In a small country that already has regional news broadcasts, who exactly are they for?
I was working on the assumption that developers are showing an increasing interest in Windows Phone because Microsoft's development tools and languages are very nice. So they're interested in developing for Windows Phone in the sense that they're likely to play around with it because they expect it to be a pleasant experience and because learning additional platforms and languages never hurts.
They're probably also optimistic that a sufficiently large market will appear for it to be worth releasing products. Something minor like a 10% share is probably all it would take for the sort of apps that aren't in themselves directly profitable (Facebook, Dropbox, other service front ends, anything promotional or sponsored) to be worth porting, and that wouldn't exactly disturb the Android freight train.
So, no, I don't think developer interest need always be a trailing indicator of market share. Indeed it's almost the only way I can make sense of the survey.