Re: One of the best places to work?
Most of your down votes will be the trolling attempt at tying industry-wide problems uniquely to Apple, I think.
I would have been more scathing than merely not describing them as innovative. Ditto for the whole game. It was well crafted and fun, but far from innovative in any sense.
You'll all be thinking of GEOS rather than GEM, surely? GEM was the one that came with the Atari ST and was also available for all of the other 16-or-better bit machines from Digital Research, GEOS was — I think — the Commodore 64 one with the surprisingly complete set of applications. I've seen other 8 bit GUIs but never anything that went significantly beyond a basic proof of concept, with a calculator, text editor and nothing much else of production use.
It's a Silicon Valley bias thing; Tramiel dared to be over on the east coast, Sinclair, Sugar and Curry weren't even in the same country.
Tramiel was as important to the business as any of them. His cutthroat approach to price cutting can just as easily be cast as striving to include more features at the same price, hence pushing the industry forward.
Is that first manufacturer not Amazon? Albeit that they're frustratingly restricting their product to the US only for the time being. And that £200, exactly-iPad-sized Android tablet you can get through Argos that was reviewed here recently looked pretty good for us Brit types.
I have both devices in front of me right now. Prerendered content looks identical, as do apps that have yet to be adapted for the retina display.
If there's a complaint at all, it's that you quickly mentally adjust to the decent text rendering offered by [almost] all apps that render their text live, making the prerendered stuff look worse by comparison. It then looks equally bad on both devices.
Games that haven't been updated, like Angry Birds, look the same on both devices but even then my brain doesn't really notice anything particularly odd. It's really just the text where most people will be conscious of the difference.
Not on purpose; I just wrote an Elite clone and needed a scripting language, so I threw in a z80 emulator I'd already written, being one of the 300,000 people to have written a Spectrum emulator at some point. I'm aware this was an absurd way to write such a thing, but it was just a personal hobby for fun.
Anyway, the way I had things set up left every individual world entity with its own little 64KB address space and a personal z80. I then had some fun scripting them myself, then got bored and put it all away, being aware that games in which you program things are ten a penny, Elite clones aren't exactly rare, and there was no reason anyone should care about yet another.
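The setup described above can be sketched as follows. This is my own toy reconstruction, not the commenter's actual code: each entity owns a private 64KB memory and its own CPU, but the `ToyZ80` core here understands only four real z80 opcodes rather than the full instruction set.

```python
class ToyZ80:
    """Executes a tiny subset of real z80 opcodes against a 64KB memory."""
    def __init__(self, memory):
        self.memory = memory
        self.pc = 0
        self.a = 0           # accumulator
        self.halted = False

    def step(self):
        op = self.memory[self.pc]
        self.pc = (self.pc + 1) & 0xFFFF
        if op == 0x00:       # NOP
            pass
        elif op == 0x3E:     # LD A, n
            self.a = self.memory[self.pc]
            self.pc = (self.pc + 1) & 0xFFFF
        elif op == 0x3C:     # INC A
            self.a = (self.a + 1) & 0xFF
        elif op == 0x76:     # HALT
            self.halted = True

class Entity:
    """One world object: a private 64KB address space plus a personal CPU."""
    def __init__(self, script):
        self.memory = bytearray(0x10000)   # 64KB, all to itself
        self.memory[:len(script)] = script
        self.cpu = ToyZ80(self.memory)

    def tick(self, cycles=100):
        """Run the entity's script for up to `cycles` instructions per frame."""
        for _ in range(cycles):
            if self.cpu.halted:
                break
            self.cpu.step()

ship = Entity(bytes([0x3E, 0x41, 0x3C, 0x76]))  # LD A,0x41; INC A; HALT
ship.tick()
print(ship.cpu.a)  # → 66
```

The appeal of the scheme is isolation: no script can touch another entity's state except through whatever memory-mapped I/O the host game chooses to expose.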
I'm sure Notch's effort will be top drawer though, and should be fun because it'll attract a whole bunch of other talented people.
As part of their desperate attempts to become relevant again c.1999, Apple built Java directly into OS X and made it an on-the-box feature. The OS hence not only could run standard Java apps exactly as if they were native but included a rich set of bindings so that you could write fully native apps directly with the native frameworks but in Java. Per its designers, Java descends more from Objective-C than from C++ so I guess Apple were positioning themselves to be able to go fully Java if the market embraced it, hence they needed direct control over the thing.
In the end the market chose Objective-C (though revisionists don't seem to remember it this way), Apple worked on advancing that and deprecated the native Java bindings after only a few versions, and dumped the default inclusion of the Java runtime entirely as of the current version. Cyberduck is the only big OS X app I'm aware of with a Java core, NeoOffice/J having once also been quite popular but probably not so much since Open/LibreOffice went native.
It was quite stupid that Apple were still maintaining Java separately and more slowly, and this is exactly the sort of flaw that doing so has exposed. So it's good that they don't do that any more, though it's far from being Apple's only security problem.
There's actually an allusion to what is effectively ClearType in the Atari Lynx system manual, presumably because some marketing person wanted an excuse to claim three times the horizontal resolution. So it's an idea that was definitely out there in the ether long before Microsoft actually did something useful with it and during the RISC OS period.
You're right though — I don't think RISC OS actually used anything like that technology. I bet almost no-one ever even connected an Archimedes to a colour LCD screen during its production lifetime.
I think you may be behind the times — Android has been the most popular platform for a couple of years; that it now also has a simple majority of the market doesn't really change that, especially since the losses of RIM and Nokia are being sucked up by both the Android manufacturers and Apple. So far Apple hasn't lost any market share, it's just that Android phones have acquired it much more quickly.
I guess it's a comedown for the Apple-or-nothing set from the iPod experience, but they can just switch their attention to tablets for a couple of years.
They're turning their phones into Holgas without paying about £25 for a camera that was originally specifically designed to be profitable at something like 50p. And then they're saving money on not having to find somewhere to develop all that lovely medium format film.
Why do they want a Holga in the first place? Just for fun, I imagine.
A friend of mine once had a phone that advertised on the box the fact that the screen could be used as a compact mirror while switched off. I almost bought one myself, just to reward that level of gall!
I think you're conflating vibrancy and accuracy. I will say for the benefit of fairness that Apple's screens are much better than the industry average for colour reproduction, but the shiny coating seems to be a consumer-oriented attempt to give the colours extra pop rather than an attempt better to please people who concern themselves with colour spaces.
I know professional photographers who work directly on Apple screens for their entire production line but when I was in publishing it was more common for companies with a strong interest in colour correctness to buy monitors worth at least two or three times the cost of a Mac, to connect to their Macs.
This is one of those areas where I keep hoping consumer-priced machines will make progress but it seems consumers don't care about gamuts so there's no real reason for manufacturers to expend the effort.
I should expect so too, since it's just incompetence on Facebook's side. On iOS there's the keychain exactly to allow developers securely to store information without having to know anything about the topic for themselves, and I'd be extraordinarily surprised if there's no similar API in Android.
Facebook's developers have simply been lazy.
(1) you don't need to click to bring up the menu bar, just mouse up to where the menu bar normally is;
(2) the dock does appear available on auto-hide, just mouse down to where the dock normally is.
• exactly as on every other desktop in the world, not every app can go full screen. I wouldn't agree that having a flag to indicate whether an app can go fullscreen, and giving it a default value of 'off' given that fullscreen wasn't previously available, is "the stupidest implementation"; I'd rather say it was exactly the correct implementation.
• part of your argument appears to be that the implementation is broken because it took seven iterations to appear. I'm not sure that stands up to logical inspection, though if it helps then it actually took almost thirty years to appear since the classic OS didn't have a full-screen option either. Which presumably means that the implementation that did appear is even worse?
Naturally I appreciate you'll get upvotes and I'll get downvotes because the audience here is anti-establishment and I'm defending a hugely profitable and hugely arrogant company that is often harmful to the industry.
What can you possibly have against the Mac's implementation of full-screen apps? You press the relevant button, the app goes full screen. Individual apps get individual virtual screens so you can three-finger swipe between them (or use control + cursors if you're a keyboard person). Care to enlighten us on the flaws in that?
As for MSVC 2011, I don't really see what the uproar is about. I've had no problems finding any of the supplied tools (easily, without extended hunting) and if anything the fact that colour is now reserved for content I'm actually working on has made the overall display much clearer and easier to work with.
I suggest you reread my post and save your spleen for the many instances where people actually complain about price differences.
In the US the price is $139 (the $99 is the ad-supported version). $139 is £87.13. £87.13 + VAT is £104.56. So Amazon are charging pretty much exactly the same price in the UK as in the US.
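The arithmetic checks out — a quick sanity check, taking the exchange rate implied by the quoted figures as given rather than looking one up:

```python
# Sanity-checking the price comparison above. The exchange rate is the one
# implied by the post's own conversion ($139 → £87.13), not an official rate.
usd_price = 139.00        # US price, quoted excluding sales tax
gbp_ex_vat = 87.13        # the post's conversion of $139
uk_vat_rate = 0.20        # UK standard VAT rate at the time

gbp_inc_vat = round(gbp_ex_vat * (1 + uk_vat_rate), 2)
print(gbp_inc_vat)  # → 104.56
```

The key point is that US sticker prices exclude sales tax while UK prices include VAT, so the ex-VAT figure is the right one to compare.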
Besides the obvious points about taking the iPhone rumours with a pinch of salt and the fact that smaller internal components means bigger batteries, larger screens only mean a larger top surface area. The bragging game of who can shave a further 0.2mm off the depth seems to continue apace. So there's still quite a lot of effort to get them smaller by volume.
Per standard money gouging practice, the iPhone and iPad versions are different builds and therefore probably have different Game Centre entries. Assuming that's correct you're probably looking at between 2 and 3 million on the iOS side, possibly even more.
It's also got a front-page banner and is number two in the charts on the Mac App Store, which probably makes for a sizeable part of another million or so.
I'm going to guess that Android contributes about the same amount as the iOS ports or slightly more because there are a lot more Android phones than iPhones and the app is free, but on the other hand you've got the iPad and the iPod Touch working against the prima facie numerical advantage of the platform.
That leaves a few million for the PC, which I find difficult to believe because of the limited distribution channels (they seem not to be in Steam, for example) but not hugely unlikely given that everything above is just a massive guess.
You've got to love straw man logic;
"OMG they pointed out that the plucked-out-of-the-air £400 was a misrepresentation; they must be saying that money doesn't matter!"
Takeaway conclusions: the Viewsonic looks lovely, the iPad isn't as much more expensive as some people seem to think even though it is much more expensive in relative terms, transparent attack hounds make Internet comment boards boring.
If I dare stick my head above the parapet, I reckon more El Reg readers are likely to be interested in mobile games than in console or PC games. My reasoning being that the percentage of readers who don't own a smart phone of some variety is going to be significantly smaller than the percentage using Linux or a Mac, and mobile games are almost always multi-format whereas PC games are almost always exclusively for Windows (albeit sometimes with a Mac port a little later).
Conversely, if El Reg are listing the ten games worth playing but most easily overlooked then I'd actually have expected the minority platforms to be better represented. You're more likely to miss a game on a lesser used platform.
It's a bit of a fiction; the point is that the nominal 144 DPI graphic is double the nominal 72 DPI that the existing graphics declare. The Mac I'm typing this on now has a widescreen 1440x900 15.4" screen, for a lower-than-2002 110 DPI, but still significantly more than 72. I also make it 220 DPI on the 15.4" in the article (approximately 3396 pixels along the diagonal of 15.4") but whatever.
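For anyone who wants to check those pixel-density figures, DPI is just the number of pixels along the diagonal divided by the diagonal length in inches:

```python
import math

def dpi(width_px, height_px, diagonal_inches):
    """Pixels along the diagonal, divided by the diagonal length."""
    return math.hypot(width_px, height_px) / diagonal_inches

# The commenter's 1440x900 15.4" MacBook panel
print(round(dpi(1440, 900, 15.4)))   # → 110

# The comment's "approximately 3396 pixels along the diagonal of 15.4 inches"
print(round(3396 / 15.4, 1))         # → 220.5, i.e. the ~220 DPI figure
```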
I think this is why Apple have gone to the other extreme and embraced Twitter, building its APIs directly into the current iOS and the next OS X and putting the effort in to make them sit naturally amongst Apple's own.
Ping is not only a laughable failure, but I don't even understand the logic behind it in the first place. They thought it'd be a good idea to shoehorn yet more functionality into iTunes so that we could sort of tweet, but not to very many people and only about a small subset of things?
Analogue signals are present on the dock connector but so is a full USB connection and all iOS hardware supports USB audio hardware (or, at least, did when the iPad 1 came out; I haven't necessarily kept up). I think there's also a way to get audio out without presenting yourself as USB audio hardware but it may rely on licensing IP from Apple, to supply the correct unlock code. Failing all of those options, you could just strip digital audio from the HDMI output.
As a rule, cheap docks just use the analogue audio out, expensive ones substitute their own DAC. As you can imagine, the tiny thing built into the iPhone isn't of audiophile quality.
While I agree that open source is only free if time is free, the story is about open source being _cheaper_, not being free. I can easily see how it would be cheaper.
Firstly, there's an open market for supporting software like Linux because it doesn't come from a single source. You can shop around to find the best support deal for your organisation.
Secondly, most system support in most organisations is provided internally by a department hired and trained for whatever software stack happens to be in use. It's relatively rare that you kick a problem back to the supplier, and especially unlikely if they restrict what they use to the big-name projects that have seen wide deployment — the Apaches, OpenOffices, etc of the world. So in-house support costs probably remain the same.
That all being considered, you hopefully end up spending slightly less on support in total and nothing whatsoever on licences.
On the assumption that someone reasonably intelligent has set up the computers and locked them down in the same way that most corporate machines are locked down (so, e.g. for a desk staffer it'd boot up to a GUI desktop with a browser, a Word-like word processor, etc, and all customisation and package management would be disabled) I also don't imagine you're looking at any real extra training costs and in any case training costs are a one off. You'd budget for maybe a month of slowed productivity as a switchover cost, which probably would pay for itself within a year.
It's technically not tethered to iTunes any more, thank goodness, though the disconnection isn't immediately as helpful as it could be.
Apps, music, movies, etc can all be bought and consumed directly on the device. The free part of Apple's iCloud service keeps your device, apps and app settings backed up and synchronises them across multiple devices if you want.
Where they haven't eliminated iTunes is in importing music (and, to a lesser extent, movies) from anywhere other than the iTunes Store. If you want to buy MP3s elsewhere you'll need to put them into iTunes at some point. If you opt to subscribe to iTunes Match (US$25/year, I think) then they can sync to your device over the Internet so there's no physical tether — and I guess you needn't technically keep the original file if you don't want to — but that's a relatively minor sop.
I say movies are a problem to a lesser extent because I don't think anybody sells them DRM free so they're less of an issue in practice, and in any case you can import them straight from SD card or USB stick with the camera connector if you want. You have to copy them from the card/stick to the device before you can play them so this explicitly isn't a completely satisfying way to circumvent Apple's desire that all storage be internal.
Photos similarly can go in via USB or SD card, and most people with photos they want to import probably start from having an SD card so I'm willing to give Apple full credit there.
Me? I've paid for iTunes Match because I have a Mac, and therefore a copy of iTunes to hand that doesn't gunk up my whole system, and like you I don't really want my tablet to be dependent on a tether.
The core is written in C++. That's well known because it uses the open source Box2D framework, slightly controversially without giving any credit (short version: the licence doesn't require it; people think they should anyway as a courtesy).
Windows Phone 7 uses the Microsoft-invented C# and shuns any language not invented by Microsoft. For security reasons, we're told. All the other major and minor platforms can be targeted with C++ (including iOS, Android, Windows, Mac, Bada...) so there's a significant extra cost in supporting them for a multi-platform title.
Quite probably the original port was subsidised and Rovio thought it worth the punt. As El Reg imply, they're probably otherwise aware that the time they have to milk the franchise is likely to be short and that decisions need to focus on the facts right now.
I guess that the existing 9cm screen (yes, it's all metric really) and aspect ratio of 3:2 give a width of close enough to 5cm and a height of about 7.5cm.
If you were to keep that width but extend the diagonal to 4.6", which I'm going to take as 11.7cm then you'd get a height of about 10.6cm and an aspect ratio close to 2:1. So the screen would fit on the front of the current sized iPhone (quoted by Apple as 11.52cm) with almost a centimetre to spare for a home button, speaker grille and so on.
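The geometry above can be verified in a few lines (the 11.7cm figure for 4.6 inches is the comment's own rounding, which I've kept):

```python
import math

def width_height(diagonal_cm, aspect_w, aspect_h):
    """Split a diagonal into width and height for a given aspect ratio."""
    unit = diagonal_cm / math.hypot(aspect_w, aspect_h)
    return aspect_w * unit, aspect_h * unit

# Existing screen: 9cm diagonal, 3:2 in portrait (2 wide, 3 tall)
w, h = width_height(9.0, 2, 3)

# Keep the width, stretch the diagonal to 4.6" ≈ 11.7cm (the comment's figure)
new_h = math.sqrt(11.7**2 - w**2)

print(round(w, 1), round(h, 1))          # ≈ 5.0 x 7.5 cm, as stated
print(round(new_h, 1), round(new_h / w, 2))  # ≈ 10.6 cm tall, ratio ≈ 2.12:1
```

With the phone's quoted 11.52cm height, a 10.6cm screen does leave just under a centimetre of front face to spare, as the comment says.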
Furthermore, all existing apps could be displayed identically, in letterbox.
That said, like you I remain sceptical just because of Apple's regard for the ecosystem. Fine, the autoresizing masks on UIViews mean that a large number of apps could be made to work just by ensuring the correct boxes were ticked but it definitely wouldn't be that easy for everyone.
Although I'm still awaiting publication of the follow-up to Commodore: A Company on the Edge, my feeling was that the Amiga was killed because Commodore decided to market it entirely in the computers-that-connect-to-the-television category so it became known as a high-end games machine and then the consoles became an easier way to play high-end games.
It's not that a properly supported Amiga would inevitably lose to Windows on the desktop so much as that Commodore never let it compete.
Quite correct — and the Mac version is £10.49, which makes the iPad version look even better.
If you're just collecting photos, cropping and possibly adjusting colour balance, don't you probably have Picasa or iPhoto, or you can probably do it directly on Flickr? The number of people that want to do only basic editing and also want manually to manage storage has to be vanishingly small.
I may give Photoshop a spin out of curiosity but Pixelmator matches my slender needs — though pretty much anything with layers and a clone brush would. Including IrfanView.
I think I was overly negative before; to pick a favourite set I'd go Baker + Sarah Jane + Harry. I seriously considered Troughton + Jamie but couldn't pick a favourite third (or third and fourth).
That being said, another thing I can't be that negative about is the current cast, as I think they're all excellent. I'm not willing to say the same about the stories. I'm finding one or two episodes a year to be really good television and most of the rest to be, well, like they want to be Lost but without any discipline. If you're going to invest in story arcs then there's only so many times you can cheat the audience by answering a question with a question or by inserting some get out of jail free nuance into established events before it just becomes impossible to suspend disbelief.
Are you kidding? Even if you restrict yourself to the Pertwees then I'd still pick either of the others above Jo Grant. And I wouldn't even go Pertwee if given free choice — his stuff always appears to be self consciously trite.
Sadly for us, it doesn't appear that having strong opinions in this area is helpful in securing the job.
Given that they've sold three million of them in the first three days, and judging by the full-year sales figures for the previous two devices, one assumes iPads are being bought by more than just Apple's hardcore customer niche. Compare to the AppleTV to see how well an Apple product does when people on the whole just aren't all that interested.
I make it almost 47 celsius (don't you subtract 32 and multiply by five ninths?), so it seems to be the difference between the iPad 2 topping out at almost exactly healthy body temperature and the iPad 3 pushing on through to about 10% above healthy body temperature. So I can understand why people are noticing even though I agree with your executive summary.
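The conversion is exactly as described — subtract 32, multiply by five ninths. The 116°F input is my assumption (it was the widely reported peak reading for the new iPad at the time); the comment doesn't quote its source figure:

```python
def f_to_c(f):
    # Exactly the rule from the comment: subtract 32, multiply by five ninths
    return (f - 32) * 5 / 9

# 116F is an assumed input — the widely reported peak figure, not one the
# comment itself quotes.
print(round(f_to_c(116), 1))    # → 46.7, i.e. "almost 47 celsius"
print(round(f_to_c(98.6), 1))   # → 37.0, healthy body temperature
```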
It wouldn't have been too much of a problem; these little vanity channels don't sound like they'll have much of a lifespan to me. In a small country that already has regional news broadcasts, who exactly are they for?
I was working on the assumption that developers are showing an increasing interest in Windows Phone because Microsoft's development tools and languages are very nice. So they're interested in developing for Windows Phone in the sense that they're likely to play around with it because they expect it to be a pleasant experience and because learning additional platforms and languages never hurts.
They're probably also optimistic that a sufficiently large market will appear for it to be worth releasing products. We're probably only talking about something minor like a 10% share for the sort of apps that are not in themselves directly profitable (Facebook, DropBox, other service front ends, anything promotional or sponsored) to be worthwhile to port, and that wouldn't exactly disturb the Android freight train.
So, no, I don't think developer interest need always be a trailing indicator of market share. Indeed it's almost the only way I can make sense of the survey.
I received a brand-new, fully unlocked N8 from Nokia because I'd turned up to a developer day. So although I haven't upgraded it to Belle, I think I can claim unadulterated experience of Symbian's last commercial stand.
Problem one: consistency. It felt like three or four separate widget sets glued randomly together. I'm talking about things like the bundled applications exhibiting at least three different types of scroll view. In the settings there were still a few that required me to poke at and drag a nub on a scrollbar. In most of the apps they'd settled on a direct manipulation metaphor but with no inertia. In a few they'd decided to go with direct manipulation and inertia. So the experience is muddled and confused, and basically relies on me learning to adjust my expectations on how to scroll content for each individual app by rote.
Problem two: tacked on touch screen hacks. To enter text into a text box in Symbian as shipped on the N8 you tap the text box. You're then taken to a completely separate, mostly blank screen — often with almost no context — to enter your text. When you press okay you're taken back to the screen with the text box and the text box is filled with whatever you just typed. I don't care how hard it was to hack an on-screen keyboard into the OS, that's just unacceptable. And, again, I'm talking primarily about the bundled apps, supplied right on the handset, not third-party offerings that someone somewhere couldn't be bothered to adapt.
Problem three: incredibly poor use of screen real estate. I seemed forever to be having to navigate little pop-up menus down three or four levels just to access basic app functionality. Again I suspect a junior somewhere was told to 'make the menus work on touch screens' and given about three days in which to do it.
Problem four: poor development environment that fosters all of the above problems. On my developer day they'd invited an ex-employee who was then gainfully employed outside the company in Symbian software creation to evangelise about the state of the platform. He primarily boasted about how incredible it was that he'd managed closely to duplicate an iOS app he'd built while only spending about three weeks on replicating things like animated transitions between views that iOS gives you for free. They made lots of promises about Qt Quick, which at the time still hadn't even shipped, showing how neat it was that if we (i) put a graphic down on the canvas; and (ii) put a touch area in front of it then we've managed to reproduce something a bit like a button. It doesn't give any feedback cues on user interactions or anything, but obviously as a developer you should be implementing that stuff yourself for every single app, right?
As a mature platform with a history of techy users I'm sure Symbian is just feature packed. But as a user I don't care when even the basic features are so obfuscated that I have to spend days learning the phone before I can use it.
Clearly you people have never heard of motion pictures. Clearly the moon landing was fake.
However, it's exactly the same price difference as officially charged by Samsung and RIM, and was formerly charged by HP. So it's a bit of an industry-wide rip off.
It's not fair to compare it to a Kindle; on electronic ink devices the pixels exactly meet up so that a grey line is a continuous black line. On LCD devices the red, green and blue elements have gaps between them so that a grey line is a series of discontinuous half-lit red, green and blue pixels. As a result the electronic ink looks infinitely better at a much lower resolution.
It's the iPad 1, but check out http://www.wired.com/gadgetlab/2010/08/pictures-kindle-and-ipad-screens-under-microscope/ to see the difference under a zoom lens and a microscope.
Having taken ownership of one thanks to my job, I can confirm that the new iPad gets noticeably warm whereas the old did not. On the plus side, my cat loves it.
I think quite a lot of the discussion is manufacturers trying to find a way to distinguish their products and the usual partisans trying to turn it into a wedge issue. Which explains the pattern of up and down votes here, I think — even discussing it like normal people puts you a bit too close to areas heavily ploughed by the trolls.
Me? The 4:3 is closer to A4 (and US letter) so feels better for reading, especially with a pixel density that looks almost as good as printed material. Widescreen is trivially better for most modern video content because most modern video content is widescreen. So I agree with you that it's a preference.
The Maltese Falcon? The Wizard of Oz? Frankenstein? The Fly? His Girl Friday? The Man Who Knew Too Much?
Admittedly I can't think of one from the last twenty five years.
So the conditions described aren't prima facie wrong, it's merely that there's a certain amount of profit above which child labour, employee poisoning, etc becomes unacceptable?
I prefer to think all the named manufacturers are in the wrong. If placing disproportionate blame on Apple makes everyone have to act a little more properly then I accept that it isn't fair but I'm all for it.
It may also be a more muted launch but we won't know until the numbers are in. I'm of the opinion that the new iPad is a bigger step onward from the iPad 2 than the iPad 2 was from the original iPad but either of the early models easily passes the threshold for comfortable usability.
You probably have a point though — the company has several of the sort of people who probably queued last year, but all they've done is preorder, with someone planning to head down (to the SF shop, no less) at lunchtime to pick the things up.
And the weather here is indeed very glum. It's been raining since Monday so I'm starting to wonder why I didn't just stay in London.
In this case the term fanbois would offend me since only exactly one person turned up four days early. What are you going to do next? Split the infinitive?
They said the pixels have to be small enough that a person with normal eyesight can't tell them apart at the distance they normally hold the device. Then they said that's 300ppi on a phone. At the iPad launch they sort of argued that, you know, people tend to hold tablets further away, so the density can be less while still fitting their definition of retina, etc.
I'm not sure I buy that since I seem to hold both phones and tablets at approximately arm's length and in any case if I play, say, a polygon-based game without antialiasing at the full screen resolution on my 4s I can make out the aliased edges. And I've had routine eye tests so I know that I'm average and not some sort of super human.
In theory if the pixels actually were too small to see it wouldn't matter exactly how source content maps to them — aliasing errors would be present but invisibly small, just as how magazines have been able to print TV snaps for years without someone doing arithmetic on exactly what size the picture needs to be to come out well when the printers are done. In this case I would expect that errors will be visible to the keen sighted but basically not a major problem.
Don't buy what everyone else tells you to buy! Buy what everyone else [on El Reg] tells you to buy!