Sadly it's not especially overpriced compared to the Thunderbolt competition. What is it about PCIe being on a cable rather than a slot that seems to add hundreds of pounds to the cost?
Apple was first to... release a device that was truly attractive to the majority of consumers. It achieved this through a successful marriage of blossoming technologies and by using its iPod clout to strong-arm the mobile networks.
If you had asked interested consumers what impressed them in the first iPhone they'd probably mention the multitouch interface (thanks to Apple's acquisition of FingerWorks), the smooth user interface (achievable because GPUs had crossed the necessary power/price threshold) and the unlimited data plans (that's the network clout bit). The screen was also large and high resolution for its time.
But the real issue here isn't whether Samsung dared to plan its products by surveying competing products (hint: it did); it's to what extent it should be allowed to. Obviously if someone had said "iPhone purchasers have reported liking this browser icon, let's use this browser icon" then there'd be no real debate.
If you've really nothing else to do with your day you can check my post history and see that I'm generally positive about Apple but, honestly, I don't think a functioning marketplace is sustainable if we're at the level where minor interface elements like slide to unlock are protectable (regardless of whether Apple should own that one or not). I also don't really understand Apple's strategy here. What benefit has been derived from all this wrangling? It feels like if Apple had just left well alone then it would be in a better market position now, allowing for the negative PR consequences compounded by Samsung's resulting advertising strategy. All I can imagine is that it's so much bluster to get a better price on components.
The 5s is 58.6 mm wide; if every single millimetre of that were screen then at 16:9 that would be a diagonal of just barely more than 119.5 mm — 4.7". So I think you're probably on to a winner there.
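For anyone who wants to poke at the sum, here it is as a quick Python sketch (the only inputs are the 58.6 mm width and the 16:9 ratio; in portrait, the screen's width is the '9' side):

    import math

    width_mm = 58.6                                 # full body width of the 5s
    diagonal_mm = width_mm * math.hypot(16, 9) / 9  # 16:9 with width on the '9' side
    print(diagonal_mm)                              # ~119.5 mm
    print(diagonal_mm / 25.4)                       # ~4.7 inches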
So that aligns me pretty well with maybe 5% of the market. If Apple finally does move closer towards what the other 95% have shown themselves to want then I think it'll be good for them.
Spot fact: the current iPhone and the original 2001 iPod are almost exactly the same width — there's about 2mm difference. So presumably that's how wide someone at Apple at some point determined is comfortable for thumbs.
... and that's something we used to have but which was taken away. If I were setting up a new account now I'd seriously consider outlook.com but I'm not unhappy enough to leave. I assume the majority of people here managed to jump in early enough to get their actual name as an address?
@TechnicalBen — from the other main phone companies, the suits were over hard engineering patents.
Nokia's was about wireless technology and ended with an out-of-court settlement that guarantees ongoing licensing fees from Apple. I'm not knowledgeable enough about wireless technologies to say a lot beyond that, but I guess Apple accepted some sort of fault even if it legally didn't.
Motorola's claims against Apple have yet to be concluded as they've been stayed pending an EU investigation into whether Motorola have broken competition law by asserting rights over FRAND patents. Though an investigation is just an investigation and nothing more; maybe Motorola acted inappropriately, maybe not. Regardless the dispute is (i) whether Apple acquired a licence automatically because the Qualcomm chips it buys are licensed and are the component that contains the intellectual property; and (ii) if not, whether Motorola was bound by FRAND agreements and/or whether such an offer was made to Apple appropriately.
So technically that dispute is about whether the licensing steps Apple took were acceptable.
From the non-phone companies, the patents asserted have been less impressive. Creative Labs won shortly before the iPhone over hierarchical menus and continue to receive royalties as far as I'm aware. Kodak sued essentially because the iPhone could show a preview before taking a digital image (in practice the lawsuit was actually about ownership of a related patent and whether Kodak could sell it in bankruptcy, but the underlying issue was still that Kodak wanted money for digital image previews).
S3's claims included what appears to be a patent over taking a digital video signal in and sending an analogue video signal out, which sounds quite obvious to me; they ended up winning and then losing, so presumably it sounded quite obvious to the judges too. Another was essentially about the way the PowerVR does deferred tile-based rendering — rendering pixels from geometry in a certain order to maximise caching. I know S3 couldn't make that claim stick but I don't know whether it's because of a Motorola-type situation where PowerVR already pays the correct licensing or just because it was felt to be an obvious patent.
I don't wonder: you detest Apple because you haven't done a proper statistical examination of the field and you suffer from confirmation bias.
Companies you've heard of that have sued Apple first:
• Nokia (starting the first lawsuit of the smartphone era);
• Creative Labs;
• S3 (this is why HTC bought them);
Companies you've heard of that Apple has sued first:
• Samsung;
• HTC.
Other lawsuits during the period involving companies you've heard of:
• Oracle v Google;
• Microsoft v Motorola;
• Microsoft v Barnes & Noble (over their use of Android);
• Post-Nortel consortium (including Apple, RIM, Microsoft, Sony, Ericsson) v Google.
(apologies for the horrid formatting; using ul/li actually looked worse and El Reg both doesn't allow double line breaks between paragraphs and doesn't do soft returns)
I've not bothered listing everyone who now pays Microsoft because they use Android.
Maybe you're arguing that Apple's patent claims are weaker than the others because they're all soft design patents? To me that would appear to be a fallacious argument because of the number of times Apple has been sued over patents that the companies involved were legally obliged to offer under FRAND terms but refused to do so. I'd say that using one kind of patent as permitted under law is less bad than using another kind of patent in a way you've legally prohibited yourself from doing, as in the first case everyone knows the rules, no matter how much you or I might think they're laughable, and in the second the rules are deliberately being broken.
If some of my inevitable down-voters can engage with how I'd be wrong in an assertion that Apple is being no worse than the industry average then I'd be grateful.
I was going to post a similar thing: people are concluding malice where there is no such evidence — jumping from "Apple makes disproportionately hard-to-repair devices" to "Apple is trying to kill the repair industry so that it can pump up its own profits".
As well as the factors you raise, I think it may also be because the people at Apple genuinely seem to care about shaving millimetres off the products every year. When faced with a conflict they prefer being able to claim thinnest/lightest to being able to claim more repairable.
The under-30 tag does seem to be arbitrary: this isn't a fashion thing, it's an accessibility thing, and the Internet is no more natural to those born in the '90s than to those born in the '80s. Either it showed up before you were an adult or it didn't.
Or you could just print it from your iPad. Options, options, options.
Our two-year-old office HP printer implements whatever system Apple invented for printing (AirPrint, I believe it's called) so I'm sure plenty of others do. In every iOS app I've tried you just tap print and select the printer. Though there are no drivers and no configuration screens, so I'm sure the budget printers don't work, the manufacturers having spotted the lack of an opportunity to load an 800MB binary on every boot that constantly shouts at you with a semi-human voice and pushes advertising for their ink shop into your face*.
(* I recently had to use a standard ~£30 Kodak all-in-one printer with OS X; it was horrid)
The GUI is quite a bit more revised than perhaps is obvious: Core Animation originated on the iPhone and made its way over to the Mac only afterwards (though, publicly, it was on the Mac long before the iPhone was announced). Core Animation primarily does three things: (i) it draws every view to GPU storage, always; (ii) it introduces an extra transform into the composition process, allowing any view to have any linear transform applied to it when drawn to the screen; and (iii) it takes advantage of Objective-C's dynamic runtime — including lookup of setters by name — to provide one piece of common code that can adjust a value from A to B per a function f(t) of time, then targets that interpolation logic all over the place to make the coding difference between an animated transition and a static one just a couple of lines of code.
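To illustrate point (iii), here's a toy sketch in Python rather than anything resembling Apple's actual code, with setattr standing in for Objective-C's setter-lookup-by-name:

    import time

    class View:
        opacity = 0.0

    def animate(target, prop, a, b, duration, f=lambda t: t):
        # Drive any named property from a to b over time, eased by f(t).
        start = time.monotonic()
        while True:
            t = min((time.monotonic() - start) / duration, 1.0)
            setattr(target, prop, a + (b - a) * f(t))   # dynamic setter lookup
            if t >= 1.0:
                break

    view = View()
    animate(view, "opacity", 0.0, 1.0, duration=0.25)
    print(view.opacity)   # 1.0; landed exactly on the end value

One interpolator, targeted at any property by name, is what keeps the difference between an animated transition and a static one down to a couple of lines.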
I'm willing to bet that stuff, the 'how do we run a consistent metaphor at 60Hz on the hardware?', is what took the majority of the time for the team. Obviously iOS became the first GUI OS to work only in the presence of a GPU but which thing prompted the other? It's all obvious in hindsight but I'll bet time was spent on the back and forth over that.
It caused Facebook shares to drop by more than 6%. Is that a positive thing? I guess it depends on how vindictive you're feeling.
What did boffins have to say about it all? Was it all actually the fault of asylum seekers? Does it tell us anything about Princess Diana?
I think the difference is that a lot of the datasets that researchers work with are structured in an inherently parallel fashion — the expensive stuff is computing independent, purely functional results for each of millions or billions of data points. It's the classic CPU versus GPU thing: CPUs are good when you want to do any of a very large number of variable things to a small number of objects; GPUs are good when you want to do a few very exact things to a large number of objects.
So OpenCL or CUDA just naturally get radically better performance than a traditional CPU for a certain segment of users, and those users are the niche at which this card is targeted.
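A minimal sketch of that shape of workload, in Python/numpy rather than OpenCL or CUDA themselves: as soon as each result depends only on its own data point, the whole thing can be farmed out across thousands of GPU lanes.

    import numpy as np

    samples = np.random.rand(10_000_000)   # millions of independent data points

    # One exact, purely functional operation applied to every element at once.
    # This whole-array form is the shape an OpenCL/CUDA kernel wants: no
    # per-object branching, no dependencies between objects.
    results = 3.0 * samples ** 2 + 1.0
    print(results.shape)                   # (10000000,)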
All it does is generate sync signals and addresses, indicate where a (text) cursor should go in the signal, and latch the current logical position if it receives a pulse from a light pen. Actually fetching the video byte and, by whatever means, turning it into colours is left to other circuitry.
It is indeed the same chip used by at least the [8-colour] BBC Micro, [27-colour] Amstrad CPC and [16/256-colour] EGA/VGA cards.
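As a toy model rather than anything datasheet-accurate, the whole job amounts to a nest of counters; a Python sketch:

    # A 6845-style CRTC counts out the raster, emitting a refresh address, the
    # scanline within the current character row and a cursor flag. It never
    # sees the byte at that address, let alone any pixels.
    def crtc_frame(chars_per_row, rows, scanlines_per_row, start_addr, cursor_addr):
        addr = start_addr
        for row in range(rows):
            for scanline in range(scanlines_per_row):
                for col in range(chars_per_row):
                    yield addr + col, scanline, (addr + col) == cursor_addr
                # horizontal sync would be asserted here
            addr += chars_per_row          # advance to the next character row
        # vertical sync would be asserted here

    # e.g. a 40x25 text mode with 8 scanlines per character row:
    for address, scanline, cursor in crtc_frame(40, 25, 8, 0x0000, 0x0042):
        pass   # the video circuitry fetches the byte at `address` and makes colours

Which is how one counter chip can happily front 8, 27 or 256 colours: the colour depth lives entirely in the circuitry consuming those addresses.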
... and that's the absolute most I can possibly contribute to the conversation. I enjoyed the article but, like most Brits, have no idea how large a baseball is.
One assumes last July's huge internal shakeup was related to this: probably too many fiefdoms and no centralised control — fine, the design team was separate and did what it did but then everything filtered down into the traditionally separate teams and they did their traditionally separate things.
It's more absurd than you'd think: admittedly obliquely, answering a popular British press criticism of the revolution, Jefferson wrote:
"And can history produce an instance of a rebellion so honourably conducted? ... God forbid we should ever be 20. years without such a rebellion. ... What country ever existed a century and a half without a rebellion? And what country can preserve it's liberties if their rulers are not warned from time to time that their people preserve the spirit of resistance? Let them take arms."
Which obviously doesn't mean that the founding fathers intended the American constitution (and, indeed, the Constitution) to be a flexible thing but probably does mean that Americans need the largely unrestricted right to bear arms.
It's pretty clear, as stated in the article, that staff at the NSA routinely abuse its powers. Furthermore the whole approach seems to be scattergun with the overwhelming majority of it being aimed at people who have done nothing wrong and will do nothing wrong but, regardless, are given no opportunity to defend themselves or even any notification that they've been surveilled.
So given that this was an attempt at PR, I agree that the NSA is lousy at PR.
You think he was heading to the forums to post a comment blindly supporting [company X] regardless of the story, taking the opportunity to remind people who use products made by [company Y] that they're a sub-monkey laughing stock and objectively wrong?
It was Z, wasn't it? They spent much longer than usual on that only to be pipped to the post by the coincidentally very similar Command & Conquer, then found themselves surrounded by the new world of the Playstation and never quite refound their footing. At least that's how I've heard it told.
255 shots per second? I wouldn't put anything past them but if the Bitmap Brothers really were checking the input state more than five times per frame then hats off to them. I'll bet there are gamers that could tell the difference, too.
The problem with Workbench was that the way its nano kernel was architected — a product of its time — was basically dependent on message passing between processes being fast. When you don't have protected memory, as on the Amiga, it is fast because there's no work to do. If you have protected memory then it's usually slow, as the kernel must either copy the message from one address space to another, reassign ownership of regions of memory, or leave memory not entirely protected, giving everyone access to shared messaging memory and still allowing one misbehaving application to mess up a bunch of others. Therefore Workbench as constituted had a definite expiry date.
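A loose illustration of why, with Python standing in for kernel code so only the shape of the problem survives:

    import copy

    message = {"payload": bytes(1_000_000)}   # a meaty inter-process message

    # Amiga-style, no protected memory: 'sending' is just passing a reference.
    # O(1) regardless of size, which is why the message passing was fast, but
    # the sender can still scribble over the message after sending it.
    def send_unprotected(msg):
        return msg

    # Protected-memory style: the kernel copies the message into the receiver's
    # address space. Safe, but O(n) in message size on every single send.
    def send_protected(msg):
        return copy.deepcopy(msg)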
(of course, I say this aware that e.g. the classic Mac OS was even worse with neither protected memory nor pre-emptive multitasking, yet managed to hobble on just about into the early 2000s before an overhaul was finally achieved)
As for PC vs Amiga? Yeah, it was already clear which way the wind was blowing once the PC became the machine with the super-fast CPU and the easy to address video memory. The 3dfx and its competitors just sealed the deal. The open market of commodity hardware from multiple vendors overwhelmingly based around a software platform eventually outdid the closed market of a single vendor overwhelmingly based around a hardware platform. I think it's telling that the only other computer platform to survive the '90s was also overwhelmingly a software platform, not a hardware platform (and, indeed, now largely just uses the same commodity hardware as everyone else).
I think the difference is that Samsung has a lot of space to transition sales from ordinary phones to smartphones, so its gains in smartphone numbers are offset by the decline of the non-smart market.
If you compared methods of playing MP3s then Apple's gains wouldn't look so good as the ongoing decline in iPod sales would have a similar effect.
Yeah, Google's aggressive data collection and hoarding, increasingly closed software and alleged anticompetitive practices (cf: http://www.theverge.com/2013/6/13/4427706/eu-committee-probe-google-over-android-anticompetitive ) are much better than Apple's not-invented-here mentality, explicitly closed software and alleged anticompetitive practices. Also, when Google doesn't want something you submit on its store, it permanently suspends the thing — which is obviously a lot more 'open' than when you submit something to Apple's store which it doesn't want, as Apple will decline to approve it.
Apple's position as whipping boy is not without justification but the degree to which some people separate it from other players is absurd.
Suppose I'm an average consumer and I want a smartphone. I probably decide to purchase an Android because it ticks all my mental boxes: touchscreen, web browser, apps. Having decided to do that, I see the S4 or the HTC One is the king of the market but I also see the Moto G and the Nexus 4 offering a lot more value and I probably have the choice of something else free on my contract. It's quite likely that I don't buy the most expensive option.
Supposing I decide I want an iPhone, because I am already in the Apple ecosystem from iPod times, because I was an early smartphone adopter and don't fancy learning something new, or for whatever other reason, the first thing I can consider is quite an expensive handset. So Apple has protected its profit from me by not offering a mass-market option.
With no evidence other than personal bias to suggest a causative link, surely the correct assumption is the standard one: success is part skill, part luck, and hence extreme success implies extreme luck. Since luck is random, over time extreme luck will always decline (regression to the mean, in other words). So even those who are still doing everything under their own control perfectly will see declining prospects.
(though, obviously, I'd love it if the story were Windows 8 + migration away from this category of computing device while those that were buying the expensive ones continue to have the money to buy one of everything)
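To make the skill-plus-luck point concrete, a quick simulation with toy numbers: pick the top performers in round one and watch their average fall in round two, purely because the luck component doesn't repeat.

    import random

    N = 100_000
    skill = [random.gauss(0, 1) for _ in range(N)]
    outcome = lambda s: s + random.gauss(0, 1)    # success = skill + luck

    round1 = [outcome(s) for s in skill]
    top100 = sorted(range(N), key=round1.__getitem__)[-100:]  # extreme successes
    round2 = [outcome(skill[i]) for i in top100]

    print(sum(round1[i] for i in top100) / 100)   # very high
    print(sum(round2) / 100)                      # markedly lower, despite unchanged skill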
That doesn't sound likely to me. Here's what most people think about phones: they're much of a muchness and the best one to get is whichever is the most nominally expensive that they'll give you for free on your contract.
The iPad similarly doesn't have Calculator. I wouldn't mind if it didn't make me so painfully aware that I'm out of practice with mental arithmetic.
Based on the number of people that seem capable of operating phones based on the Linux kernel, I'd say the tailored user interface obviates any concerns about the ability of the man on the street to use the bash/csh/X11/etc/whatever stack usually associated with 'Linux'.
That being said, I suspect Windows was chosen because it comes with good security support most of the time with the cessation of support having been hand waved away. The Linux guys are very good at updating the kernel but then who's responsible for pushing that to the machines? And the XP support period has beaten any of the commercial Linuxes by quite a stretch.
So we're accusing Apple of breaking third-party Lightning cables without any evidence that they have, comparing them to Microsoft despite that not really being something Microsoft would do, then recommending Amazon cables as still working even though we still have no evidence that others have stopped working or, therefore, that Amazon cables still do?
The cat doesn't eat?
I don't care enough to compare the specs directly but your Samsung is a different proposition because it's an internal drive with no Thunderbolt port.
However, the two other Thunderbolt SSDs available on Amazon are:
• a 256GB Lacie for £254 (which, like the reviewed product, also has a USB3 port); and
• a 256GB MiniPro for £238 (with no additional USB3).
So the Lacie is more than 40% cheaper but appears to be from a year and a half ago so is likely slower. However the MiniPro is not yet a year old and boasts of being "capable" of read speeds "in excess of 500MB/sec" and therefore would be faster if the marketing puff were reliable. So it does sound like a better deal.
Kicking it old school here, I find my AMOLED Nexus One (yes, definitely the AMOLED version: that PenTile matrix is clear) harder to read in direct sunlight than whatever my iPhone 5s has. But maybe it's just not automatically adjusting brightness or something? Better not rule out user error.
Ford at least were rumoured to be sticking with intelligence in the car but via QNX rather than the Microsoft product. I can understand the push towards devolving processing to the phone — apart from making it replaceable I think it also eases several regulatory hurdles since there's one testing standard for normal consumer electronics and another for electronics in cars — but betting on one or two specific handsets now when the car will probably still be in use by somebody in 15 years is silly.
Standards have always been a useful tool for companies to use in certain situations. Look at web standards: useful for Microsoft to ignore so it could gain a monopoly, then useful for Mozilla et al to push so that the competition would all be aiming for the same target, now useful for everyone because nobody has overall control. This is much better for the market but every step along the line has been motivated by individual interests.
In this case the combination of standards you'd need to implement would be significantly more onerous than just supporting (pretty much) a single device and I guess Mercedes et al are more interested in beating each other to market. They'd probably also argue that most of the other similar in-car systems can't connect to your mobile at all for things like GPS, music with the menus up on the dash, contacts, calendar, etc, so it's all a bonus, right?
I hope standards will be forthcoming.
I guess it gets you rich text with an established scripting language and platform-neutral API which doesn't enforce many assumptions about specific typography?
Though I'm not very enthused if the first laundry list item is a "file system browser". My OS already has one of those. Yours does too. They're probably different. Why would either of us want to learn a third?
Just use whatever's already in the OS, please. I don't care how boring it is.
The linked article states:
"A test case could have caught this, but it's difficult because it's so deep into the handshake. One needs to write a completely separate TLS stack, with lots of options for sending invalid handshakes."
So rather than the absence of proper unit testing I'd say it was the absence of exceptional unit testing, but it plays into the usual narrative around Apple's attitude towards security versus the more commerce-oriented firms.
But these same issues must be concerns for Firefox et al. How many variations of ostensibly valid but actually invalid SSL certificate can there be, and has nobody set up test servers that automatically vend those? Writing a unit test that connects to each of those, with and without a data fuzzer, doesn't sound too hard.
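For what it's worth, such servers now exist (badssl.com vends a whole family of deliberately broken certificates) and the test really is short. A minimal Python sketch, assuming network access:

    import socket
    import ssl

    # Each host deliberately serves a differently-invalid certificate.
    BAD_HOSTS = ["expired.badssl.com", "self-signed.badssl.com",
                 "wrong.host.badssl.com"]

    def test_rejects_invalid_certs():
        ctx = ssl.create_default_context()   # verify certificates, as a browser would
        for host in BAD_HOSTS:
            try:
                with socket.create_connection((host, 443), timeout=10) as sock:
                    with ctx.wrap_socket(sock, server_hostname=host):
                        raise AssertionError(host + " was accepted!")
            except (ssl.SSLError, ssl.CertificateError):
                pass                         # good: the handshake was refused

    test_rejects_invalid_certs()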
Mercedes had a self-driving car take itself from Germany to Denmark back in the mid-'90s so are probably used to those tech upstarts duplicating their work by now.
At $60,000 the Tesla is a budget car, silly!
Hey, businesses, wouldn't it be great if your staff were more distracted? Hey, businesses, wouldn't it be great if all your commercial practices — whether genuinely dodgy or just ordinarily proprietary — were more widely and more easily recorded? Etc.
Phwoar! Look at the foraminifera on that!
... or Great Britain? Oh, sorry, it's still optional, right? I'll ask again in a few years.
He was happy to buy a licence to post on Google Play so I suspect he's fine with licensing. Or maybe it's just that his budget was $25 rather than $99/year?
I thought Apple's recent track record on mapping was quite encouraging: they've been slowly and methodically fixing the problems without calling a press conference every three months to announce they're "Revolutionising maps. Again." or whatever. Much better behaviour than I think any of us might have predicted.
Wouldn't that just be one step towards no longer gaining weight? My understanding is that the most recommended way to lose weight is to eat a healthy mix of food in a minimum safe quantity and exercise to create a moderate daily energy deficit.
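The arithmetic behind 'moderate', using the usual rough rule of thumb of about 3,500 kcal per pound of body fat (illustrative numbers only):

    DAILY_DEFICIT_KCAL = 500   # a commonly cited 'moderate' daily deficit
    KCAL_PER_LB_FAT = 3500     # the standard rough conversion

    print(7 * DAILY_DEFICIT_KCAL / KCAL_PER_LB_FAT)   # ~1 lb lost per week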
If I dare challenge the received image: Sega makes most of its money from the manufacture and distribution of coin-operated entertainments — not just video games but a bunch of things. It's always had a very profitable business in that. It had a successful foray into home entertainment but was losing money every year by the time the Dreamcast was on sale. It was therefore smart and pragmatic to scale the home entertainment side down severely while continuing to enjoy the coin-op profits. It's impressive that such a company went for broke with the Dreamcast rather than hedging on the escape strategy, but the latter was always a safe option.
No oblivion, none coming. Just a gamble that didn't pay off.
You appear not to have understood the case.
If Ford said its feature worked "each and every morning" and it didn't then you would have a case.
The court looked at the specific adverts that Apple actually used and determined that there was no problem _specifically because_ Apple never promised it would work "each and every" time.
It's all in the article.
That's exactly what a Commodore owner would say.
For the record, I find OS X's Launchpad to be stupid without a touch screen. But it was added as an extra: nothing else was taken away. If anything, OS X has become more accommodating over time to those of us that keep regular-use applications directly on the dock and the /Applications folder over on the right for Start-menu-like access to everything else, as Apple has introduced the speech-bubble-style folder view that makes a better show of most /Applications folders.
To be fair to Rovio, playing and playing again in Angry Birds was similarly speedy back when it was a paid standalone app. The clutter of advertising has accumulated only after success. So I'm sure it's a classic tale of most of the team understanding the benefits but the marketing team having different ideas.
As for Flappy Bird? I can see the appeal: if you fail then it's unambiguously always your fault, the gameplay doesn't actually progress so there's no having-to-repeat-yourself disincentive to hitting the play button again, and it requires just enough attention to occupy you. So you end up hitting the play button repeatedly and losing track of the time. Meanwhile all it does for revenue is display a small advertising banner at the top of the game-over screen but not during gameplay, which is actually quite smart because it's a contextually justified way to get a lot of impressions and doesn't annoy the user.
There are a lot of theories that Nguyen is some sort of genius — e.g. the rate button was also on the game-over screen in early versions and would appear suspiciously close to where most people tap to fly. Meanwhile Apple's App Store uses recent positive reviews to weight its overall rankings as it attempts to quantify popularity by as many measurements as possible. So that may have helped give the app early momentum, whether intentional or not.