1749 posts • joined 18 Jun 2009
Could Unified Memory not be a step intended to buy Nvidia better heterogeneous computing options in the future, especially on smaller systems? I'm thinking of things like smartphones where you've got GPU + CPU + RAM in a single module, with the memory actually physically addressable equally by the different components. It'd be good to have naturally parallel things scale automatically between appropriate cores on those, wouldn't it? GPGPU isn't going to be exclusively for the HPC niche forever.
Re: They're coming but they're not selling (@Don Jefe)
If value were "totally irrelevant" then people would use smartphones exactly as much as they currently do, even if data were still 60p/megabyte. I'd suggest that (i) they wouldn't; and therefore (ii) value has _at least some_ relevance.
In wearables terms, if it were a choice between one phone with no associated watch and one with a watch for, say, £50 more, I can see people going for the watch even if they not only then never actually wear it but also if they made exactly the same choice last year and never wore that one either. If it's a choice of paying an extra £300 in a separate transaction for a watch then I don't see that happening.
They're coming but they're not selling
See the sales of smart watches. Smart phones said "now you don't have to go all the way to your home to see email, maps and the web". Wearables say "now you don't have to reach all the way into your pocket to see email, maps and the web". For whom is that a sufficiently compelling proposition?
Re: Laptop resolutions...
95% of Windows applications display incorrectly on high resolution panels: see the various Kirabook reviews for more details. Microsoft's all work perfectly and Adobe's are getting there, so it's obviously something you can solve even with millions of lines of legacy code that's probably pure Win32 in places, but I guess proper scaling is something that was either easy to do incorrectly or requires effort to do at all. Maybe someone with more insight can expand on that?
In the meantime I guess it's difficult for most laptop manufacturers to make the jump.
Re: My two-year old camera has a better screen
Original post withdrawn as I stand heavily corrected; +1 to "Sorry that handle..." for correcting my simple-minded equation of a camera's "1,230,000 dot display" with a 1.23 megapixel display.
Re: Working with dates is hard, and financial guys believe it is not
I misread 'finance' as 'fiance'. The story still sounded familiar.
I don't think so. But Microsoft was found to have violated antitrust law thirteen years ago now; Apple was found to have violated it less than a year ago. The watchdog is there for the one specific purpose of monitoring compliance concerning book pricing.
To be contrary, it sounds fully rotatable to me. The article mentions lenticular displays, which can't be rotated, but seems to say that the rumour is that Amazon will use a regular 2d panel and present the illusion of 3d through eye tracking. So the vector from the centre of the screen to the user's eyes is really all the phone calculates with; rotation doesn't matter.
It's the same thing as those c.2007 videos of the guy who reversed the Wii remote control (infrared sensors on his glasses, Wiimote static and pointed at him).
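As a toy illustration of that eye-vector idea (the function and numbers are entirely hypothetical, in the same spirit as those head-tracking Wiimote demos): each UI layer gets shifted against the viewer's eye position, with deeper layers shifting further.

```python
# Hypothetical sketch: fake depth on a flat panel by shifting each layer
# against the viewer's eye position, as estimated by the front camera.
def parallax_offset(eye_x, eye_y, layer_depth):
    """Deeper layers shift further, opposite to the eye's offset from screen centre."""
    return (-eye_x * layer_depth, -eye_y * layer_depth)

# Eye slightly right of and above screen centre; layers at increasing depth.
for depth in (1, 2, 3):
    print(parallax_offset(0.2, -0.1, depth))
```

Rotation genuinely doesn't enter into it: only the eye-to-screen vector does.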
Re: Still got no proper depth of field
I think this is more likely to turn up in VR headsets first. You don't have to go to a full light field — supposing mechanical latency were magically no problem you "just need to" track the eye's focus, undo it with an adjustable intermediate lens and then apply appropriate depth of field to the rendered image. So you're sort of cancelling the eye's attempt at focus and then pretending that it worked.
Inverted commas are there as per my expertise. How easy is it for us non-engineers to sit here and say "it doesn't sound hard to me"?
Obligatory "What about Samsung?" post
Samsung implemented eye tracking in one of its handsets (the S3? Maybe the S4?), albeit limited to deciding whether to keep the screen awake and allowing implicit scrolling. It's not a huge leap from tracking the pupil to decide how to move 2d content around to tracking the pupil to decide how to move 3d content around, especially when you've already got handsets like Apple's that import a slightly more nuanced idea of interface depth than just plain z-order and use it for very subtle 3d presentations (in that case in response to the gyroscope rather than to eye motion, but you get the point).
But how many years did it take for web-enabled phones and tablet computers to get over the hump?
Re: Wrong. @ThomH @Neoc
As noted, it's EU law, not US, but I'd argue that what Apple has done is "potentially illegal" in the same sense that your post is "potentially written in Portuguese".
Although law in the US and UK is often known for being obtuse, that's often because the statutes were written hundreds of years ago, so the words either have a historic meaning or have had their original meaning slowly winnowed by years of case law (see e.g. malice aforethought in Lord Coke's definition of murder, which essentially means "intention, at the time"; spontaneous mercy killings are murders).
EU law doesn't use the precedential system (i.e. case law doesn't affect what the law subsequently is, it merely directs you as to how it has previously been interpreted — the latest judges may agree or disagree) and is always read in terms of the wrong that was intended to be addressed rather than exactly the words on the page (because it's translated into so many languages and because that's how most of the European national legal systems operate anyway).
The attempted precise technical language still tends to put people off, and there are other barriers like the way the articles all get renumbered when a new treaty requires it, but the net effect is that EU law is probably the easiest to read a little of.
It's Articles 101–106 for competition law. If you're curious then check them out.
Isn't that a bit like installing Firefox on Windows in order to stick it to Safari? Buying the Android — the default mainstream choice — is the thing that will actually have affected Apple; then ignoring Apple's blessed (/strong-armed) choice for an app specific to its OS feels like tilting at windmills.
Re: Wrong. @ThomH @Da Weezil
Then it's probably a good thing I was talking about EU competition law, isn't it?
It's been a decade since I went to law school and I don't practise so I'm probably the definition of a little knowledge being a dangerous thing but the school I attended was definitely inside the UK.
Since there's nothing whatsoever to suggest I was talking about US law, and in any case the EU law is based on the US law, I'm unclear what your point was meant to be.
Re: Wrong. @Neoc
It's nothing like illegal.
Apple doesn't have a monopoly in banner adverts — not across the industry as a whole and not even if you restrict yourself to iOS specifically. The vast majority of apps use something other than iAds. Free, open source iOS SDKs are available from Google, MoPub, Grey Stripe, Flurry and many others.
Since it doesn't have a monopoly, Apple cannot abuse that monopoly.
Competition law protects the marketplace from actual damage. It isn't a sort of corporate equivalent of criminal law.
I guess it depends on your outlook. If you're a strong partisan then either you're now going to download the app on your Android or completely ignore it on your Apple. If you're a pragmatist you're going to ignore the app because of the self-serving attempt at controversy and the many much more popular options available. If you're outside the blog bubble you're not going to hear about this at all.
Re: How many?
As many as you like. I can tell you that our app used to include Flurry and MoPub with a wide range of adaptors but nowadays we mostly just use MoPub with the Google AdMob and Apple iAd adaptors. Anything we sell ourselves is trafficked on MoPub directly and if we've trafficked nothing specifically then it falls back on whichever of AdMob or iAd is offering the better rate at that moment.
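For the curious, the fallback logic is about as simple as it sounds; a minimal sketch with entirely made-up names and rates, nothing like our actual code:

```python
def pick_ad(direct_line_item, admob_ecpm, iad_ecpm):
    """Directly-trafficked inventory always wins; otherwise take the better rate."""
    if direct_line_item is not None:
        return direct_line_item
    # fall back on whichever network is offering the better eCPM right now
    return "admob" if admob_ecpm >= iad_ecpm else "iad"

print(pick_ad("house_campaign", 1.20, 1.35))  # house_campaign
print(pick_ad(None, 1.20, 1.35))              # iad
```

The mediation SDK handles the per-request rate comparison for you; this is just the shape of the decision.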
Re: Fixated on Apple
These are historic documents, but not by a lot — though Samsung has started name-checking companies beyond Apple in its effective but slightly disingenuous* adverts. So I think someone there is aware that if they continue to focus all guns in one direction then a competitor is likely to leapfrog the both of them.
* in which the market leader tells people they're sheep if they buy from a competitor.
Re: Eegads imagine the price! (@Bullseyed)
That's what the poster's inverted commas around 'first' were meant to indicate, I think.
Apple's modus operandi has always been to spot a nascent market (e.g. home computers, GUIs, MP3 players, web-enabled phones) and leap in with the first second generation product to become the early market leader. If unable to keep commanding control of the entire market (i.e. in everything except MP3 players) then the retreat is into the high end niche.
I cannot imagine that Apple could succeed in televisions.
Re: Nah your wrong....
Maybe they've seen which way the wind blew with mobile screens, taken a moment of self reflection to realise how quickly they keep pace, and decided just to launch the watch with a 65" screen?
As I've said to the other post, the exact issue is respect for the right to people's own opinions. Eich believed that people shouldn't be allowed to exercise their own judgment: he put money towards a successful campaign to make his opinion the law of the land.
Especially for those whose rights were affected by the law, his appointment was therefore difficult to make peace with; for some it was evidently unacceptable.
I think it's more that he supported an attempt to legislate about an issue that we may or may not agree with in his private life — he felt that his opinion was so valid that it should be illegal to act contrary to it, indeed he was so sure that he put his money where his mouth was.
For the affected group it wasn't just a matter of knowing that some people disagree with you, it was a matter of not legally being able to do what everyone else takes for granted.
Sadly it's not especially overpriced compared to the Thunderbolt competition. What is it about PCIe being on a cable rather than a slot that seems to add hundreds of pounds to the cost?
Apple was first to... release a device that was truly attractive to the majority of consumers. It achieved this through a successful marriage of blossoming technologies and by using its iPod clout to strong-arm the mobile networks.
If you had asked interested consumers what impressed them in the first iPhone they'd probably mention the multitouch interface (thanks to Apple's acquisition of FingerWorks), the smooth user interface (achievable because GPUs had crossed the necessary power/price threshold) and the unlimited data plans (that's the network clout bit). The screen was also large and high resolution for its time.
But the real issue here isn't whether Samsung dared to plan its products by surveying competing products (hint: it did); it's to what extent it should be allowed to. Obviously if someone had said "iPhone purchasers have reported liking this browser icon, let's use this browser icon" then there'd be no real debate.
If you've really nothing else to do with your day you can check my post history and see that I'm generally positive about Apple but, honestly, I don't think a functioning marketplace is sustainable if we're at the level where minor interface elements like slide to unlock are protectable (regardless of whether Apple should own that one or not). I also don't really understand Apple's strategy here. What benefit has been derived from all this wrangling? It feels like if Apple had just left well alone then it would be in a better market position now, allowing for the negative PR consequences compounded by Samsung's resulting advertising strategy. All I can imagine is that it's so much bluster to get a better price on components.
Re: My hunch (@DougS)
The 5s is 58.6 mm wide; if every single millimetre of that were screen then at 16:9 that would be a diagonal of just barely more than 119.5 mm — 4.7". So I think you're probably on to a winner there.
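The back-of-the-envelope geometry, for anyone who wants to check it (just a quick sketch, nothing official): in portrait the width is the short side of the 16:9 rectangle, so the diagonal is the width scaled by √(16² + 9²)/9.

```python
import math

def max_diagonal_mm(width_mm, long_ratio=16, short_ratio=9):
    """Largest 16:9 diagonal that fits a given handset width (held in portrait)."""
    return width_mm * math.hypot(long_ratio, short_ratio) / short_ratio

diag_mm = max_diagonal_mm(58.6)
print(round(diag_mm, 1), "mm =", round(diag_mm / 25.4, 2), "inches")
```

That lands at just over 119.5mm, i.e. about 4.7", as above.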
I don't want a larger phone
So that aligns me pretty well with maybe 5% of the market. If Apple finally does move closer towards what the other 95% have shown themselves to want then I think it'll be good for them.
Spot fact: the current iPhone and the original 2001 iPod are almost exactly the same width — there's about 2mm difference. So presumably that's the width someone at Apple at some point determined to be comfortable for thumbs.
All that's missing is push mail for mobile clients
... and that's something we used to have but which was taken away. If I were setting up a new account now I'd seriously consider outlook.com but I'm not unhappy enough to leave. I assume the majority of people here managed to jump in early enough to get their actual name as an address?
Re: Oh come on.
@TechnicalBen — the suits from the other main phone companies were over hard engineering patents.
Nokia's was about wireless technology and ended with an out-of-court settlement that guarantees ongoing licensing fees from Apple. I'm not knowledgeable enough about wireless technologies to say a lot beyond that, but I guess Apple accepted some sort of fault even if legally it didn't.
Motorola's claims against Apple have yet to be concluded as they've been stayed pending an EU investigation into whether Motorola have broken competition law by asserting rights over FRAND patents. Though an investigation is just an investigation and nothing more; maybe Motorola acted inappropriately, maybe not. Regardless the dispute is (i) whether Apple acquired a licence automatically because the Qualcomm chips it buys are licensed and are the component that contains the intellectual property; and (ii) if not, whether Motorola was bound by FRAND agreements and/or whether such an offer was made to Apple appropriately.
So technically that dispute is about whether the licensing steps Apple took were acceptable.
From the non-phone companies, the patents asserted have been less impressive. Creative Labs won shortly before the iPhone for hierarchical menus and continues to receive royalties as far as I'm aware. Kodak sued essentially because the iPhone could show a preview before taking a digital image (in practice the lawsuit was actually about ownership of a related patent and whether Kodak could sell it in bankruptcy, but the issue was still that Kodak wanted money for digital image previews).
S3's claims included what appears to be a patent on taking a digital video signal in and sending an analogue video signal out, which sounds quite obvious to me; they ended up winning and then losing, so it probably sounded quite obvious to the judge too. Another was essentially about the way the PowerVR does deferred tile-based rendering — rendering pixels from geometry in a certain order to maximise caching. I know S3 couldn't make the claim stick but I don't know whether that's because of a Motorola-type situation where PowerVR already pays the correct licensing or just because it was felt to be an obvious patent.
Re: Oh come on.
I don't wonder: you detest Apple because you haven't done a proper statistical examination of the field and you suffer from confirmation bias.
Companies you've heard of that have sued Apple first:
• Nokia (starting the first lawsuit of the smartphone era);
• Creative Labs;
• S3 (this is why HTC bought them);
Companies you've heard of that Apple has sued first:
Other lawsuits during the period involving companies you've heard of:
• Oracle v Google;
• Microsoft v Motorola;
• Microsoft v Barnes & Noble (over their use of Android);
• Post-Nortel consortium (including Apple, RIM, Microsoft, Sony, Ericsson) v Google.
(apologies for the horrid formatting; using ul/li actually looked worse and El Reg both doesn't allow double line breaks between paragraphs and doesn't do soft returns)
I've not bothered listing everyone who now pays Microsoft because they use Android.
Maybe you're arguing that Apple's patent claims are weaker than the others because they're all soft design patents? To me that would appear to be a fallacious argument because of the number of times Apple has been sued over patents that the companies involved were legally obliged to offer under FRAND terms but refused to. I'd say that using one kind of patent as permitted under law is less bad than using another kind in a way you've legally prohibited yourself from doing, as in the first case everyone knows the rules, no matter how much you or I might think they're laughable, and in the second the rules are deliberately being broken.
If some of my inevitable down-voters can engage with how I'd be wrong in an assertion that Apple is being no worse than the industry average then I'd be grateful.
Re: It isn't because they're "out to get" the poor third party repairmen
I was going to post a similar thing: people are concluding malice where there is no such evidence — jumping from "Apple makes disproportionately hard-to-repair devices" to "Apple is trying to kill the repair industry so that it can pump up its own profits".
As well as the factors you raise, I think it may also be because the people at Apple genuinely seem to care about shaving millimetres off the products every year. When faced with a conflict they prefer being able to claim thinnest/lightest to being able to claim more repairable.
Re: Stating the obvious ..
The under-30 tag does seem to be arbitrary: this isn't a fashion thing, it's an accessibility thing, and the Internet is no more natural to those born in the '90s than to those born in the '80s. Either it showed up before you were an adult or it didn't.
Re: View but no print
Or you could just print it from your iPad. Options, options, options.
Our two-year-old office HP printer implements whatever system Apple invented for printing so I'm sure plenty of others do. In every iOS app I've tried you just tap print and select the printer. Though there are no drivers and no configuration screens so I'm sure the budget printers don't work, the manufacturers having spotted the lack of an opportunity to load an 800MB binary on every boot that constantly shouts at you with a semi-human voice and pushes advertising for their ink shop into your face*.
(* I recently had to use a standard ~£30 Kodak all-in-one printer with OS X; it was horrid)
Re: It didn't need a big team
The GUI is quite a bit more revised than perhaps is obvious: Core Animation originated on the iPhone and made its way over to the Mac only afterwards (though, publicly, it was on the Mac long before the iPhone was announced). Core Animation primarily does three things: (i) it draws every view to GPU storage, always; (ii) it introduces an extra transform into the composition process, allowing any view to have any linear transform applied to it when drawn to the screen; and (iii) it takes advantage of Objective-C's dynamic runtime, including lookup of setters by name, to introduce common code that can adjust a value from A to B per a function f(t) of time, then targets that one piece of interpolation logic all over the place, making the coding difference between an animated transition and a static one just a couple of lines of code.
I'm willing to bet that stuff, the "how do we run a consistent metaphor at 60Hz on the hardware?" question, is what took the majority of the team's time. iOS became the first GUI OS to work only in the presence of a GPU, but which thing prompted the other? It's all obvious in hindsight but I'll bet time was spent on the back and forth over that.
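To illustrate point (iii), here's the shape of the idea in Python rather than Objective-C (obviously nothing like Apple's actual code, and with made-up class names): one generic piece of interpolation logic, retargeted at any property by name via dynamic attribute lookup.

```python
import math

class Animator:
    """Toy version of the idea: a single interpolator f(t), aimed at
    arbitrary named properties via dynamic attribute lookup."""
    def __init__(self):
        self.animations = []

    def animate(self, target, prop, start, end, duration):
        self.animations.append((target, prop, start, end, duration))

    def tick(self, t):
        # f(t): ease-in-out here, but any curve would slot in
        for target, prop, start, end, duration in self.animations:
            p = min(max(t / duration, 0.0), 1.0)
            eased = (1 - math.cos(p * math.pi)) / 2
            setattr(target, prop, start + (end - start) * eased)

class Layer:
    opacity = 0.0

layer = Layer()
anim = Animator()
anim.animate(layer, "opacity", 0.0, 1.0, 0.25)  # the "couple of lines" per transition
anim.tick(0.25)
print(layer.opacity)  # 1.0 at the end of the animation
```

The animated and static versions of a transition really do differ by only the `animate` call; everything else is shared.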
It caused Facebook shares to drop by more than 6%. Is that a positive thing? I guess it depends on how vindictive you're feeling.
Returning a laptop to PC World ruined this bloke's credit score. Today the Supreme Court ended his 15-year nightmare
Nightmare? Dad's fight?
What did boffins have to say about it all? Was it all actually the fault of asylum seekers? Does it tell us anything about Princess Diana?
Re: Yowser @ToddR
I think the difference is that a lot of the datasets that researchers work with are structured in an inherently parallel fashion — the expensive stuff is purely functional individual results from each of millions or billions of data points. It's the classic CPU versus GPU thing: CPUs are good when you want to do any of a very large number of variable things to a small number of objects, GPUs are good when you want to do a few very exact things to a large number of objects.
So OpenCL or CUDA just naturally get radically better performance than a traditional CPU for a certain segment of users, and those users are the niche at which this card is targeted.
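To make the shape concrete (a made-up example, not anyone's real dataset): every element goes through the same pure function with no dependencies between elements, which is exactly the structure that OpenCL/CUDA hardware eats for breakfast.

```python
# Each data point gets the same pure function; no point depends on another,
# which is precisely the shape that maps well onto GPU-style parallelism.
def magnitude(point):
    x, y, z = point
    return (x * x + y * y + z * z) ** 0.5

points = [(1.0, 2.0, 2.0), (3.0, 4.0, 0.0)]
results = [magnitude(p) for p in points]  # trivially parallel: a GPU runs these at once
print(results)  # [3.0, 5.0]
```

Scale `points` up to millions of entries and the CPU grinds through them one (or a few) at a time, while a GPU dispatches the whole lot as one kernel launch.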
The 6845 is a zero-colour chip
All it does is generate sync signals and addresses, indicate where a (text) cursor should go in the signal, and latch the current logical position if it receives a pulse from a light pen. Actually fetching the video byte and, by whatever means, turning it into colours is left to other circuitry.
It is indeed the same chip used by at least the [8-colour] BBC Micro, [27-colour] Amstrad CPC and [16/256-colour] EGA/VGA cards.
... and that's the absolute most I can possibly contribute to the conversation. I enjoyed the article but, like most Brits, have no idea how large a baseball is.
One assumes last July's huge internal shakeup was related to this: probably too many fiefdoms and no centralised control — fine, the design team was separate and did what it did but then everything filtered down into the traditionally separate teams and they did their traditionally separate things.
Re: The founding fathers!
It's more absurd than you'd think: admittedly obliquely, answering a popular British press criticism of the revolution, Jefferson wrote:
"And can history produce an instance of a rebellion so honourably conducted? ... God forbid we should ever be 20. years without such a rebellion. ... What country ever existed a century and a half without a rebellion? And what country can preserve it's liberties if their rulers are not warned from time to time that their people preserve the spirit of resistance? Let them take arms."
Which obviously doesn't mean that the founding fathers intended the American constitution (and, indeed, the Constitution) to be a flexible thing but probably does mean that Americans need the largely unrestricted right to bear arms.
I disagree. Which means I agree.
It's pretty clear, as stated in the article, that staff at the NSA routinely abuse its powers. Furthermore the whole approach seems to be scattergun with the overwhelming majority of it being aimed at people who have done nothing wrong and will do nothing wrong but, regardless, are given no opportunity to defend themselves or even any notification that they've been surveilled.
So given that this was an attempt at PR, I agree that the NSA is lousy at PR.
You think he was heading to the forums to post a comment blindly supporting [company X] regardless of the story, taking the opportunity to remind people who use products made by [company Y] that they're a sub-monkey laughing stock and objectively wrong?
It was Z, wasn't it? They spent much longer than usual on that only to be pipped to the post by the coincidentally very similar Command & Conquer, then found themselves surrounded by the new world of the Playstation and never quite refound their footing. At least that's how I've heard it told.
Re: @ RyokuMas
255 shots per second? I wouldn't put anything past them but if the Bitmap Brothers really were checking the input state more than five times per frame then hats off to them. I'll bet there are gamers that could tell the difference, too.
Re: I miss my Amiga...
The problem with Workbench was that its nanokernel's architecture — a product of its time — basically depended on message passing between processes being fast. When you don't have protected memory, as on the Amiga, it is fast, because there's no work to do. If you have protected memory then it's usually slow: the kernel must either copy the message from one address space to another, reassign ownership of regions of memory, or be not entirely protective, giving everyone access to shared messaging memory and still allowing one misbehaving application to mess up a bunch of others. Therefore Workbench as constituted had a definite expiry date.
(of course, I say this aware that e.g. the classic Mac OS was even worse with neither protected memory nor pre-emptive multitasking, yet managed to hobble on just about into the early 2000s before an overhaul was finally achieved)
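The cost difference is easy to caricature (a Python toy, not real kernel code): without protection a "send" is just handing over a reference, while separate address spaces force a copy — and the unprotected shortcut is also exactly the shared-memory hazard described above.

```python
# Without memory protection, "sending" a message is just handing over a pointer;
# with protected address spaces the kernel must copy (or remap) the payload.
def send_unprotected(queue, message):
    queue.append(message)          # O(1): receiver sees the very same memory

def send_protected(queue, message):
    queue.append(bytes(message))   # O(n): payload copied across the boundary

inbox = []
payload = bytearray(b"REPLY_PORT:...")
send_unprotected(inbox, payload)
payload[0:5] = b"OOPS!"            # sender can still scribble on the "delivered" message
print(bytes(inbox[0][:5]))         # the shared-memory hazard in miniature
```

On the Amiga everything was the fast path, which is why the design flew in 1985 and couldn't survive the move to protected memory.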
As for PC vs Amiga? Yeah, it was already clear which way the wind was blowing once the PC became the machine with the super-fast CPU and the easy to address video memory. The 3dfx and its competitors just sealed the deal. The open market of commodity hardware from multiple vendors overwhelmingly based around a software platform eventually outdid the closed market of a single vendor overwhelmingly based around a hardware platform. I think it's telling that the only other computer platform to survive the '90s was also overwhelmingly a software platform, not a hardware platform (and, indeed, now largely just uses the same commodity hardware as everyone else).
I think the difference is that Samsung has a lot of space to transition sales from ordinary phones to smartphones, so its gains in smartphone numbers are offset by the decline of the non-smart market.
If you compared methods of playing MP3s then Apple's gains wouldn't look so good as the ongoing decline in iPod sales would have a similar effect.
Re: Don't care.
Yeah, Google's aggressive data collection and hoarding, increasingly closed software and alleged anticompetitive practices (cf: http://www.theverge.com/2013/6/13/4427706/eu-committee-probe-google-over-android-anticompetitive ) are much better than Apple's not-invented-here mentality, explicitly closed software and alleged anticompetitive practices. Also when Google doesn't want something you submit on its store, it permanently suspends the thing — which is obviously a lot more 'open' than when you submit something to Apple's store which it doesn't want, as Apple will decline to approve it.
Apple's position as whipping boy is not without justification but the degree to which some people separate it from other players is absurd.
The power of not offering options?
Suppose I'm an average consumer and I want a smartphone. I probably decide to purchase an Android because it ticks all my mental boxes: touchscreen, web browser, apps. Having decided to do that, I see the S4 or the HTC One is the king of the market but I also see the Moto G and the Nexus 4 offering a lot more value and I probably have the choice of something else free on my contract. It's quite likely that I don't buy the most expensive option.
Supposing I decide I want an iPhone, because I am already in the Apple ecosystem from iPod times, because I was an early smartphone adopter and don't fancy learning something new, or for whatever other reason, the first thing I can consider is quite an expensive handset. So Apple has protected its profit from me by not offering a mass-market option.
Just regression towards the mean?
With no evidence other than personal bias to suggest a causative link, surely the correct assumption is the standard: success is part skill, part luck and hence extreme success implies extreme luck. Since luck is random, over time extreme luck will always decline. So even those who are still doing everything under their own control perfectly will see declining prospects.
(though, obviously, I'd love it if the story were Windows 8 + migration away from this category of computing device while those that were buying the expensive ones continue to have the money to buy one of everything)
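For anyone who'd rather see the effect than take it on trust, a quick simulation with purely illustrative numbers: model each outcome as skill plus luck, pick the top 1% of one period, and watch their next period.

```python
import random

random.seed(1)
N = 100_000
skill = [random.gauss(0, 1) for _ in range(N)]
year1 = [s + random.gauss(0, 1) for s in skill]  # outcome = skill + fresh luck
year2 = [s + random.gauss(0, 1) for s in skill]  # same skill, different luck

# Take the top 1% performers of year one and see how they do in year two.
top = sorted(range(N), key=lambda i: year1[i], reverse=True)[:N // 100]
mean1 = sum(year1[i] for i in top) / len(top)
mean2 = sum(year2[i] for i in top) / len(top)
print(mean1 > mean2)  # True: extreme success regresses even with unchanged skill
```

The top group is still well above average in year two — the skill component is real — but its results decline, with nobody having done anything differently.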
That doesn't sound likely to me. Here's what most people think about phones: they're much of a muchness and the best one to get is whichever is the most nominally expensive that they'll give you for free on your contract.