That hoverbike level!
Yes, you know the one. There are rumoured to be ten levels after it but nobody has ever seen them.
Mostly correct, I'm sure, but if Apple never pays for "... celebrities to tweet 'I love my new iPhone'" then what was all that from the ghost of Joan Rivers?
I confirmed this behaviour by testing against standard, trusted sites like LinkedIn and Gmail.
So, ummm, not all that useful for testing actually, Microsoft.
The spokesman doesn't say that at all. He said: "It all started with the Apple-1". If we're going to assume he meant something more than just "[Apple] all started with the Apple-1" then why stop at the entire digital world? Why not assume he meant that the Apple-1 ushered in the creation of the universe? I certainly can't personally vouch for anything definitely having existed prior to the early-'80s.
If I dare suggest: the spokesman's comment that "when you see a child playing with an iPad or iPhone, not too many people know that it all started with the Apple-1" suggests that he just means that Apple, the noteworthy company, started with the Apple-1. Not the digital age. Not the home computer revolution. Just Apple.
Such other births as you may want to peg to the Apple-1 are entirely at your own discretion.
Ars has an article — http://arstechnica.com/apple/2014/10/the-retina-imac-and-its-5k-display-as-a-gaming-machine/2/ — in which they play the current Alien game at 4k with screenshots, benchmarks and subjective reactions.
I don't want to ruin it too much for you, but:
The 3840x1440 runs appeared visually smooth when I watched them complete, but the numbers tell a bit of a different tale. When I hopped into the game to actually play at that resolution, there was a noticeable amount of mouse lag. Indications are that a faster CPU would have helped considerably, but with the iMac, what you get is what you get.
The Dell you refer to is a TN panel with about 8.3 megapixels and a linear density of around 163 ppi. The same as a first-generation iPhone.
The iMac is an IPS panel with about 14.7 megapixels and a linear density of around 217 ppi. A shade above the Nexus 7.
It's a fatuous comparison.
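Those density figures fall straight out of the panel geometry. A quick sketch, assuming both are 27-inch diagonals (the sizes aren't stated above, so that part is my assumption):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Linear pixel density: diagonal in pixels over diagonal in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# Assuming 27-inch panels for both (my assumption, not stated above):
print(f"Dell 4k: {ppi(3840, 2160, 27):.1f} ppi")   # 163.2 ppi
print(f"iMac 5k: {ppi(5120, 2880, 27):.1f} ppi")   # 217.6 ppi
```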
There's no reason anybody should have noticed but Nokia phones still do that. The whole back comes off as a single piece and various after-market options are available, including ordinary replacements, more interesting colours, ones with built-in flip covers and probably more.
It depends whether you lump health watches in with smart watches. If you do then mine does the following without two hands: monitors my heart rate, steps taken, skin temperature and perspiration level, all so as to determine periods in which I'm sleeping, running, walking or cycling, and therefore to comment on my general fitness.
Which are mostly things my phone doesn't do. Or I wouldn't have bought it.
But who is it working for?
If I name my hotspot "United States Perfect Freedom Democracy Network", will I get a free upgrade to first class?
If anybody wants to know how to stop advertising from working on them then I know one weird old tip. Yours for a song.
If that's your position, didn't the BBC express an opinion you could agree with circa 1989?
I enjoy it, personally. But not on an intellectual level — it's popular, family viewing.
If I dare suggest it: Google's vague, thin version of open source (we'll write it in private, according to our priorities, then show you when it's done: the cathedral model, but they'll sell the bibles over in the bazaar) served its purpose, of getting a certain kind of press for a certain audience when Android wasn't yet at critical mass, but is just no longer necessary. As the runaway winner in smartphones, with a mature and well-received product, Google no longer needs to play to that audience and doesn't otherwise desire to.
There's a bit of devil's advocacy to that statement; I'd love to hear the contrary viewpoint.
The former is the name of a distribution of the latter.
I'm not a fan.
My previous comment simply pointed out that you had misunderstood the Ars article. It's a technical discussion of how Swift works and why it was designed as it is. It is not an evaluation of Swift versus other languages. It is not an evaluation of LLVM versus other compilers.
Similarly my post claims only to be about what the Ars article is meant to convey. It does not at any point make the claim that I know anything more about compiler writing than what is stated in the contents of that article.
But what I'm enjoying is that I accused you of constructing straw men and you responded with:
"Can you explain how concatenating two strings of unknown size at compile-time, and storing the result in a third string, also of unknown size at compile-time, can be strength reduced to the native integer add case?", something neither I nor anybody else had claimed or come close to claiming, before segueing into: "Claiming that it is possible to apply strength reduction to an overloaded plus operator in such a way that it always reduces to the integer add case, and therefore only the integer add case ever needs to be emitted in machine code, is pure bullshit."
I am honestly unsure whether you're just a brilliant satirist. Kudos to you if so.
You might want to reread that article.
The point of that section is: the language was designed by the compiler author. In this specific case he wanted to make sure the plus operator could be defined per class without adding any baggage to a normal integer add — adding nothing, in machine code, beyond what the C would generate — and he wanted to do it without creating a special case. To which a discussion of how a modern compiler works is relevant because it helps us understand the mindset behind the new language.
Not a single line of that piece claims "Apple Has Done It Again". You're erecting a straw man.
I don't agree with the second statement, based on the LLVM track record. It was first released in 2003. Chris Lattner was hired by Apple specifically to work on LLVM for them in 2005, where amongst other duties he has added to it technologies intended to make it easier to write efficient and safe software for iOS and Mac OS X: ARC being the most obvious one. Those are completely open source.
Apple has effectively owned LLVM for more than 80% of its life and hasn't made any attempt to close the source — which might have been legally possible (as the original author's ownership isn't subject to the licence he offers the rest of the world, but I don't know who else had contributed) but regardless could easily have been achieved in practice by having Lattner work on a new compiler that wasn't LLVM.
I'm willing to bet that Swift isn't open source because, as evidenced by Apple telling everyone that the exact final syntax is still up for grabs, the company doesn't want to commit to anything yet.
... and how has he been judged?
Golden ages are subjective.
Pssst... insider's tip: iOS 7 does panoramas. No need for you to upgrade!
I wouldn't get too used to soldered RAM; it's clearly going to end up on the same physical silicon as the CPU before too long. Not for everyone, of course, in the same way that the compromises made to have the GPU on the same die aren't good for a lot of users — creatives and gamers, in particular — but I would dare imagine it'll be something Apple does everywhere except the Mac Pro.
I was pleasantly surprised that the 5k iMac still has RAM sockets. In a machine that size and that price I shouldn't have needed to be surprised at all.
Using vanilla v10.10, the following is displayed every time I hit the spotlight icon:
In addition to searching your Mac, Spotlight now shows suggestions from the Internet, iTunes, App Store, locations nearby, and more. To make suggestions more relevant to you, Spotlight includes your approximate location with search requests to Apple.
You can change this in Preferences. Learn more…
... with that final bit being a link. So no intelligence required: just ordinary reading.
It would appear they've continued updating it. Reinstalling it should give that pleasingly 10.3 feel to the day.
i5 is a brand name. It is not a model number. The i5 processors are on at least their third micro-architecture. Counting the cores and looking at the clock rate is fatuous.
That would be the issue raised by someone who titled their post "Radeon 290X to drive a 5k display?"
As to heat though, I'll bet Apple has just underclocked it. The advertised fill rate for that GPU is 64 gigapixels per second: enough to fill every pixel on screen more than 4,340 times a second. Suppose they underclocked it by 50% (which they won't have); that's naively still enough to paint every pixel 36 times at 60 frames per second. Or more than enough to paint every pixel once even when you factor in that advertised fill rates aren't really achievable and the translucency blur costs quite a few samples*. And OS X does not repaint every pixel for every frame; it just reacts to the changes (yes, with some redundancy, because it's a broad-phase test).
* but not as many as you think; per empirical investigation Apple is using a smart mix of mip map samples to very closely approximate a Gaussian in four or five samples.
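The back-of-envelope arithmetic is easy to check; the 64 gigapixel figure is the advertised rate and the 50% underclock is the hypothetical from above:

```python
# Fill-rate sanity check for a 5120x2880 display.
# Assumptions from the post: 64 gigapixel/s advertised fill rate,
# a hypothetical 50% underclock, and a 60 Hz refresh.
pixels = 5120 * 2880                       # 14,745,600 pixels on screen
advertised_rate = 64e9                     # pixels per second

fills_per_second = advertised_rate / pixels
print(round(fills_per_second))             # 4340 full-screen fills per second

underclocked_rate = advertised_rate * 0.5
fills_per_frame = underclocked_rate / pixels / 60
print(round(fills_per_frame))              # 36 fills per frame at 60 fps
```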
Android launches occur all the time and Google's own hardware doesn't dominate the market. It's therefore less newsworthy.
As to your random accusation of bias? Let's check the historical record:
After launch, the newer Nexus 7 sold fewer than a million units a month: http://www.fool.com/investing/general/2013/11/11/why-isnt-googles-new-nexus-7-outselling-its-predec.aspx . The iPad Air sold more than five million units a month after its launch: http://www.wired.com/2014/04/apple-q2-earnings-2/
The BBC put both of the launches of the last two days on its front page (see https://web.archive.org/web/20141015182903/http://www.bbc.co.uk/news/ ).
The difference is that the launch that's likely to interest more than five times as many people got one of the picture boxes rather than just a single line, giving it much greater prominence.
I'd suggest this is because the BBC reports based on likely reader interest. It's pretty much given that the iPads will massively outsell the tiny subset of new Android devices that was announced yesterday.
The screen is 5120x2880 pixels — coming up to 15 megapixels. At 32 bits per pixel, that requires a frame buffer of 56.25 megabytes. A 2GB GPU can therefore store more than 36 copies of the display before running out of space. So there's plentiful room for all of the things a GPU caches.
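A quick check of those numbers:

```python
# Frame buffer arithmetic for the 5k iMac panel.
width, height = 5120, 2880
bytes_per_pixel = 4                         # 32 bits per pixel
framebuffer = width * height * bytes_per_pixel

print(framebuffer / 2**20)                  # 56.25 MiB per frame buffer

vram = 2 * 2**30                            # 2 GiB of GPU memory
print(vram // framebuffer)                  # 36 whole copies of the display
```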
What do you imagine the issue to be?
Your wallet? You're so old fashioned!
It would have been a very practical option but Aérospatiale-BAC didn't see that oil crisis coming any more than anyone else and when BA realised that most fares were corporate they multiplied up the price.
Doing the big post-crash press relaunch on the morning of the 11th of September 2001 also turned out not to be ideal with hindsight.
With this and the Nexus 9 going after the market for laptop-type-things, do you get the feeling that perhaps the Android and Chrome teams aren't the best of friends?
64-bit ARM refashions quite a lot of the architecture so as to achieve advantages quite distinct from just having a larger address space. Including:
• more than twice as many integer registers (31 general purpose versus 13);
• more, and wider, classical floating point registers;
• double precision SIMD; and
• better synchronisation primitives.
There are also some performance-oriented subtractions. ARM used to be famous for making every instruction conditional and allowing each to include a barrel shift. Both of those things are gone in favour of a shorter pipeline.
Also, AES, SHA1 and SHA256 are now implemented in hardware.
There's also the nature of both Objective-C and Java: they're both objects-on-the-heap languages with object types like Integer or NSNumber that are often used just to wrap primitive types like int.
Apple uses 64-bit support to implement tagged pointers: values that aren't correctly aligned, i.e. are identifiably not valid pointers, and instead directly contain the data. So e.g. a 64-bit pointer to an NSNumber holding a 32-bit value is actually the value itself in the pointer, plus some metadata. Nothing is put on the heap. That tagged pointer then effectively gives life on the stack to objects without affecting the semantics of objects on the heap. Which, besides anything else, is good for avoiding page faults. I assume ART will or does do something similar.
So the 64-bit pointers provide benefits unrelated to simply being able to point to a wider area, potentially for both of the main platforms.
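To illustrate the trick — a simplified sketch, not Apple's actual bit layout, which is undocumented and has changed between OS releases — use the low bit, always zero in an aligned pointer, as the tag:

```python
# Simplified tagged-pointer sketch. Real object pointers are aligned,
# so their low bits are zero; a "pointer" with the low bit set can be
# reinterpreted as an immediate payload instead of a heap address.
# Illustrative layout only, not Apple's actual (undocumented) one.
TAG_BIT = 1

def make_tagged(value: int) -> int:
    """Pack a small integer into the pointer word itself: (payload << 1) | tag."""
    return (value << 1) | TAG_BIT

def is_tagged(ptr: int) -> bool:
    """Aligned heap addresses have a zero low bit; tagged values don't."""
    return ptr & TAG_BIT == TAG_BIT

def untag(ptr: int) -> int:
    return ptr >> 1

p = make_tagged(42)
assert is_tagged(p) and untag(p) == 42    # no heap allocation involved
assert not is_tagged(0x7F00)              # an ordinary aligned address
```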
It's the other way around; Apple set its date first, then Google decided to announce the day before, then Apple leaked this information. The two are clearly playing off each other, quite harmlessly and blamelessly, but Apple tied itself to a date first.
This isn't quite the same thing; one of the use cases is giving the browser the power to select what density of image to download based on its own state. So if you zoom in, it can download a higher-density image. If you're near your data bandwidth limit for the month or it's just loading a preview thumbnail, it can download a lower-density image.
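The selection logic described there can be sketched roughly like this — the function name, thresholds and the 2x device pixel ratio are all made up for illustration; the point is only that the browser, not the page, makes the choice from the candidates on offer:

```python
# Illustrative sketch of browser-side image-density selection.
# All names and thresholds here are invented for illustration.
def pick_density(candidates: list[float],
                 zoom: float = 1.0,
                 data_constrained: bool = False) -> float:
    """candidates: densities the page offers, e.g. [1.0, 2.0, 3.0]."""
    if data_constrained:
        # Near the bandwidth cap, or just a preview thumbnail:
        return min(candidates)
    wanted = zoom * 2.0          # assume a 2x device pixel ratio
    return min(candidates, key=lambda d: abs(d - wanted))

print(pick_density([1.0, 2.0, 3.0]))                         # 2.0
print(pick_density([1.0, 2.0, 3.0], zoom=2.0))               # 3.0
print(pick_density([1.0, 2.0, 3.0], data_constrained=True))  # 1.0
```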
You think it's unlikely that any launch by Google will end up being noisy regardless of Google's intentions... because they have really big market share?
Oh, and apparently the majority of voters agree with you.
"Google is planning to steal some of the thunder from Apple's October 16 event by quietly unveiling a new tablet running the next generation of its Android OS the day before"
Emphasis added; unveiling the day before Apple is not going to be quiet, no matter how modest.
I think the issue is more the principle espoused by Johnson with regard to a different kind of affirmative action:
You do not take a man who for years has been hobbled by chains, liberate him, bring him to the starting line of a race, saying, "you are free to compete with all the others," and still justly believe you have been completely fair...
If institutional barriers have existed then specific action can help to overcome the inertia provided by social barriers. An example social barrier: a person is much more likely to succeed if they have a good mentor, and mentors are much more likely to pick mentees who remind them of themselves.
As to where and for how long such extrinsic forces should be applied? That's where mistakes are made. But I don't see anything wrong with the principle.
With the glut of cheap, very similar tablets, I'd rate stock, Google-Services-enabled Android as one of the most important features.
Most Androids don't have SD slots now either; I don't think it's only about upsell. Those pesky Microsoft patents appear also to be a concern...
... which is also presumably why even the Lumia 630-level of Windows Phone still has expandable memory. They also have removable batteries and are exceedingly cheap. So if anybody really prioritises those things, there's the device for you.
Some quick price comparisons, of some of my current favourite albums:
First Love by Emmy the Great: iTunes, £4.49; Qobuz €9.99 (about 75% more expensive)
Gulag Orkestar by Beirut: iTunes, £7.99; Qobuz €11.99 (about 18% more expensive)
Anna Calvi's eponymous album: iTunes £5.99; Qobuz €7.99 (about 5% more expensive)
The Stone Roses' eponymous album: iTunes £6.99; Qobuz €12.99 (about 46% more expensive)
In my opinion, 5% is easily worth it for lossless. 18% is borderline. 75% is grossly disproportionate. But then I thought, maybe I just have atypical taste? Or benefitted from checking out generally older albums? So I checked the current top 5. Per Radio 1 that is currently George Ezra, Sam Smith, Ed Sheeran, Jamie T and Barbra Streisand. In all cases I picked the regular versions of albums rather than deluxe; for Qobuz I stuck with the CD-quality prices though often better-than-CD was available.
Combined iTunes price for all five: £40.95. Combined Qobuz price: €69.72 (about 33% more expensive).
Is 33% worth it for lossless? I think a lot of people would say so. But it's not better for less, it's better for a little more.
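For anyone who wants to check my working: the percentages assume a conversion of roughly €1 = £0.785 (the approximate rate at the time of writing; the exact figures shift a little with it):

```python
# iTunes (GBP) vs Qobuz (EUR) premiums, assuming EUR 1 = GBP 0.785.
# The exchange rate is my assumption; adjust it and the figures move.
RATE = 0.785  # GBP per EUR

def premium(gbp_itunes: float, eur_qobuz: float) -> float:
    """Qobuz premium over iTunes, as a percentage."""
    return (eur_qobuz * RATE / gbp_itunes - 1) * 100

albums = {
    "First Love":      (4.49,  9.99),
    "Gulag Orkestar":  (7.99, 11.99),
    "Anna Calvi":      (5.99,  7.99),
    "The Stone Roses": (6.99, 12.99),
}
for name, (itunes, qobuz) in albums.items():
    # Roughly reproduces the 75 / 18 / 5 / 46 per cent figures above.
    print(f"{name}: {premium(itunes, qobuz):.0f}% more on Qobuz")
```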
(aside: personally I'm a big fan of roughtrade.com)
Then just use the storefront. Music is DRM free and comes in a standard, well-supported file format. So when you're shopping around, you can factor in iTunes as an option regardless of whether you've bought any hardware from Apple. A truly price-conscious customer would check all the options.
Why on earth do you think 70% is reduced income? Production, distribution and storage (i.e. the shop's rent, electricity, etc.) all need to be paid for if you buy a CD on the high street.
According to the BBC those three things add up to £2.72 of an £8 CD — see http://www.bbc.com/news/magazine-23840744
So from a physical CD sale, the bit equivalent to that for which Apple charges 30% adds up to... 34%.
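That last figure is just the BBC's numbers divided out:

```python
# Physical-retail costs as a share of an GBP 8 CD, per the BBC
# figures above, versus Apple's 30% store cut.
physical_costs = 2.72   # production + distribution + storage
cd_price = 8.00
print(f"{physical_costs / cd_price:.0%}")   # 34%
```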
It sounds more like they'd be pitching at the Asus Transformer type of thing. You know, from the completely unsubstantiated rumour.
I think it's (i) to provide an entry point; and, therefore, (ii) to broaden appeal. Nothing you learn in primary or secondary school is a particularly thorough treatment of the subject — you're supposed to discover your talent and carry it through to the rest of your life (via university, if appropriate).
You're obviously at the top end of talent; so the objective is to provide something that is suitable for a whole-class environment so that people like you who might otherwise not be introduced to programming at all can head off and discover the in-depth stuff.
Maybe what you want is the FUZE — http://www.fuze.co.uk ? El Reg has reviewed it; it's a large metal keyboard case that the Pi fixes into, with ports accessible round the back and a breadboard on top that's connected to the GPIO.
Judges decide issues of law, juries decide issues of fact. So e.g. if the law states that someone must do A, B, C to achieve crime Q but the prosecution presents only evidence of A and B — failing even to mention C — then the judge should decide that the jury cannot find a verdict on crime Q. Specifically what happens is that the judge phrases the questions the jury must answer. If the jury answers different questions then that has no effect.
Like if they came back and said "the defendant is not guilty... because Ted did it!"; if Ted weren't a defendant then that wouldn't mean he'd been found guilty of the crime.
British legislation tells the British courts that if Strasbourg has said something then it should be taken into account as far as possible. British law also says that if British legislation explicitly contradicts anything said by Strasbourg or contained within the ECHR then the court must follow the British legislation.
Apologies to the Daily Mail contingent.
If El Reg has only 2000 regular readers then we needn't worry about it doing anything sinister any time soon. Or indeed continuing to do much of anything.
The Consumer Reports verdict was "We expect that any of these phones should stand up to typical use". That was after laboratory tests, carried out by qualified professionals, for a long-established organisation that would have no business if it were believed to offer unreliable reports to consumers.
I'm confident the random commenters and bloggers of the internet will somehow have greater insight than the professionals.
And, yes, it also found the Samsung twice as able to stand up to atypical usage.
It's a shame they didn't test any regularly-sized Samsungs; they're probably even more sturdy.
Consumer Reports is a thoroughly trusted organisation with an 80-year history. There's zero evidence of corruption here.
If you're the sort of person that jumped to that conclusion then here are some more facts that may rock your world: man really did land on the moon, the Earth is round, Obama was born in Hawaii.