If that's your position, didn't the BBC express an opinion you could agree with circa 1989?
I enjoy it, personally. But not on an intellectual level — it's popular, family viewing.
If I dare suggest it: Google's vague, thin version of open source (we'll write it in private, according to our priorities, then show you when it's done: the cathedral model, but they'll sell the bibles over in the bazaar) served its purpose of getting a certain kind of press for a certain audience when Android wasn't yet at critical mass, but it's just no longer necessary. As the runaway winner in smartphones, with a mature and well-received product, Google no longer needs to play to that audience and doesn't otherwise desire to.
There's a bit of devil's advocacy to that statement; I'd love to hear the contrary viewpoint.
The former is the name of a distribution of the latter.
I'm not a fan.
My previous comment simply pointed out that you had misunderstood the Ars article. It's a technical discussion of how Swift works and why it was designed as it is. It is not an evaluation of Swift versus other languages. It is not an evaluation of LLVM versus other compilers.
Similarly my post claims only to be about what the Ars article is meant to convey. It does not at any point make the claim that I know anything more about compiler writing than what is stated in the contents of that article.
But what I'm enjoying is that I accused you of constructing straw men and you responded with:
"Can you explain how concatenating two strings of unknown size at compile-time, and storing the result in a third string, also of unknown size at compile-time, can be strength reduced to the native integer add case?", something neither I nor anybody else had claimed or come close to claiming, before segueing into: "Claiming that it is possible to apply strength reduction to an overloaded plus operator in such a way that it always reduces to the integer add case, and therefore only the integer add case ever needs to be emitted in machine code, is pure bullshit."
I am honestly unsure whether you're just a brilliant satirist. Kudos to you if so.
You might want to reread that article.
The point of that section is: the language was designed by the compiler author. In this specific case he wanted to make sure the plus operator could be defined per class without adding any baggage to a normal integer add — adding nothing, in machine code, beyond what the C would generate — and he wanted to do it without creating a special case. A discussion of how a modern compiler works is relevant to that because it helps us understand the mindset behind the new language.
Not a single line of that piece claims "Apple Has Done It Again". You're erecting a straw man.
I don't agree with the second statement, based on the LLVM track record. It was first released in 2003. Chris Lattner was hired by Apple specifically to work on LLVM in 2005, and amongst other duties he has since added technologies intended to make it easier to write efficient and safe software for iOS and Mac OS X: ARC being the most obvious one. Those are completely open source.
Apple has effectively owned LLVM for more than 80% of its life and hasn't made any attempt to close the source — which might have been legally possible (as the original author's ownership isn't subject to the licence he offers the rest of the world, but I don't know who else had contributed) but regardless could easily have been achieved in practice by having Lattner work on a new compiler that wasn't LLVM.
I'm willing to bet that Swift isn't open source because, as evidenced by Apple telling everyone that the exact final syntax is still up for grabs, the company doesn't want to commit to anything yet.
... and how has he been judged?
Golden ages are subjective.
Pssst... insider's tip: iOS 7 does panoramas. No need for you to upgrade!
I wouldn't get too used to soldered RAM; it's clearly going to end up on the same physical silicon as the CPU before too long. Not for everyone, of course, in the same way that the compromises made to put the GPU on the same die aren't good for a lot of users — creatives and gamers, in particular — but I'd dare imagine it'll be something Apple does everywhere except the Mac Pro.
I was pleasantly surprised that the 5k iMac still has RAM sockets. In a machine of that size and at that price I shouldn't have needed to be surprised at all.
Using vanilla v10.10, the following is displayed every time I hit the spotlight icon:
In addition to searching your Mac, Spotlight now shows suggestions from the Internet, iTunes, App Store, locations nearby, and more. To make suggestions more relevant to you, Spotlight includes your approximate location with search requests to Apple.
You can change this in Preferences. Learn more…
... with that final bit being a link. So no intelligence required: just ordinary reading.
It would appear they've continued updating it. Reinstalling it should give that pleasingly 10.3 feel to the day.
i5 is a brand name. It is not a model number. The i5 processors are on at least their third micro-architecture. Counting the cores and looking at the clock rate is fatuous.
That would be the issue raised by someone who titled their post "Radeon 290X to drive a 5k display?"
As to heat though, I'll bet Apple has just underclocked it. The advertised fill rate for that GPU is 64 gigapixels per second: enough to fill every pixel on screen more than 4,340 times a second. Suppose they underclocked it by 50% (which they won't have); that's naively still enough to paint every pixel 36 times at 60 frames per second. Or more than enough to paint every pixel once even when you factor in that advertised fill rates aren't really achievable and that the translucency blur costs quite a few samples*. And OS X does not repaint every pixel for every frame; it reacts only to the changes (yes, with some redundancy, because it's a broad-phase test).
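As a quick sanity check of those sums (taking the advertised 64 Gpix/s figure at face value, and the 50% underclock as a hypothetical):

```python
FILL_RATE = 64e9          # advertised fill rate, pixels per second
PIXELS = 5120 * 2880      # the 5k panel: about 14.7 million pixels
FPS = 60

fills_per_second = FILL_RATE / PIXELS           # full-screen fills per second
underclocked = FILL_RATE * 0.5                  # the hypothetical 50% underclock
paints_per_frame = underclocked / PIXELS / FPS  # paints of every pixel, per frame

print(round(fills_per_second))   # ≈ 4340
print(round(paints_per_frame))   # ≈ 36
```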
* but not as many as you'd think; per empirical investigation, Apple is using a smart mix of mip map samples to very closely approximate a Gaussian in four or five samples.
Android launches occur all the time and Google's own hardware doesn't dominate the market. It's therefore less newsworthy.
As to your random accusation of bias? Let's check the historical record:
After launch, the newer Nexus 7 sold fewer than a million units a month: http://www.fool.com/investing/general/2013/11/11/why-isnt-googles-new-nexus-7-outselling-its-predec.aspx . The iPad Air sold more than five million units a month after its launch: http://www.wired.com/2014/04/apple-q2-earnings-2/
The BBC put both of the launches of the last two days on its front page (see https://web.archive.org/web/20141015182903/http://www.bbc.co.uk/news/ ).
The difference is that the launch that's likely to interest more than five times as many people got one of the picture boxes rather than just a single line, giving it much greater prominence.
I'd suggest this is because the BBC reports based on likely reader interest. It's pretty much given that the iPads will massively outsell the tiny subset of new Android devices that was announced yesterday.
The screen is 5120x2880 pixels — coming up to 15 megapixels. At 32 bits per pixel, that requires a frame buffer of 56.25 megabytes. A 2 GB GPU can therefore store more than 36 copies of the display before running out of space. So there's plentiful room for all of the things a GPU caches.
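The arithmetic, for anyone who wants to check it (assuming 2 GB means 2 GiB of VRAM):

```python
PIXELS = 5120 * 2880        # just under 15 megapixels
BYTES_PER_PIXEL = 4         # 32 bits per pixel
VRAM = 2 * 1024**3          # 2 GiB of GPU memory

framebuffer = PIXELS * BYTES_PER_PIXEL
print(framebuffer / 2**20)  # 56.25 MiB per full-screen buffer
print(VRAM // framebuffer)  # 36 whole copies fit, with room to spare
```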
What do you imagine the issue to be?
Your wallet? You're so old fashioned!
It would have been a very practical option, but Aérospatiale-BAC didn't see that oil crisis coming any more than anyone else did, and when BA realised that most fares were corporate it multiplied up the price.
Doing the big post-crash press relaunch on the morning of the 11th of September 2001 also turned out not to be ideal with hindsight.
With this and the Nexus 9 going after the market for laptop-type-things, do you get the feeling that perhaps the Android and Chrome teams aren't the best of friends?
64-bit ARM refashions quite a lot of the architecture so as to achieve advantages quite distinct from just having a larger address space. Including:
• roughly twice as many usable integer registers (31 general purpose versus the 13 usable in 32-bit ARM);
• more, and wider, classical floating point registers;
• double precision SIMD; and
• better synchronisation primitives.
There are also some performance-oriented subtractions. ARM used to be famous for making every instruction conditional and allowing each to include a barrel shift. Both of those things are gone in favour of a shorter pipeline.
Also, AES, SHA1 and SHA256 are now implemented in hardware.
There's also the nature of both Objective-C and Java: they're both objects-on-the-heap languages with object types like Integer or NSNumber that are often used just to wrap primitive types like int.
Apple uses 64-bit support to implement tagged pointers: pointers that are deliberately misaligned, i.e. identifiably not valid pointers, and which therefore directly contain the data. So e.g. a 64-bit pointer to an NSNumber that contains a 32-bit value is actually the value itself plus some metadata, stored in the pointer. Nothing is put on the heap. That tagged pointer then effectively gives life on the stack to objects without affecting the semantics of objects on the heap. Which, besides anything else, is good for avoiding page faults. I assume ART will or does do something similar.
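A sketch of the idea, if it helps — the encoding here is purely illustrative, not Apple's actual scheme:

```python
TAG = 1  # real heap objects are aligned, so a genuine pointer's low bit is 0

def tag(value):
    """Pack a small integer directly into the 'pointer' word itself."""
    return (value << 1) | TAG

def is_tagged(word):
    """A set low bit flags 'this word IS the value, not an address'."""
    return word & TAG == 1

def untag(word):
    return word >> 1  # arithmetic shift, so negatives round-trip too

word = tag(-42)
assert is_tagged(word) and untag(word) == -42  # no heap allocation involved
```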
So the 64-bit pointers provide benefits unrelated to simply being able to point to a wider area, potentially for both of the main platforms.
It's the other way around; Apple set its date first, then Google decided to announce the day before, then Apple leaked this information. The two are clearly playing off each other, quite harmlessly and blamelessly, but Apple tied itself to a date first.
This isn't quite the same thing; one of the use cases is giving the browser the power to select what density of image to download based on its own state. So if you zoom in, it can download a higher-density image. If you're near your data bandwidth limit for the month or it's just loading a preview thumbnail, it can download a lower-density image.
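A toy sketch of that browser-side decision (the helper name and thresholds are invented for illustration; real browsers weigh more factors):

```python
def pick_density(zoom, near_data_cap, available=(1.0, 2.0, 3.0)):
    """Choose which image density to fetch, as a browser might."""
    if near_data_cap:                 # save bandwidth: take the lowest density
        return min(available)
    wanted = zoom                     # zoomed in means more pixels are wanted
    candidates = [d for d in available if d >= wanted]
    return min(candidates) if candidates else max(available)

print(pick_density(1.0, near_data_cap=True))   # 1.0: the preview/thumbnail case
print(pick_density(2.5, near_data_cap=False))  # 3.0: zoomed in, fetch denser
```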
You think it's unlikely that any launch by Google will end up being noisy regardless of Google's intentions... because they have really big market share?
Oh, and apparently the majority of voters agree with you.
"Google is planning to steal some of the thunder from Apple's October 16 event by quietly unveiling a new tablet running the next generation of its Android OS the day before"
Emphasis added; unveiling the day before Apple is not going to be quiet, no matter how modest.
I think the issue is more the principle espoused by Johnson with regard to a different kind of affirmative action:
You do not take a man who for years has been hobbled by chains, liberate him, bring him to the starting line of a race, saying, "you are free to compete with all the others," and still justly believe you have been completely fair...
If institutional barriers have existed then specific action can help to overcome the inertia provided by social barriers. An example social barrier: a person is much more likely to succeed if they have a good mentor, and mentors are much more likely to pick mentees that remind them of themselves.
As to where and for how long such extrinsic forces should be applied? That's where mistakes are made. But I don't see anything wrong with the principle.
With the glut of cheap, very similar tablets, I'd rate stock, Google-services-enabled Android as one of the most important features.
Most Androids don't have SD slots now either; I don't think it's only about upsell. Those pesky Microsoft patents appear also to be a concern...
... which is also presumably why even the Lumia 630-level of Windows Phone still has expandable memory. They also have removable batteries and are exceedingly cheap. So if anybody really prioritises those things, there's the device for you.
Some quick price comparisons, of some of my current favourite albums:
First Love by Emmy the Great: iTunes, £4.49; Qobuz €9.99 (about 75% more expensive)
Gulag Orkestar by Beirut: iTunes, £7.99; Qobuz €11.99 (about 18% more expensive)
Anna Calvi's eponymous album: iTunes £5.99; Qobuz €7.99 (about 5% more expensive)
The Stone Roses' eponymous album: iTunes £6.99; Qobuz €12.99 (about 46% more expensive)
In my opinion, 5% is easily worth it for lossless. 18% is borderline. 75% is grossly disproportionate. But then I thought, maybe I just have atypical taste? Or benefitted from checking out generally older albums? So I checked the current top 5. Per Radio 1 that is currently George Ezra, Sam Smith, Ed Sheeran, Jamie T and Barbra Streisand. In all cases I picked the regular versions of albums rather than deluxe; for Qobuz I stuck with the CD-quality prices though often better-than-CD was available.
Combined iTunes price for all five: £40.95. Combined Qobuz price: €69.72 (about 33% more expensive).
Is 33% worth it for lossless? I think a lot of people would say so. But it's not better for less, it's better for a little more.
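For anyone who wants to check my sums (the exchange rate is an assumption: roughly €1 = £0.787, as of late 2014):

```python
RATE = 0.787  # assumed GBP per EUR, roughly the late-2014 rate

albums = [                        # (name, iTunes price GBP, Qobuz price EUR)
    ("First Love", 4.49, 9.99),
    ("Gulag Orkestar", 7.99, 11.99),
    ("Anna Calvi", 5.99, 7.99),
    ("The Stone Roses", 6.99, 12.99),
]

premiums = {name: (eur * RATE / gbp - 1) * 100 for name, gbp, eur in albums}
for name, premium in premiums.items():
    print(f"{name}: Qobuz about {premium:.0f}% more expensive")
```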
(aside: personally I'm a big fan of roughtrade.com)
Then just use the storefront. Music is DRM free and comes in a standard, well-supported file format. So when you're shopping around, you can factor in iTunes as an option regardless of whether you've bought any hardware from Apple. A truly price-conscious customer would check all the options.
Why on earth do you think 70% is reduced income? Production, distribution and storage (i.e. the shop's rent, electricity, etc.) all need to be paid for if you buy a CD on the high street.
According to the BBC those three things add up to £2.72 of an £8 CD — see http://www.bbc.com/news/magazine-23840744
So from a physical CD sale, the bit equivalent to that for which Apple charges 30% adds up to... 34%.
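The sum, spelled out:

```python
CD_PRICE = 8.00
PHYSICAL_COSTS = 2.72  # production + distribution + retail overheads, per the BBC

share = PHYSICAL_COSTS / CD_PRICE
print(f"{share:.0%}")  # 34% — more than Apple's 30% cut
```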
It sounds more like they'd be pitching at the Asus Transformer type of thing. You know, from the completely unsubstantiated rumour.
I think it's (i) to provide an entry point; and, therefore, (ii) to broaden appeal. Nothing you learn in primary or secondary school is a particularly thorough treatment of the subject — you're supposed to discover your talent and carry it through to the rest of your life (via university, if appropriate).
You're obviously at the top end of talent; so the objective is to provide something that is suitable for a whole-class environment so that people like you who might otherwise not be introduced to programming at all can head off and discover the in-depth stuff.
Maybe what you want is the FUZE — http://www.fuze.co.uk ? El Reg has reviewed it; it's a large metal keyboard case that the Pi fits into, with ports accessible round the back and a breadboard on top that's connected to the GPIO.
Judges decide issues of law, juries decide issues of fact. So e.g. if the law states that someone must do A, B, C to achieve crime Q but the prosecution presents only evidence of A and B — failing even to mention C — then the judge should decide that the jury cannot find a verdict on crime Q. Specifically what happens is that the judge phrases the questions the jury must answer. If the jury answers different questions then that has no effect.
Like if they came back and said "the defendant is not guilty... because Ted did it!"; if Ted weren't a defendant then that wouldn't mean he'd been found guilty of the crime.
British legislation tells the British courts that if Strasbourg has said something then it should be taken account of, as far as possible. British law also says that if British legislation explicitly contradicts anything said by Strasbourg or contained within the ECHR then the court must follow the British legislation.
Apologies to the Daily Mail contingent.
If El Reg has only 2000 regular readers then we needn't worry about it doing anything sinister any time soon. Or indeed continuing to do much of anything.
The Consumer Reports verdict was "We expect that any of these phones should stand up to typical use". That was after laboratory tests, carried out by qualified professionals, for a long-established organisation that would have no business if it were believed to offer unreliable reports to consumers.
I'm confident the random commenters and bloggers of the internet will somehow have greater insight than the professionals.
And, yes, it also found the Samsung twice as able to stand up to atypical usage.
It's a shame they didn't test any regularly-sized Samsungs; they're probably even more sturdy.
Consumer Reports is a thoroughly trusted organisation with an 80-year history. There's zero evidence of corruption here.
If you're the sort of person that jumped to that conclusion then here are some more facts that may rock your world: man really did land on the moon, the Earth is round, Obama was born in Hawaii.
What does the one thing have to do with the other?
The Slate article appears to be about some idiot who badly faked a video of bending the thing with his own hands. But that was following the initial reports — of it bending of its own volition during the course of a wedding, etc — so is itself a complete side issue to the bendability of the thing.
The HTC One is also made of aluminium, so that's not a "yes, but" between Apple and HTC.
So, honestly, the story is probably threefold:
* Apple not learning from its competitors;
* Apple sacrificing rigidity for thinness (as in, objectively, per Consumer Reports, the 6 and 6+ are much less rigid than the 5s);
* Apple acting in such a way that people apply different standards to it.
Surely the issue is the range? Even amongst well-regarded handsets, Apple is not an outlier.
I still think this is much more a case of the hype not matching the reality (i.e. it's just another handset, with each individual attribute being located within the spectrum of its competitors) and people enjoying any opportunity to take Apple down a peg or two for being so arrogant, than it is of some sort of recall-worthy defect.
Per Consumer Reports — see e.g. http://www.examiner.com/article/consumer-reports-says-iphone-6-and-iphone-6-plus-are-not-very-bendable — the iPhone 6 is in fact no more bendable than the competition.
From worst to best:
HTC One (70 pounds to deform, 90 pounds to separate);
iPhone 6 (70 pounds to deform, 100 pounds to separate);
iPhone 6+ (90 pounds to deform, 110 pounds to separate);
LG G3 (130 pounds both to deform and to separate);
iPhone 5s (130 pounds to deform, 150 pounds to separate);
Galaxy Note 3 (150 pounds both to deform and to separate).
So Apple has remained strictly within industry bounds, never producing either the most robust or the most deformable phone. Maybe the issue is more that it's jumped from being equal with the high end of the table on one measure to being equal with the low end on another?
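The same ranking as a quick sketch, if you want to play with the figures yourself:

```python
# Forces in pounds: (to deform, to separate), per the Consumer Reports figures
phones = {
    "HTC One": (70, 90),
    "iPhone 6": (70, 100),
    "iPhone 6+": (90, 110),
    "LG G3": (130, 130),
    "iPhone 5s": (130, 150),
    "Galaxy Note 3": (150, 150),
}

ranked = sorted(phones, key=phones.get)  # least rigid to most rigid
print(" < ".join(ranked))

deforms = [d for d, s in phones.values()]
# The iPhone 6 sits within the span of its competitors, not outside it
assert min(deforms) <= phones["iPhone 6"][0] <= max(deforms)
```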
The Finder can handle zip files perfectly. It can't handle .cpgz — compressed Unix CPIO archive file — well at all, but stupidly believes that it can.
Top OS X tip: grab The Unarchiver. From the App Store or elsewhere.
I think he means to make the distinction between the time domain and the frequency domain. Assuming perfect instantaneous sampling, everything in between samples in the time domain is lost. But, frequency-wise, there's no new information between samples to miss.
I guess it depends on what you define the totality of the information to be. If you want to record all frequencies up to 20 kHz, and you have regular samples at 40 kHz, then nothing is lost. If your low-pass filter is insufficient then aliasing may even add things that weren't there originally...
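A quick sketch of that aliasing point: a 25 kHz tone sampled at 40 kHz produces exactly the same samples as an inverted 15 kHz tone, so anything above the Nyquist limit that slips past the filter reappears as a frequency that was never there:

```python
import math

FS = 40_000   # sample rate, Hz
N = 200       # a few hundred samples is plenty to show the effect

def sample(freq):
    """Perfect instantaneous samples of a sine at the given frequency."""
    return [math.sin(2 * math.pi * freq * n / FS) for n in range(N)]

above = sample(25_000)                 # above the 20 kHz Nyquist limit
alias = [-s for s in sample(15_000)]   # a 15 kHz tone, inverted

# Sample for sample, the two are identical: the 25 kHz content has
# aliased down to 15 kHz.
assert all(abs(a - b) < 1e-9 for a, b in zip(above, alias))
```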
I saw my first out-on-the-street user today. But I live in San Francisco, so almost a week is actually a surprisingly long delay. I've seen Google Glasses and Segways, after all. You know, while I'm not jealously eyeing up those making their escape back into mainstream society.
Seems to me like Apple's problems are outnumbering Microsoft's right now. Microsoft have just gone in an undesirable direction; Apple appear to have had trouble getting a fully-working piece of software out the door to meet their hardware deadlines.
Heads should roll, but don't ask me in which department: was it an unrealistic deadline, or was a realistic deadline simply handled poorly? So heads probably won't roll.
Just being wrong (at least) once before does not establish that pundits are always wrong.
There are similarly pundits who declared that the G4 Cube would change the computer marketplace, that the Apple TV would kill the Roku stone dead and that Android would never have more than 10% of the market. Some of those same pundits also said the iPad would be the first really successful tablet.
Although that is true of me, if there's no third option then I'm also happy to skip Christmas this year.
The relevant Android APIs are sufficiently low level that Marvin works excellently with a vanilla bluetooth keyboard.
Or possibly harmless, mostly harmless, poor, average, above average, competent, dangerous or elite.
(though I assume the logo's just a lift by the hacking group if the games affected are only those listed)