Re: No offline? Screw you guys, I’ll play with myself instead. @Tsung
It may be more that fans of a 1984 game are more prone to ask "if this game requires a server, will I still be able to play it in 30 years?"
Apple, which has recently ended a half-decade class action over media-player lock-in and is still fighting a case on e-book price fixing, having lost to the government? I'd say the US legal system is doing its due diligence.
That all being said, look to the top. The big Microsoft antitrust suit concluded under Clinton. Then Bush came in and everyone — including Microsoft — got somewhat more of a free hand. Even most of the penalties imposed in the case brought in '98 just sort of quietly vanished on appeal.
Competition law protects the market. As Roland6 and others have said, the wrong it defines is using a monopoly position to distort competition.
So why didn't Sun get in trouble when Solaris never offered you a browser ballot? Because that was not abuse of a monopoly position. Why doesn't Apple get in trouble when OS X never offers you a browser ballot? Because that is not abuse of a monopoly position.
Look at the consequences.
Microsoft built a majority market share with a shoddy browser then took steps to lock its platform down and walked away. What effect did the long life of IE6 have on every other part of the internet's technology stack? How much money and how much time was spent dealing with IE6's peculiarities?
Suppose Apple had built Safari not to be especially standards compliant, then baked it closely into the core of OS X and taken market measures to lock out the competing browsers. What effect do you think that would have had on the internet's technology stack? How much money and how much time do you think would be spent dealing with Safari's hypothetical peculiarities?
So, given that the remit is protecting the market, which of those companies was it correct to take action against?
In the worst case, what happens is no benefit for the environment. But it buys the ability for the environmental problem to be fixed centrally. So if cold fusion were discovered tomorrow then they could just plug a couple of those into the grid. Or maybe they'll come around to the idea that new fission stations are the thing environmentally? Renewables don't exactly have a lock on being a better solution if we're optimising for that.
It's nearly 2015. What Windows PC?
I'd rate Android Studio as about a million times as good as Eclipse, and I've also used RubyMine at work so I understand the value of the JetBrains IDE as a transferable skill. But...
IntelliJ is built in Java. So to use it you have to expose your machine to Oracle's vision of a runtime. Ironically for a just-in-time compiler, it seems to have absolutely no concept of just-in-time launching. Let your machine be forever burdened with Java overhead at boot regardless of what you intended to do that day.
It's also quite visibly not native software. It makes a pretty good stab at hitting a middle ground between the OSes and is nowhere near Swing-level awfulness, but expect normal cues to be absent and to go ignored. Git integration is one of the obvious examples: you may have your machine set up with an SSH key and all your other appropriate configuration, but Android Studio comes with its own embedded version of Git that'll ignore all of that and insist you supply Google's software with your username and password. Presumably just using the Git you already have proved to be an issue across targets.
Then there's the real blight: that emulator. The default is painful and only a computer nerd could love the labyrinth of third-party options and associated manual configuration. Guess what? Being a developer doesn't automatically mean loving configuration. For me HAXM is a default install and lots of people love Genymotion, but the real issue feels like the first-party tool just isn't up to snuff.
Other grab-bag complaints: Gradle wants a network connection before you can build anything. There's still no nexus between the IDE and the package manager; the one can know that you're trying to use API 21 and the other can know that API 21 is available, but you're the agent that has to transfer the knowledge.
But I think Google can advance in leaps and bounds when it wants. Android 1.6 was awful. Even 2.x retained significant issues, both technical (no accelerated drawing) and in the user interface (that menu button that nobody ever spotted). So probably the future's bright.
I don't think Apple had much left in its witness buy-off fund this month, since it overspent on helping to cover up the faked moon landings, sheltering the person that really shot JFK and pretending that Obama was born in the US.
I understand that it's the school's budget, but even iPads were considered justified because the third alternative, textbooks, is more expensive still. California, like the other states I'm aware of, requires that textbooks be approved before schools can purchase them, which creates something of a captive audience for the publishers and gives them significant extra costs to defray (especially in terms of risk).
Which seems to be similar to the process for hardware but I guess the fact that Chromebooks and iPads have a huge external audience limits price jacking.
Command+Shift+Control+4 and select an area.
Open Preview and the File menu's Command+N item will have become "New from Clipboard". So select that, or just hit Command+N, for an atomic create-and-paste.
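And if you'd rather script the same trick, here's a rough AppKit sketch that grabs whatever image is on the clipboard and writes it out as a PNG. The /tmp path is purely illustrative; adjust to taste:

import AppKit

// Read whatever image is currently on the general pasteboard.
let pasteboard = NSPasteboard.general
if let image = NSImage(pasteboard: pasteboard),
   let tiff = image.tiffRepresentation,
   let rep = NSBitmapImageRep(data: tiff),
   let png = rep.representation(using: .png, properties: [:]) {
    // Write it out; Preview's "New from Clipboard" followed by Save
    // does much the same thing interactively.
    try? png.write(to: URL(fileURLWithPath: "/tmp/clipboard.png"))
} else {
    print("No image on the clipboard.")
}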
To me it sounded more like AT&T's "value adding" Android customisations may not be functioning correctly; meanwhile the demo units usually run a completely different software configuration full of tutorials and guides.
Chromebooks accounted for 35% of US B2B laptop purchases during the first five months of 2014 per NPD. So Microsoft has been losing its grip on businesses at an unprecedented rate. If Microsoft is focussing effort on trying to segue its business computer hegemony into phone success then it might be better advised not to take so much for granted or it may end up without dominance anywhere. And, yes, I feel old just being able to type that. Things change, I guess.
I use mine rarely because I use it for relatively limited things — web, email, Netflix, Hulu and application development — so I'm in the habit of turning it all the way off when I'm finished. That being an accepted difference between you and me, it's still speedy and working perfectly.
I'm a very casual developer so haven't tried the Lollipop beta and am still running ordinary 4.4 but I'll probably accept the over-the-air upgrade without compunction when it becomes available. My experience from owning an iPad is that these kinds of complaint tend to be very much about edge cases; I can think of uncountable iOS updates that reportedly had users up in arms but which were completely uncontroversial from my subjective point of view.
Netflix is now available via the HTML5 premium video extensions — most controversially the Encrypted Media Extensions, which either (a) seek to corrupt the aim of open standards to allow consumption anywhere; or (b) accept that DRM is the trade-off for some content access and try to make it less vendor-dependent. Depending on where you sit.
If you're accessing Netflix through a browser and your browser isn't IE11/Windows 8.1 or Safari/OS X v10.10 then, yes, it's still Silverlight powered.
But I think a huge proportion of access is now probably tablets, TVs with native clients, set-top boxes, video game consoles, etc, etc, etc. Not Silverlight places.
Haven't you noticed the increasing number of articles from El Reg's San Francisco office, full of American spellings and terms? I don't think there's any intention to be a British publication for British people.
Xenon 2, naturally.
Actually, I didn't know the cheat. But how many games were really famed for their music before the PC could keep up?
If I were asked to guess the KKK's password then I'd be happy that those text boxes usually don't let anybody else see what you're typing.
You could sell geographic distribution information to kebab vans?
The repository is up at https://github.com/microsoft/dotnet and says:
.NET open source projects typically use either the MIT or Apache 2 licenses for code. Some projects license documentation and other forms of content under Creative Commons Attribution 4.0. See specific projects to understand the license used.
.NET Core uses the MIT licence. The .NET Compiler Platform remains Apache 2.0. For comparison, Mono components are primarily licensed via one of the GPL, LGPL or MIT X11 licences.
So I think that's one barrier to trust overcome.
What's wrong with managed code? Is it virtual machines in general or just Microsoft's approach?
I think the Android switch from Dalvik to ART is interesting: Google is switching from just-in-time to ahead-of-time compilation, compiling on the device at the point of app installation, a lot like a traditional make install but from an intermediate byte code rather than from source. It's being promoted as a performance win, eliminating any remaining user-noticeable distinction between 'managed' (in Microsoft parlance) and 'unmanaged' code.
I've seen it argued that such an approach should ultimately prevail everywhere because it resolves the same security issues as an MMU without requiring all those expensive context switches every time a system call is made. That is, given the semantics involved, a proper compiler can generate code that is guaranteed safe to run as ring 0. I have no independent opinion on that other than that it sounds reasonable at a brief parsing to someone who doesn't do anything closely related for a living.
You're not a huge distance from arguing that a CD player could not be more consumer friendly. The simple fact is that — with a hypothetical perfect software SIM — it would be more consumer friendly not to have to carry multiple of these things around, not to have to try to obtain them in foreign languages when you have better things to do with your only seven days in the country, to have the cross-network pricing options clearly tabulated free of marketing puff, etc.
Of course, what Apple supplies is nowhere near the hypothetical perfection. Not even close.
In theory, what's good for the consumer is good for the manufacturer. It is in Apple's interest to create a smoother experience for potential customers because then potential customers are more likely to become actual customers.
Apple would argue it has attempted to do that by creating the soft SIM: it's trying to eliminate that bit where you have to obtain a physical thing and put it into a slot. Which is especially helpful if you're on holiday or one of those people that just travels a lot. Also, were the idea fully adopted, it'd be one less cause of consumer inertia in picking carriers even at home.
I've no idea whether there's a subsidy involved but even if so that doesn't mean Apple hasn't done good for the consumer. In the US you get three options when starting up, one of which locks the SIM forever but the other two of which can be switched between at will depending on the latest pricing. All pricing is the same as obtaining a physical SIM. So the overall experience is better for the consumer.
Of course an even more neutral SIM would be even better, and would probably be even better for Apple too.
Hopefully Google will do something similar and the customer will benefit both from the on-device choice of carriers and from being able to pick their gatekeeper to that list. Or still being able to do the physical thing if they really want.
The multitude of people to have suggested that appear not to use the service. iMessages are more like Google Hangouts than text messages. If you own the connected phone then you can add a phone number as one of your addresses, but after that you'll receive all messages sent to you via your phone, your iPad, your Mac, etc. It's multi-client instant chat. The issue is that your Apple friends end up sending you chat messages when they want to send you text messages, which makes a difference only once your phone can't receive them.
You most likely still receive them on your iPad, Mac, etc. They're still received. There are still receipts being returned.
Unlike the average tech blogger, normal people are perfectly happy to mix and match brands, including to wander in and out of iPhone ownership over the years.
An iPhone will resend as SMS if it can't send as an iMessage but that's mainly about non-data mobile connections still being more widely available than data connections per the frequencies at play. It's a failure-to-send fallback.
Apple doesn't "force customers to register to get something as basic to mobile telephony as SMS messages if they leave Apple". See my other message below. I'm a recent departee. I haven't registered. I still have other Apple devices which receive iMessages sent to my email account. I've had no interruption in texts from my Apple-owning friends.
Though with further hindsight I can only assume that's because I wiped the phone before handing it back (it was a work phone so will now be somebody else's; contrast with if I'd broken it and bought something else or just put it into a drawer). Otherwise how could Apple know?
It'll look appropriately awful. But I think the issue may already be technically fixed. I switched away a couple of months ago and all of my Apple-toting friends' messages are now just arriving by regular text message. I didn't inform Apple, I still use some non-phone iMessage-enabled devices, I kept the same number with no discontinuity of service. I don't know what the applied logic is but I appeared not to lose anything in the switchover.
I don't think you've understood the issue. The suit is for failure to disclose a policy. That is, failure at the time you become a customer. That's when the actionable offence allegedly occurs. It's nothing whatsoever to do with how Apple acts after you're gone and everything to do with how it acts when you're entering.
Yes, you know the one. There are rumoured to be ten levels after it but nobody has ever seen them.
Mostly correct, I'm sure, but if Apple never pays for "... celebrities to tweet 'I love my new iPhone'" then what was all that from the ghost of Joan Rivers?
I confirmed this behaviour by testing against standard, trusted sites like LinkedIn and Gmail.
So, ummm, not all that useful for testing actually, Microsoft.
The spokesman doesn't say that at all. He said: "It all started with the Apple-1". If we're going to assume he meant something more than just "[Apple] all started with the Apple-1" then why stop at the entire digital world? Why not assume he meant that the Apple-1 ushered in the creation of the universe? I certainly can't personally vouch for anything definitely having existed prior to the early-'80s.
If I dare suggest: the spokesman's comment that "when you see a child playing with an iPad or iPhone, not too many people know that it all started with the Apple-1" suggests that he just means that Apple, the noteworthy company, started with the Apple-1. Not the digital age. Not the home computer revolution. Just Apple.
Such other births as you may want to peg to the Apple-1 are entirely at your own discretion.
Ars has an article — http://arstechnica.com/apple/2014/10/the-retina-imac-and-its-5k-display-as-a-gaming-machine/2/ — in which they play the current Alien game at 4k with screenshots, benchmarks and subjective reactions.
I don't want to ruin it too much for you, but:
The 3840x1440 runs appeared visually smooth when I watched them complete, but the numbers tell a bit of a different tale. When I hopped into the game to actually play at that resolution, there was a noticeable amount of mouse lag. Indications are that a faster CPU would have helped considerably, but with the iMac, what you get is what you get.
The Dell you refer to is a TN panel with about 8.3 megapixels and a linear density of around 163 ppi. The same as a first-generation iPhone.
The iMac is an IPS panel with about 14.7 megapixels and a linear density of around 217 ppi. A shade above the Nexus 7.
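For anyone who wants to check the arithmetic, linear density is just the diagonal pixel count over the diagonal size; for the 27-inch iMac:

\[ \mathrm{ppi} = \frac{\sqrt{5120^2 + 2880^2}}{27} \approx 217.6 \]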
It's a fatuous comparison.
There's no reason anybody should have noticed but Nokia phones still do that. The whole back comes off as a single piece and various after-market options are available, including ordinary replacements, more interesting colours, ones with built-in flip covers and probably more.
It depends whether you lump health watches in with smart watches. If you do then mine does the following without needing two hands: monitors my heart rate, steps taken, skin temperature and perspiration level, all so as to determine periods in which I'm sleeping, running, walking or cycling, and therefore to comment on my general fitness.
Which are mostly things my phone doesn't do. Or I wouldn't have bought it.
But who is it working for?
If I name my hotspot "United States Perfect Freedom Democracy Network", will I get a free upgrade to first class?
If anybody wants to know how to stop advertising from working on them then I know one weird old tip. Yours for a song.
If that's your position, didn't the BBC express an opinion you could agree with circa 1989?
I enjoy it, personally. But not on an intellectual level — it's popular, family viewing.
If I dare suggest it: Google's vague, thin version of open source (we'll write it in private, according to our priorities, then show you when it's done: the cathedral model, but they'll sell the bibles over in the bazaar) served its purpose, of getting a certain kind of press for a certain audience when Android wasn't yet at critical mass, but is just no longer necessary. As the runaway winner in smartphones, with a mature and well-received product, Google no longer needs to play to that audience and doesn't otherwise desire to.
There's a bit of devil's advocacy to that statement; I'd love to hear the contrary viewpoint.
The former is the name of a distribution of the latter.
I'm not a fan.
My previous comment simply pointed out that you had misunderstood the Ars article. It's a technical discussion of how Swift works and why it was designed as it is. It is not an evaluation of Swift versus other languages. It is not an evaluation of LLVM versus other compilers.
Similarly my post claims only to be about what the Ars article is meant to convey. It does not at any point make the claim that I know anything more about compiler writing than what is stated in the contents of that article.
But what I'm enjoying is that I accused you of constructing straw men and you responded with:
"Can you explain how concatenating two strings of unknown size at compile-time, and storing the result in a third string, also of unknown size at compile-time, can be strength reduced to the native integer add case?", something neither I nor anybody else had claimed or come close to claiming, before segueing into: "Claiming that it is possible to apply strength reduction to an overloaded plus operator in such a way that it always reduces to the integer add case, and therefore only the integer add case ever needs to be emitted in machine code, is pure bullshit."
I am honestly unsure whether you're just a brilliant satirist. Kudos to you if so.
You might want to reread that article.
The point of that section is: the language was designed by the compiler author. In this specific case he wanted to make sure the plus operator could be defined per class without adding any baggage to a normal integer add — adding nothing, in machine code, beyond what the C would generate — and he wanted to do it without creating a special case. To which a discussion of how a modern compiler works is relevant because it helps us understand the mindset behind the new language.
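As a concrete illustration (my toy example, not the article's): define a wrapper type with its own plus operator and, because the overload and both operands are fully visible to the compiler, the call can be inlined away entirely.

// A wrapper type with its own '+', defined per type rather than as a
// compiler special case.
struct Metres {
    var value: Int
}

func + (lhs: Metres, rhs: Metres) -> Metres {
    return Metres(value: lhs.value + rhs.value)
}

let total = Metres(value: 2) + Metres(value: 3)
print(total.value) // 5

// With everything visible, the optimiser can inline the overload so
// the emitted machine code is just the same integer add that a plain
// Int + Int (or the equivalent C) would produce.

That's the "no baggage beyond what the C would generate" claim in miniature.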
Not a single line of that piece claims "Apple Has Done It Again". You're erecting a straw man.
I don't agree with the second statement, based on the LLVM track record. It was first released in 2003. Chris Lattner was hired by Apple specifically to work on LLVM for them in 2005, where amongst other duties he has added to it technologies intended to make it easier to write efficient and safe software for iOS and Mac OS X: ARC being the most obvious one. Those are completely open source.
Apple has effectively owned LLVM for more than 80% of its life and hasn't made any attempt to close the source — which might have been legally possible (as the original author's ownership isn't subject to the licence he offers the rest of the world, but I don't know who else had contributed) but regardless could easily have been achieved in practice by having Lattner work on a new compiler that wasn't LLVM.
I'm willing to bet that Swift isn't open source because, as evidenced by Apple telling everyone that the exact final syntax is still up for grabs, the company doesn't want to commit to anything yet.
... and how has he been judged?
Golden ages are subjective.
Pssst... insider's tip: iOS 7 does panoramas. No need for you to upgrade!
I wouldn't get too used to soldered RAM; it's clearly going to end up on the same physical silicon as the CPU before too long. Not for everyone, of course, in the same way that the compromises made to have the GPU on the same die aren't good for a lot of users — creatives and gamers, in particular — but I would dare imagine it'll be something Apple does everywhere except the Mac Pro.
I was pleasantly surprised that the 5k iMac still has RAM sockets. In a machine that size and that price I shouldn't have needed to be surprised at all.
Using vanilla v10.10, the following is displayed every time I hit the spotlight icon:
In addition to searching your Mac, Spotlight now shows suggestions from the Internet, iTunes, App Store, locations nearby, and more. To make suggestions more relevant to you, Spotlight includes your approximate location with search requests to Apple.
You can change this in Preferences. Learn more…
... with that final bit being a link. So no intelligence required: just ordinary reading.
It would appear they've continued updating it. Reinstalling it should give that pleasingly 10.3 feel to the day.
i5 is a brand name. It is not a model number. The i5 processors are on at least their third micro-architecture. Counting the cores and looking at the clock rate is fatuous.