They've reinvented the portal!
Remember how one of the reasons to abandon Altavista/etc and start using Google was the minimal page design that helped to keep the information you're actually interested in more prominent? Google doesn't.
The EDL is considered, rightly or wrongly, to be far-right because of its deeds rather than because of its words. On several occasions, the EDL has planned a protest only for it to descend into hooligans who self-associate with the EDL smashing up private property. People like those caught on video here: http://www.liveleak.com/view?i=794_1369521652
The public perception has therefore become that EDL marches are likely to descend into hooligans destroying private property. People also generally associate hooliganism with right-wing sentiments, probably because of the serious problems football has had with both hooliganism and with racism, tying the two things together in the public imagination.
Even taking your argument as accurate, and starting from the premise that the connection is a misperception, EDL supporters like yourself further cement the conclusion by blaming it on "the liberal left wing". Where is a group that despises the liberal left wing most likely to be on the political spectrum?
To a much lesser extent there's the 'English' in 'English Defence League'. Where do groupings that explicitly reference England, Britain or the United Kingdom in their name tend to fall on the political spectrum?
Those are the reasons that the EDL is perceived, and continues to be perceived, as right wing. If, as an EDL supporter, you want to shed the label, those are the problems you want to address.
Here's a test: I strongly dislike the EDL. From that statement in isolation, where would you assume my political beliefs lie?
Yeah, that's how it works. If I see a Conservative Party political broadcast that tells me they'll fix the economy but then don't vote Conservative then that must mean I want to break the economy?
The problem with guaranteeing free speech is that you can't hold it up as the be all and end all while simultaneously saying that the EDL have the right to say anything they want but Anonymous don't have the right to say that person X is a member of the EDL.
At some point you have to balance the rights of one group against the rights of another. In this case I think the defamation angle is the right one to follow. If Anonymous has misidentified anybody then those people will likely be subject to a heavy adverse reaction. It's the rights of those individuals that should properly restrict the right to free speech.
So, yes, I'm against what Anonymous has done. I'm also against the EDL but that's neither here nor there. But I disagree with what's happened not because I think free speech is an absolute right but rather because I think that limitations are justified in specific limited cases.
I'm not so sure about gamers — it could go either way. Without a middle tier of people that don't care if they play at 20fps with low-quality textures to pump up the sales, there's just not all that large of a market. So then what are people going to buy the high-end computers for? It'll become an ever shrinking niche.
Another way of phrasing it: 90% of games are going to be developed for tablets and similar devices. Those games are not going to scale well. So why bother spending the money to try?
If this is really the way the conversation is going, it's pretty easy to rattle off the systems that were technologically superior in many respects to Windows 3.0 in 1990. Off the top of my head: The Amiga Workbench, RISC OS (both as already mentioned), NextStep, OS/2, NeWS, X + e.g. OpenWindows.
Of those, NextStep, OS/2 and NeWS are probably the ones worth singling out for special praise. All three are preemptive, use a protected memory scheme and provide the sort of user-land libraries that we now usually consider to be part of an OS.
The iPhone isn't dying now, but the iPod wasn't dying in 2007. Modern Apple likes to get each new cash cow up to speed before the previous goes into decline.
That said, if a watch is all they can come up with then the future's probably not bright.
The Z80's (and 6502's, and others') undocumented opcodes were merely relics of the decoding process — they weren't intended to be hidden. The reason they became well known was that one or two of them were found to do useful things in computers where a single supplier had provided the same model of CPU for the entire production run.
For example, there's 'shift left and insert a 1 in the least significant bit' on the Z80 that can be used to make a faster scroll in some cases and to help with certain methods of sprite compositing. It's a relic of shift right arithmetic and fills a pretty obvious numerical hole in the instruction map. So if you know that every 48k Spectrum uses a Z80 with that instruction then why not use it?
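In case the effect isn't obvious, the instruction in question (usually written SLL or SL1) just shifts left and forces bit 0 high. A quick Python sketch of what it does, with plain code standing in for the Z80:

```python
def sll(value: int) -> int:
    """Undocumented Z80 SLL/SL1: shift left one bit and set bit 0.

    The official SLA shifts a 0 into bit 0; SLL shifts in a 1,
    which is what makes it handy for the scroll and compositing
    tricks mentioned above. Result is masked to 8 bits, as in
    the real register.
    """
    return ((value << 1) | 1) & 0xFF

# 0100 0000 -> 1000 0001: the vacated bit is a 1, not a 0.
print(hex(sll(0x40)))  # 0x81
```

The only difference from the documented SLA is that trailing 1; repeated applications walk a solid run of 1s across the byte instead of zero-filling it.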
So this is unlike the classic situation because the motivation is different — the new operations are hidden on purpose.
"Haha, all our money comes from markets you don't compete in — but we hear Samsung are getting rich off phones"?
I quite like the ribbon and don't fully understand the dislike for it. Especially once you've set it to automatically hide, it's a pull-down menu that has icons as well as words. Then once you can rely on people's ability to discern pictures more quickly than words, you can go back to the old-school approach of putting things in the drop downs rather than in toolbars. One of the reasons they did that was that screens used to be smaller; now that laptops are the predominant form of computer, screens are smaller again.
I guess the counter argument is that the icons don't add anything to the words or the words don't add anything to the icons so one or the other just acts as visual noise, spreading everything out so as to make navigation more laborious? I can't say I've faced that problem but I'm hardly a power user — in Word I use little beyond style sheets and am sufficiently fussy that I expect not to set them up in a way that satisfies me very quickly.
Sorry to be the bearer of bad news, but she isn't. Her LinkedIn profile says she left in February 2013 — no doubt in that big round of layoffs they did.
If you give me $150,000 then I could give you some of my time trying to figure out how to recreate old arcade machines.
Straw man begets straw man:
Yes, silly us in the rest of the world. We forgot that the death penalty and free access to guns are inalienable requirements for democracy. That's why there aren't any democracies in Europe — we're all just oppressed socialists because we have things like universal healthcare.
It probably doesn't say anything, but the American Constitution recognises rights rather than granting them, so silence really just means it takes no explicit position. Probably the more persuasive argument is that it'd be a bit ridiculous if the right to bear arms were recognised but not the right to make them usable.
I think they just mean that, in the style of Citrix, OnLive, VNC or a host of others, you could use their streaming to stream a moving picture of a computer program rather than moving pictures of actors. That program would probably be hosted as a virtual machine on the originating server. Which gives them a neat extra buzzword vaguely to attach to their software.
Apple has a working implementation of WebGL under iOS — it's enabled for iAds (which are vetted) and can be enabled across the system on jailbroken devices and/or in individual web views through undocumented API calls. Guesses for it not being on by default range from it being insufficiently secure for the main browser (ie, the Microsoft argument) to Apple not wanting to lose app store revenue (ie, the anti-Apple argument).
So, anyway, if killer WebGL apps come along then Apple needn't allow its OS to be left behind.
By putting the GPU behind the MMU it does technically reduce one of the video memory concerns — you could have a single graphic however many gigabytes in size, memory map the file and call that the texture. Attempts by the GPU to read sections not currently paged would simply raise the usual exception, which would be caught by the OS in the usual way and handled by the existing paging mechanisms. You no longer have to treat texture caching as a separate application-level task.
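A minimal sketch of that idea, using an ordinary memory-mapped file as a stand-in for the oversized texture (the `gpu_bind_texture` name is hypothetical — no real API is implied):

```python
import mmap
import os

# Stand-in for a texture file far larger than you'd want resident at once.
with open("demo_texture.raw", "wb") as f:
    f.write(b"RGBA" * 1024)

with open("demo_texture.raw", "rb") as f:
    # Map the file: nothing is paged in until it's first touched. With
    # the GPU behind the same MMU, a GPU read of an absent page would
    # raise the same fault and be satisfied by the same OS paging path.
    texels = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)

    # gpu_bind_texture(texels)  # hypothetical unified-memory call
    first_four = texels[:4]     # touching the page faults it in
    texels.close()

os.remove("demo_texture.raw")
print(first_four)  # b'RGBA'
```

The CPU-side mapping works today on any OS; the speculative part is only the final step of handing the same mapping to the GPU rather than copying into dedicated video memory.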
That said, as others have noted the main point of the design is that when you write a parallel for loop in your language of choice to perform some vector operation — especially if it involves no branching — the GPUs can be factored into the workload just as easily as any traditional CPU cores, but so as to perform the work much more efficiently. So writing programs that take advantage of all the available processing becomes a lot easier. Collapsing virtual memory to a single mechanism that your OS vendor has already supplied is just one example.
The oldest supported machines for OS X v10.8 are mid-2007 iMacs and the newest unsupported machine is a just-before-early-2009 Mac Mini. So the most harshly affected purchases were a shade more than three years old when the OS came out. Given that we're talking official support, not how well the thing runs, that's harsh when you consider that Windows 7 and 8 have the same official minimum requirements and Windows 7 came out just shortly after the newest of the unsupported Macs.
I guess the £25 cost-of-entry explains support and testing cuts at Apple's end but it's hard to call it fair treatment.
If we apply the standards usually used by commenters on tech blogs: it isn't worth lauding in any capacity because somebody else has already done it as part of university research and vaguely comparable products have preceded it to market (eg, the Vuzix).
Besides that, I'm pessimistic about it because as far as I can make out it's just a different way of using a mobile phone. Instead of pulling it out of your pocket to look at, it's already right there in front of your eye. So you gain pocket space and get to use your hands for something else but you lose most of the interactivity and the ability to share. How many times has one of your friends given you their phone for a few minutes, or at least waved it in front of you, to try a new game or application, or quickly to show you something on a website?
But the numbers from StatCounter show that Apple's market ISN'T diminishing, at least in terms of share.
It seems to need repeating several times a day but Apple's marketshare isn't in decline. Looking at the worldwide numbers as reported by StatCounter and going back three years to March 2010, the iOS market share has had a range of 19.41%–30.13%. It is presently 27.14%.
What's happened is that Android has killed more or less everything that isn't iOS. So in the same three years it has risen from 6.21% to 37.23%. But saying that success for Google must obviously mean failure for iOS is plainly false. (aside: the stats report 12.58% share for Nokia Series 40 so they're not exclusively about smart phones; if the 37.23% doesn't sound like what you thought then that's probably why)
Investor confidence is an issue but loss of marketshare isn't.
3ish MHz was the norm but I guess you can claim some credit for the Sord based on its video chip.
The ZX80/81 famously use the processor for screen painting — if memory serves, to paint the display the machine runs through a series of NOP instructions, which gives a reliable deterministic rate for the Z80-generated refresh signal, and when the video circuits spot a NOP in ROM they make a note to use the next thing on the bus (the value the RAM kicks out on account of the refresh cycle) for video output. The RAM doesn't actually need a real refresh cycle because it's static. But the net effect is that the CPU is occupied for the entire pixel region, doing work that otherwise produces nothing.
The Spectrum has a ULA that can generate addressed and read cycles all of its own volition but shares the same memory (at least, the lower 16kb) between CPU and ULA so the CPU has to wait if accessing that area when the ULA needs it. It's also a fully bitmapped display so the CPU has to write every byte of a graphic for it to appear or move.
Conversely the TMS9929A has 16kb all of its own that operates entirely separately from the CPU's memory pool. You write to it via port IO and there are still some wait cycles involved but the whole setup is designed around the idea that most of the time you don't write much data. It's sprites and a tile map, so for text and most games you spend time uploading the block graphic set and then the drawing isn't much more than updating the map and possibly a few sprite registers, so you get almost all of that 3.6MHz free.
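That upload-once, update-the-map pattern can be sketched against a toy model of the chip's port interface (the class and the 0x1800 name-table base are illustrative only; register setup and timing are omitted):

```python
class ToyVDP:
    """Minimal model of TMS9929A-style port IO: the control port sets
    a VRAM address, then data-port writes auto-increment through the
    chip's private 16kb, never touching the CPU's memory pool."""

    def __init__(self):
        self.vram = bytearray(16 * 1024)
        self.addr = 0
        self.latch = None

    def control(self, byte):
        if self.latch is None:
            self.latch = byte  # first write: low address byte
        else:
            # second write: high address bits (top bits select the
            # operation; masked off here for simplicity)
            self.addr = ((byte & 0x3F) << 8) | self.latch
            self.latch = None

    def data(self, byte):
        self.vram[self.addr] = byte
        self.addr = (self.addr + 1) & 0x3FFF  # auto-increment

def vram_write(vdp, addr, data):
    vdp.control(addr & 0xFF)
    vdp.control(((addr >> 8) & 0x3F) | 0x40)  # bit 6: set up for writes
    for b in data:
        vdp.data(b)

vdp = ToyVDP()
NAME_TABLE = 0x1800  # illustrative name-table base

# 'Drawing' a row of text is just 32 name-table bytes, not pixels;
# the patterns themselves were uploaded once, earlier.
vram_write(vdp, NAME_TABLE, b"HELLO WORLD".ljust(32))

print(vdp.vram[NAME_TABLE:NAME_TABLE + 11].decode())  # HELLO WORLD
```

The point of the sketch is the volume of traffic: redrawing a whole text row costs 32 data-port writes, where a bitmapped display like the Spectrum's would need the CPU to rewrite every byte of every character cell.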
Games are still likely to work better on the Spectrum though as the TMS9929A completely overlooks scrolling. You can do the block scroll alluded to in the article by rewriting the entire map but that's almost the end of it as you don't have time to rewrite every pixel. The MSX 1 and the ColecoVision have the same chip and the same problem.
Sorry, share in which market is meant to be bleeding?
Steve Jobs died in October 2011. According to StatCounter iOS had 23.48% worldwide market share then. It's now April 2013. According to StatCounter iOS has 26.65% worldwide market share. Market share is up since Jobs died.
Maybe if we limit the numbers to Europe? Then we're talking 38.86% now versus 42.29% then but that's hardly a bleed. North America goes the other way with 41.03% then turning into 51.79% now.
Okay, what about the Mac? Worldwide share was 7.18% in October 2011 and is 7.04% now; in Europe that's a transition from 6.79% to 7.53%, in North America it's from 13.91% to 11.6%. So the continents are reversed in their trends versus iOS but in neither case is the change particularly massive.
Check out http://gs.statcounter.com/ to try any combination you like — the objective reality is that Apple's marketshare hasn't bled since Jobs died.
The statement is defamation if it's communicated to a third party by the person making it (rather than merely a private insult), and would cause a person's standing in society to be seriously affected, or would cause the individual to be shunned or avoided.
So the legal protection is on reputation, not on feelings.
Not only are some of Apple's patents much more obvious, some have been ruled so and are now similarly invalid. E.g. last month the US Patent & Trademark Office invalidated Apple's rubber banding patents — there was a preliminary ruling last October that El Reg reported at http://www.theregister.co.uk/2012/10/23/uspto_apple_patent/ and a final one that many other sites reported at the beginning of this month (though that's final only in the sense of 'we've finalised our initial ruling on the problem, bring on the appeals').
January's ITC ruling in favour of Motorola and against Apple re: '430, '828 and '607 would suggest otherwise.
While I hope they achieve their target the good way, I think the sceptics are probably right on x86 in tablets and phones — there's too much NDK stuff out there for many Android-using manufacturers to make the switch and a lot of them, along with Apple, are now used to being able to license embeddable components and design their own silicon. It's not that the architecture is always going to be behind in speed or power, just that the ship has already sailed.
Apple don't say that if you discontinue Siri then they will delete all your data, only that they'll delete the non-anonymised stuff. The article even has the relevant bit in bold:
If you turn off Siri, Apple will delete your user data, as well as your recent voice input data. Older voice input data that has been disassociated from you may be retained for a period of time to generally improve Siri and other Apple products and services.
As to your related point about whether voice data can really be anonymised, they could technically just be keeping the first-level stuff about pitches and rhythms that they extracted from the sound recording, which would identify you only in the same sense that written text with no associated author could identify you, but probably they just mean 'we won't store further user details with it'.
I guess it's just about conceivable, given that Android and Chrome OS are distinct, that there would have been a third branded OS, since Goggles are neither primarily interested in displaying HTML content nor, as I understand it, intended to run full, largely discrete, self-contained applications.
Can you provide any references? Apple doesn't expose an API for dialling numbers and, at least for me, Google can find no hits concerning any iOS application that has ever maliciously dialled a premium rate number (eg, via a backdoor or other unofficial API route).
Taking music from an iPad to a computer is officially disallowed because Apple has a contractual relationship with the music industry that it wants to preserve.
You weren't expecting a technical reason were you?
Presumably Microsoft's thinking is that now they can go to all the people that don't make the current or future Xbox and say 'Foxconn are paying and they've got enough money for really good lawyers; why don't you just pay up and save us all some hassle?'
Even supposing Microsoft ends up paying its own royalties via Foxconn, that's a good piece of leverage over everybody else to have acquired for free.
Default on Android or not, StatCounter's statistics separately show iOS (iPod + iPhone) and Android both to be about equal, around the 30% mark — Android is slightly ahead (this is worldwide, after all, not just US) but if Chrome's showing in the overall chart were significantly attributable to Android then you'd expect Safari to be doing very well too.
Since Safari isn't doing anywhere near as well in the StatCounter chart, I don't think the mobile angle is influential.
The PC market as a whole is contracting a lot more quickly than Apple's computer sales — per that recent IDC report, worldwide computer shipments are down 14%, Apple's are down only 7.5%. Obviously you should frame that with the fact that Lenovo has managed to buck the trend entirely with 0% year-on-year difference but it seems to me that you could argue both that Apple is failing (sales down) or that it is succeeding (it's significantly outperforming the market average).
Here's what else all the rogue nations have in common: they know that man didn't really land on the moon, that Jackie shot JFK, that the Titanic was an insurance scam, that Prince Philip crashed Diana's car, that September the 11th was designed to give the US more oil, that Tesla shot Archduke Ferdinand, that Hitler invented the CAT scanner, that Castro really had over seventeen legs and that Roswell caused the middle ages.
Iran wishes it were big?
I don't think courts have ever liked this sort of mucking about; look at the Woolf reforms here in the UK for evidence that, when acting as a body, legal professionals like the rules to be clear, costs to be proportional to the dispute, reasonable parties to be rewarded and, ideally, people not to have to go to court in the first place — whether due to alternative dispute resolution or as a result of summary judgment.
The problem is partly that even in countries modelled on the English system court-made law is long out of fashion in deference to elected legislators (as they've more of a mandate), but probably more that we've waded into uncharted waters. These sorts of action are broadly unprecedented and the law has yet to figure out how to deal with these things equitably.
That's called confirmation bias. There's a risk that the programming on a network that shares a brand with a deeply conservative news channel might become less easy to obtain. Of those comments that are politically motivated, which side of the spectrum are they likely to come from?
You might also benefit from looking up the contrapositive.
Oh, I live in San Francisco right now so I'm fully aware of the California freeways — my favourite is when there's a central lane that divides without any discontinuity so that if you head to the left you're on one freeway and if you head to the right then you're on another. Naturally that divide often happens just after the crest of a hill, and quite a lot of the time it isn't even a real hill but just an artefact of the way that elevated portion of road was built. I can't think of anywhere in the UK where the same lane just suddenly divides like that, without one road explicitly being a junction off the other.
And that's just the junction-by-junction stuff, the overall layout is a nightmare in itself. While driving to Pacifica this weekend I naturally had to go 101 south, 380 west, 280 north then 1 south, making four freeways for a 25-ish minute journey to a reasonably popular destination.
I guess that's an imperfect solution if your phone doesn't do turn-by-turn directions (as then it also won't correct itself if you miss a turning), and it's similarly possible you'll be caught using the phone for GPS if you've no way of routing the audio through your car speakers or if you just can't hear the instructions because it's California so obviously you've got a convertible.
Since the judge explicitly cited using your hands on the phone as the distraction, I guess mounting it within eyesight would be acceptable.
I think that not only is Sony a minority player in the Android market but that your son has discovered why. The 'problem' with Android is that each manufacturer can add whatever layers of code they like — see also non-removable Facebook apps, carrier-specific app stores and the rest. So you can easily acquire an Android phone with a horrible user interface and a habit of crashing just by buying the wrong model from a lesser manufacturer.
Of course this is only a problem if you don't do some research and is a natural side effect of freedom and choice. Given that accepting the carrier-specific applications usually gets you more of a subsidy you might even do your research and then decide that going with the more afflicted handset is a better fit for your priorities.
The entire market continues to skyrocket; given that the real story implied by the numbers seems to be "no monumental changes", you'd expect Android's numbers to go up accordingly.
But then surely the interesting thing is the year-on-year trend? We're very close to six months into the iPhone 5 now and were almost six months into the 4S at the same time a year ago, as both were October launches. The S4 is coming only about 11 months after the S3 but the S2 was twelve months before that, so Samsung is essentially on an annual cycle too.
For my money the iPhone trend is real but essentially insignificant because the change is small and appears to be limited to one market.
That's a common misconception — the more proper version would be that Apple will always charge a premium where the market supports it. For example, there was never more than about £15 difference between a ~£200 iPod and the relevant Creative competitor and the Apple TV is exactly the same price as the equivalently specified Roku.
That said, the lesson of the 2007 iPhone launch is that they do sometimes grossly misjudge the price the market is willing to pay so it's definitely possible they'll launch very high before having to about face.
That'd be only if the cost didn't matter _and_ the prestige of having a fancy building didn't matter. What's going on here is that the latter has been judged hugely to outweigh the former.
Indeed the opposite is true: the more money Apple disburse to building contractors and suppliers that would otherwise have stayed inert in a bank somewhere, the more money ends up in the tax pot. So given Apple's cash reserves I also think they should build as complicated a building as possible.
I.e. those on http://www.modern.ie/en-US/virtualization-tools, which seem to include an option to grab 'IE10 - Win8'. Is the bundled Windows 8 in there limited in some way that the one on the USB stick wasn't?
A search for 'genius salesman' (no quotes) returns 'Apple - Jobs at Apple - Retail (us)' as the first hit. So maybe they are looking to get a more direct Steve replacement after all?
With about one sixth as many users as 'does it really qualify for the list?' Safari, Opera is not a major browser (sources: StatCounter, W3Counter, NetApplications). Which is a shame.
Some technical autopsies wouldn't go unappreciated either.