Re: Don't forget the C64 fan version
Pfff... in the Sam Coupe world they managed to get their fan version completed and released commercially back in 1992. By pausing the Amiga version frame-by-frame and hand copying the graphics, if memory serves.
But it's currently selling for a third below its usual price on Amazon US and only 12% below its usual price on Amazon UK.
The US price is ordinarily $75, which is £47.98 at today's exchange rate. The normal UK price is £47.99. Once you add some tax, it ordinarily costs more in the US.
The competition is embeddable cores. Samsung, Apple, et al pick the ARM core, the GPU and everything else, lay out the silicon (or let a computer do it) and hit print. It's very mix and match. As a result, competition is healthier than it has been in years. ARM is likely to persevere both on momentum and because you don't have to go begging cap in hand any time you want a custom fabrication.
I'm no particular fan of Forth for practical use but I can't recommend the Thinking Forth book strongly enough — it transcends the language it was written for, teaching fantastic lessons about structure and development cycles.
My feeling is that doing so would substantially decrease the crime rate?
It was $1bn in cash and Facebook stock. So you can probably cut more than a third off a large chunk of that.
I heard they kicked off the flat icon craze at Xerox PARC by having only two colours.
Umm, except that iPhone sales are up 40% year-on-year. It has decreasing marketshare in a rapidly growing market, not really allowing us to conclude anything much about the individuals.
At least in iOS 1–6, you just long-press the letter and a popup shows the options. Slide your finger to the one you want and release. The options for a long press on 'a' are aàáâäæãåā. The copyright symbol seems to be available only on the emoji keyboard as far as I can see, @ is accessible on the symbols page, and I failed to find a superscript but that doesn't mean it isn't in there. I'd have been unlikely to guess that § is paired with &, for example.
The Metro interface was so named because it was modelled after the design language of mass public transit systems, specifically including the King County Metro that serves Redmond and Seattle amongst others. So it's a little disingenuous to give credit to any of these computer company upstarts.
I trust a site called AllAboutSymbian to be as objective about iOS vs Symbian as, say, AppleInsider.
Creative Labs (hierarchical menus versus the iPod), then Nokia (swiping on a touch screen versus the iPhone), then Apple, Apple, Apple, Apple.
The iPad Mini has the same 4:3 aspect ratio as the big iPad (and similar to that of most sizes of paper). Though that does still differ from the 16:9 of the latest iPhone and the 3:2 of the previous ones.
I think he just meant that the average amount of time an iPhone user spends on his or her device is 50% greater than the corresponding figure for an Android owner.
*cough* the final version of iOS available for the iPad 1 is 5, not 6.
The Sinclair ZX81 and Spectrum were amongst those using ULAs. They were a bit like write-once FPGAs: you bought them off the shelf, a custom manufacturing step imprinted your logic and then they were delivered. So they were custom in all reasonable senses. Acorn used the same method to shrink the BBC into the Electron.
That's because Google fixed the maps. There's actually quite a lot of precedent for OSs shipping with applications that many feel are better implemented by third parties: TextEdit, MS Paint, Internet Explorer, Safari, etc, etc.
But surely that still allows it to be too big?
You cannot fit an SLR into a smartphone. So you're going to have to accept lesser image quality. In that case why is something bulbous preferable? A lot of people would argue that it just falls between two stools — it's not compact enough to fit well with a mobile phone and it's not large enough to do all that much better than what does fit in other mobile phones. So it's too large for anything people want in a phone.
It's the same sort of logic that allows people to conclude that e.g. a 6" screen is too large for a mobile phone.
Samsung are probably most explicit with their adverts full of 'haha, these people are sheep' (ie, thoughtlessly following the prevailing fashion) or 'of course they use an iPhone, they're old' (as old is the antithesis of fashionable per advertisers), but that's only because they explicitly want to put the negative association onto the competitor. Apple's adverts are just as much about fashion — the iPhone is sleek and modern — but less explicitly because it doesn't work if you just stand up and plainly claim to be fashionable.
The Kantar figures you're citing are for the USA only — see http://www.kantarworldpanel.com/global/News/news-articles/While-Android-Leads-iOS-and-Windows-Are-Growing-At-A-Faster-Pace — and show Android ticking up a little year-on-year, iOS ticking down a little and Windows Phone eating BlackBerry's share.
Gartner run the figures worldwide and, taking smartphones in isolation, report a 69.7% share for Android and a 20.9% share for Apple. If you include all phones then Apple's share drops to 7.5%, so I'd guess the Android number falls roughly proportionally, to something like 25% (source: http://www.gartner.com/newsroom/id/2335616 ).
Possibly also of interest: Apple's absolute sales figures are up by almost 50% year-on-year, an increase of about 40m, while Samsung's are up more than 20%, but that's about 70m. Nokia managed to lose a bit more than 20% of its custom during 2012, RIM about 33%.
So duopoly is questionable but it's obvious that both iOS and Android are relevant and neither is going anywhere any time soon. Can you confidently say the same about the other platforms?
To me Microsoft's current efforts feel rather like Apple suddenly deciding to license the Mac operating system after Windows 95 had come out.
Remember how one of the reasons to abandon Altavista/etc and start using Google was the minimal page design that helped to keep the information you're actually interested in more prominent? Google doesn't.
The EDL is considered, rightly or wrongly, to be far-right because of its deeds rather than because of its words. On several occasions, the EDL has planned a protest only for it to descend into hooligans who self-associate with the EDL smashing up private property. People like those caught on video here: http://www.liveleak.com/view?i=794_1369521652
The public perception has therefore become that EDL marches are likely to descend into hooligans destroying private property. People also generally associate hooliganism with right-wing sentiments, probably because of the serious problems football has had with both hooliganism and with racism, tying the two things together in the public imagination.
Even taking your argument as accurate, and starting from the premise that the connection is a misperception, EDL supporters like yourself further cement the conclusion by blaming it on "the liberal left wing". Where is a group that despises the liberal left wing most likely to be on the political spectrum?
To a much lesser extent there's the 'English' in 'English Defence League'. Where do groupings that explicitly reference England, Britain or the United Kingdom in their name tend to fall on the political spectrum?
Those are the reasons that the EDL is perceived, and continues to be perceived, as right wing. If, as an EDL supporter, you want to shed the label, those are the problems you want to address.
Here's a test: I strongly dislike the EDL. From that statement in isolation, where would you assume my political beliefs lie?
Yeah, that's how it works. If I see a Conservative Party political broadcast that tells me they'll fix the economy but then don't vote Conservative then that must mean I want to break the economy?
The problem with guaranteeing free speech is that you can't hold it up as the be all and end all while simultaneously saying that the EDL have the right to say anything they want but Anonymous don't have the right to say that person X is a member of the EDL.
At some point you have to balance the rights of one group against the rights of another. In this case I think the defamation angle is the right one to follow. If Anonymous has misidentified anybody then those people will likely be subject to a heavy adverse reaction. It's the rights of those individuals that should properly restrict the right to free speech.
So, yes, I'm against what Anonymous has done. I'm also against the EDL but that's neither here nor there. But I disagree with what's happened not because I think free speech is an absolute right but rather because I think that limitations are justified in specific limited cases.
I'm not so sure about gamers — it could go either way. Without a middle tier of people that don't care if they play at 20fps with low-quality textures to pump up the sales, there's just not all that large of a market. So then what are people going to buy the high-end computers for? It'll become an ever shrinking niche.
Another way of phrasing it: 90% of games are going to be developed for tablets and similar devices. Those games are not going to scale well. So why bother spending the money to try?
If this is really the way the conversation is going, it's pretty easy to rattle off the systems that were technologically superior in many respects to Windows 3.0 in 1990. Off the top of my head: the Amiga Workbench, RISC OS (both as already mentioned), NextStep, OS/2, NeWS, X + e.g. OpenWindows.
Of those, NextStep, OS/2 and NeWS are probably the ones worth singling out for special praise. All three are preemptive, use a protected memory scheme and provide the sort of user-land libraries that we now usually consider to be part of an OS.
The iPhone isn't dying now, but the iPod wasn't dying in 2007. Modern Apple likes to get each new cash cow up to speed before the previous goes into decline.
That said, if a watch is all they can come up with then the future's probably not bright.
The undocumented opcodes on the Z80 (and 6502, and others) were merely relics of the decoding process; they weren't intended to be hidden. The reason they became well known was that one or two of them were found to do useful things on computers where a single supplier had provided the same model of CPU for the entire production run.
For example, there's 'shift left and insert a 1 in the least significant bit' on the Z80 that can be used to make a faster scroll in some cases and to help with certain methods of sprite compositing. It's a relic of shift right arithmetic and fills a pretty obvious numerical hole in the instruction map. So if you know that every 48k Spectrum uses a Z80 with that instruction then why not use it?
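For concreteness, here's a minimal C sketch of what that undocumented instruction (usually called SLL or SL1) does, going by its commonly reported behaviour; the function name and the way the carry is modelled are my own illustration:

#include <stdint.h>
#include <stdio.h>

/* The Z80's undocumented 'shift left, set bit 0': the operand shifts
   left one place, the old bit 7 falls into the carry flag and a 1 is
   inserted at the bottom (contrast documented SLA, which inserts a 0). */
static uint8_t sll(uint8_t value, int *carry) {
    *carry = (value >> 7) & 1;            /* old bit 7 becomes carry */
    return (uint8_t)((value << 1) | 1);   /* shift left, force bit 0 to 1 */
}

int main(void) {
    int carry;
    uint8_t r = sll(0x81, &carry);        /* 1000 0001 -> 0000 0011, carry set */
    printf("result=%02X carry=%d\n", r, carry);
    return 0;
}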
So this is unlike the classic situation because the motivation is different — the new operations are hidden on purpose.
"Haha, all our money comes from markets you don't compete in — but we hear Samsung are getting rich off phones"?
I quite like the ribbon and don't fully understand the dislike for it. Especially once you've set it to hide automatically, it's a pull-down menu that has icons as well as words. Then, once you can rely on people's ability to discern pictures more quickly than words, you can go back to the old-school approach of putting things in the drop-downs rather than in toolbars. One of the reasons for doing that originally was that screens used to be smaller; now that laptops are the predominant form of computer, screens are smaller again.
I guess the counter-argument is that the icons don't add anything to the words, or the words don't add anything to the icons, so one or the other just acts as visual noise, spreading everything out so as to make navigation more laborious? I can't say I've faced that problem but I'm hardly a power user: in Word I use little beyond style sheets, and I'm sufficiently fussy that I don't expect to get them set up in a way that satisfies me very quickly.
Sorry to be the carrier of bad news, but she isn't. Her LinkedIn profile says she left in February 2013 — no doubt in that big round of layoffs they did.
If you give me $150,000 then I could give you some of my time trying to figure out how to recreate old arcade machines.
Straw man begets straw man:
Yes, silly us in the rest of the world. We forgot that the death penalty and free access to guns are inalienable requirements for democracy. That's why there aren't any democracies in Europe — we're all just oppressed socialists because we have things like universal healthcare.
It probably doesn't say anything, but the American Constitution recognises rights rather than granting them, so silence really just means it doesn't take an explicit position. Probably the more persuasive argument is that it'd be a bit ridiculous if the right to bear arms were recognised but not the right to make them usable.
I think they just mean that, in the style of Citrix, OnLive, VNC or a host of others, you could use their streaming to stream a moving picture of a computer program rather than moving pictures of actors. That program would probably be hosted as a virtual machine on the originating server. Which gives them a neat extra buzzword to attach vaguely to their software.
Apple has a working implementation of WebGL under iOS — it's enabled for iAds (which are vetted) and can be enabled across the system on jailbroken devices and/or in individual web views through undocumented API calls. Guesses for it not being on by default range from it being insufficiently secure for the main browser (ie, the Microsoft argument) to Apple not wanting to lose app store revenue (ie, the anti-Apple argument).
So, anyway, if killer WebGL apps come along then Apple needn't allow its OS to be left behind.
By putting the GPU behind the MMU it does technically reduce one of the video memory concerns — you could have a single graphic however many gigabytes in size, memory map the file and call that the texture. Attempts by the GPU to read sections not currently paged would simply raise the usual exception, which would be caught by the OS in the usual way and handled by the existing paging mechanisms. You no longer have to treat texture caching as a separate application-level task.
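As a minimal sketch of the CPU-side half of that idea using POSIX mmap (the file name is hypothetical, and on current hardware only CPU accesses are serviced this way; the point is that with the GPU behind the same MMU, a sampler read could fault pages in identically):

#include <fcntl.h>
#include <stdint.h>
#include <stdio.h>
#include <sys/mman.h>
#include <sys/stat.h>
#include <unistd.h>

int main(void) {
    /* A hypothetical multi-gigabyte texture sitting in a file. */
    int fd = open("huge_texture.raw", O_RDONLY);
    if (fd < 0) { perror("open"); return 1; }

    struct stat st;
    if (fstat(fd, &st) < 0) { perror("fstat"); return 1; }

    /* Map the whole file; nothing is read from disk yet. */
    const uint8_t *texels = mmap(NULL, st.st_size, PROT_READ, MAP_PRIVATE, fd, 0);
    if (texels == MAP_FAILED) { perror("mmap"); return 1; }

    /* Touching a texel raises a page fault that the OS services through
       its ordinary paging mechanism; a unified MMU would let a GPU read
       be handled the same way, with no application-level texture cache. */
    printf("first texel: %u\n", texels[0]);

    munmap((void *)texels, (size_t)st.st_size);
    close(fd);
    return 0;
}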
That said, as others have noted the main point of the design is that when you write a parallel for loop in your language of choice to perform some vector operation — especially if it involves no branching — the GPUs can be factored into the workload just as easily as any traditional CPU cores, but so as to perform the work much more efficiently. So writing programs that take advantage of all the available processing becomes a lot easier. Collapsing virtual memory to a single mechanism that your OS vendor has already supplied is just one example.
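For illustration, the sort of branch-free, data-parallel loop being described might look like this, with a plain OpenMP pragma standing in for whatever heterogeneous runtime would actually do the dispatch (a sketch, not any particular vendor's API):

#include <stddef.h>

/* A branch-free, data-parallel loop: every iteration is independent, so
   on a shared-MMU design a runtime could hand iterations to GPU cores
   just as easily as to CPU cores. Compile with -fopenmp or similar. */
void saxpy(size_t n, float a, const float *x, float *y) {
    #pragma omp parallel for
    for (size_t i = 0; i < n; i++)
        y[i] = a * x[i] + y[i];   /* pure vector work, no branching */
}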
The oldest supported machines for OS X v10.8 are mid-2007 iMacs and the newest unsupported machine is a just-before-early-2009 Mac Mini. So the most harshly affected purchases were a shade more than three years old when the OS came out. Given that we're talking official support, not how well the thing runs, that's harsh when you consider that Windows 7 and 8 have the same official minimum requirements and Windows 7 came out just shortly after the newest of the unsupported Macs.
I guess the £25 cost-of-entry explains support and testing cuts at Apple's end but it's hard to call it fair treatment.
If we apply the standards usually used by commenters on tech blogs: it isn't worth lauding in any capacity because somebody else has already done it as part of university research and vaguely comparable products have preceded it to market (eg, the Vuzix).
Besides that, I'm pessimistic about it because as far as I can make out it's just a different way of using a mobile phone. Instead of pulling it out of your pocket to look at, it's already right there in front of your eye. So you gain pocket space and get to use your hands for something else but you lose most of the interactivity and the ability to share. How many times has one of your friends given you their phone for a few minutes, or at least waved it in front of you, to try a new game or application, or quickly to show you something on a website?
But the numbers from StatCounter show that Apple's market ISN'T diminishing, at least in terms of share.
It seems to need repeating several times a day but Apple's marketshare isn't in decline. Looking at the worldwide numbers as reported by StatCounter and going back three years to March 2010, the iOS market share has had a range of 19.41%–30.13%. It is presently 27.14%.
What's happened is that Android has killed more or less everything that isn't iOS. So in the same three years it has risen from 6.21% to 37.23%. But saying that success for Google must obviously mean failure for iOS is plainly false. (Aside: the stats report a 12.58% share for Nokia Series 40, so they're not exclusively about smartphones; if the 37.23% doesn't sound like what you thought then that's probably why.)
Investor confidence is an issue but loss of marketshare isn't.
3ish MHz was the norm but I guess you can claim some credit for the Sord based on its video chip.
The ZX80/81 famously use the processor for screen painting. If memory serves, to paint the display the CPU runs through a series of NOP instructions, which gives a reliable, deterministic rate for the Z80-generated refresh signal; when the video circuits spot a NOP in ROM they make a note to use the next thing on the bus, which is the value the RAM kicks out on account of the refresh cycle, for video output. The RAM doesn't actually need a real refresh cycle because it's static. But the net effect is that the CPU is occupied for the entire pixel region, doing work that otherwise produces nothing.
The Spectrum has a ULA that can generate address and read cycles of its own volition, but it shares the same memory (at least, the lower 16KB) between CPU and ULA, so the CPU has to wait if it accesses that area when the ULA needs it. It's also a fully bitmapped display, so the CPU has to write every byte of a graphic for it to appear or move.
Conversely the TMS9929A has 16KB all of its own that operates entirely separately from the CPU's memory pool. You write to it via port I/O and there are still some wait cycles involved, but the whole setup is designed around the idea that most of the time you don't write much data. It's sprites and a tile map, so for text and most games you spend time uploading the block graphic set up front; after that the drawing isn't much more than updating the map and possibly a few sprite registers, leaving almost all of that 3.6MHz free.
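As a rough illustration of that style of drawing, here's a C sketch of a tile-map update; the MSX 1 port numbers (0x98 data, 0x99 control) are assumed and out_port() is a hypothetical stand-in for whatever port-write primitive your toolchain provides:

#include <stdint.h>

/* out_port() is hypothetical; substitute your compiler's port-write
   intrinsic or a couple of lines of inline assembly. */
extern void out_port(uint8_t port, uint8_t value);

/* Point the VDP at a VRAM address for writing: low byte first, then the
   top address bits with the write flag (0x40) set. */
static void vdp_set_write_address(uint16_t addr) {
    out_port(0x99, addr & 0xFF);
    out_port(0x99, ((addr >> 8) & 0x3F) | 0x40);
}

/* The VDP auto-increments its address after each data-port write, so
   updating a run of tile-map entries is just a burst of writes. */
static void vdp_write_block(uint16_t addr, const uint8_t *src, uint16_t len) {
    vdp_set_write_address(addr);
    while (len--)
        out_port(0x98, *src++);
}

A real program would also respect the VDP's minimum spacing between accesses; the sketch leaves that out.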
Games are still likely to work better on the Spectrum though as the TMS9929A completely overlooks scrolling. You can do the block scroll alluded to in the article by rewriting the entire map but that's almost the end of it as you don't have time to rewrite every pixel. The MSX 1 and the ColecoVision have the same chip and the same problem.
Sorry, share in which market is meant to be bleeding?
Steve Jobs died in October 2011. According to StatCounter iOS had 23.48% worldwide market share then. It's now April 2013. According to StatCounter iOS has 26.65% worldwide market share. Market share is up since Jobs died.
Maybe if we limit the numbers to Europe? Then we're talking 38.86% now versus 42.29% then but that's hardly a bleed. North America goes the other way with 41.03% then turning into 51.79% now.
Okay, what about the Mac? Worldwide share was 7.18% in October 2011 and is 7.04% now; in Europe that's a transition from 6.79% to 7.53%, in North America it's from 13.91% to 11.6%. So the continents are reversed in their trends versus iOS but in neither case is the change particularly massive.
Check out http://gs.statcounter.com/ to try any combination you like — the objective reality is that Apple's marketshare hasn't bled since Jobs died.
The statement is defamation if it's communicated to a third party by the person making it (rather than merely a private insult), and would cause a person's standing in society to be seriously affected, or would cause the individual to be shunned or avoided.
So the legal protection is on reputation, not on feelings.
Not only are some of Apple's patents much more obvious, some have been ruled so and are now similarly invalid. E.g. last month the US Patent & Trademark Office invalidated Apple's rubber-banding patents: there was a preliminary ruling last October, which El Reg reported at http://www.theregister.co.uk/2012/10/23/uspto_apple_patent/ , and a final one that many other sites reported at the beginning of this month (though that's final only in the sense of 'we've finalised our initial ruling on the problem, bring on the appeals').
January's ITC ruling in favour of Motorola and against Apple re: '430, '828 and '607 would suggest otherwise.
While I hope they achieve their target the good way, I think the sceptics are probably right on x86 in tablets and phones — there's too much NDK stuff out there for many Android-using manufacturers to make the switch and a lot of them, along with Apple, are now used to being able to license embeddable components and design their own silicon. It's not that the architecture is always going to be behind in speed or power, just that the ship has already sailed.
Apple don't say that if you discontinue Siri then they will delete all your data, only that they'll delete the non-anonymised stuff. The article even has the relevant bit in bold:
If you turn off Siri, Apple will delete your user data, as well as your recent voice input data. Older voice input data that has been disassociated from you may be retained for a period of time to generally improve Siri and other Apple products and services.
As to your related point about whether voice data can really be anonymised, they could technically just be keeping the first-level stuff about pitches and rhythms that they extracted from the sound recording, which would identify you only in the same sense that written text with no associated author could identify you, but probably they just mean 'we won't store further user details with it'.
I guess it's just about conceivable, given that Android and Chrome OS are distinct, that there would have been a third branded OS, since Goggles are neither primarily interested in displaying HTML content nor, as I understand it, intended to run full, largely discrete, self-contained applications.