You think he was heading to the forums to post a comment blindly supporting [company X] regardless of the story, taking the opportunity to remind people who use products made by [company Y] that they're a sub-monkey laughing stock and objectively wrong?
2058 posts • joined 18 Jun 2009
It was Z, wasn't it? They spent much longer than usual on that only to be pipped to the post by the coincidentally very similar Command & Conquer, then found themselves surrounded by the new world of the Playstation and never quite refound their footing. At least that's how I've heard it told.
Re: @ RyokuMas
255 shots per second? I wouldn't put anything past them but if the Bitmap Brothers really were checking the input state more than five times per frame then hats off to them. I'll bet there are gamers that could tell the difference, too.
Re: I miss my Amiga...
The problem with Workbench was that the way its nano kernel was architected — a product of its time — was basically dependent on message passing between processes being fast. When you don't have protected memory, as on the Amiga, it is fast because there's no work to do. If you have protected memory then it's usually slow, as the kernel must either copy the message from one address space to another, must reassign ownership of regions of memory, or must leave the messaging memory unprotected, giving everyone access to it and still allowing one misbehaving application to mess up a bunch of others. Therefore Workbench as constituted had a definite expiry date.
(of course, I say this aware that e.g. the classic Mac OS was even worse with neither protected memory nor pre-emptive multitasking, yet managed to hobble on just about into the early 2000s before an overhaul was finally achieved)
As for PC vs Amiga? Yeah, it was already clear which way the wind was blowing once the PC became the machine with the super-fast CPU and the easy to address video memory. The 3dfx and its competitors just sealed the deal. The open market of commodity hardware from multiple vendors overwhelmingly based around a software platform eventually outdid the closed market of a single vendor overwhelmingly based around a hardware platform. I think it's telling that the only other computer platform to survive the '90s was also overwhelmingly a software platform, not a hardware platform (and, indeed, now largely just uses the same commodity hardware as everyone else).
I think the difference is that Samsung has a lot of space to transition sales from ordinary phones to smartphones, so its gains in smartphone numbers are offset by the decline of the non-smart market.
If you compared methods of playing MP3s then Apple's gains wouldn't look so good as the ongoing decline in iPod sales would have a similar effect.
Re: Don't care.
Yeah, Google's aggressive data collection and hoarding, increasingly closed software and alleged anticompetitive practices (cf: http://www.theverge.com/2013/6/13/4427706/eu-committee-probe-google-over-android-anticompetitive ) is much better than Apple's not-invented-here mentality, explicitly closed software and alleged anticompetitive practices. Also when Google doesn't want something you submit on its store, it permanently suspends the thing — which is obviously a lot more 'open' than when you submit something to Apple's store which it doesn't want, as Apple will decline to approve it.
Apple's position as whipping boy is not without justification but the degree to which some people separate it from other players is absurd.
The power of not offering options?
Suppose I'm an average consumer and I want a smartphone. I probably decide to purchase an Android because it ticks all my mental boxes: touchscreen, web browser, apps. Having decided to do that, I see the S4 or the HTC One is the king of the market but I also see the Moto G and the Nexus 4 offering a lot more value and I probably have the choice of something else free on my contract. It's quite likely that I don't buy the most expensive option.
Supposing I decide I want an iPhone, because I am already in the Apple ecosystem from iPod times, because I was an early smartphone adopter and don't fancy learning something new, or for whatever other reason, the first thing I can consider is quite an expensive handset. So Apple has protected its profit from me by not offering a mass-market option.
Just regression towards the mean?
With no evidence other than personal bias to suggest a causative link, surely the correct assumption is the standard: success is part skill, part luck and hence extreme success implies extreme luck. Since luck is random, over time extreme luck will always decline. So even those who are still doing everything under their own control perfectly will see declining prospects.
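The skill-plus-luck model is easy to check with a toy simulation (illustrative only: the normal distributions, the equal weighting and the top-1% cutoff are all arbitrary assumptions of mine):

```python
import random

def regression_demo(n=100_000, seed=42):
    rng = random.Random(seed)
    # Each firm has a fixed skill; luck is drawn fresh every period.
    skills = [rng.gauss(0, 1) for _ in range(n)]
    period1 = [s + rng.gauss(0, 1) for s in skills]
    # Pick the top 1% performers of period one...
    top = sorted(range(n), key=period1.__getitem__, reverse=True)[: n // 100]
    # ...then re-measure them: same skill, new luck.
    period2 = [skills[i] + rng.gauss(0, 1) for i in top]
    mean1 = sum(period1[i] for i in top) / len(top)
    mean2 = sum(period2) / len(top)
    return mean1, mean2
```

The second-period mean stays well above average — the skill was real — but lands far below the first, because the extreme luck doesn't repeat. Nobody had to get worse for that to happen.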
(though, obviously, I'd love it if the story were Windows 8 + migration away from this category of computing device while those that were buying the expensive ones continue to have the money to buy one of everything)
That doesn't sound likely to me. Here's what most people think about phones: they're much of a muchness and the best one to get is whichever is the most nominally expensive that they'll give you for free on your contract.
The iPad similarly doesn't have Calculator. I wouldn't mind if it didn't make me so painfully aware that I'm out of practice with mental arithmetic.
Re: Hmm @AC
Based on the number of people that seem capable of operating phones based on the Linux kernel, I'd say the tailored user interface obviates any concerns about the ability of the man on the street to use the bash/csh/X11/etc/whatever stack usually associated with 'Linux'.
That being said, I suspect Windows was chosen because it comes with good security support most of the time, with the cessation of support having been hand-waved away. The Linux guys are very good at updating the kernel but then who's responsible for pushing that to the machines? And the XP support period has beaten any of the commercial Linuxes by quite a stretch.
Re: "If you want security-upgrade details, you'll need to wait."
So we're accusing Apple of breaking third-party Lightning cables without any evidence that they have, comparing them to Microsoft despite that not really being something Microsoft would do, then recommending Amazon cables as still working even though we still have no evidence that others have stopped working or, therefore, that Amazon cables still do?
Re: Mr Postman
El gato no come?
Re: "that’s a serious amount of money"
I don't care enough to compare the specs directly but your Samsung is a different proposition because it's an internal drive with no Thunderbolt port.
However, the two other Thunderbolt SSDs available on Amazon are:
• a 256GB Lacie for £254 (which, like the reviewed product, also has a USB3 port); and
• a 256GB MiniPro for £238 (with no additional USB3).
So the Lacie is more than 40% cheaper but appears to be from a year and a half ago so is likely slower. However the MiniPro is not yet a year old and boasts of being "capable" of read speeds "in excess of 500MB/sec" and therefore would be faster if the marketing puff were reliable. So it does sound like a better deal.
Kicking it old school here, I find my AMOLED Nexus One (yes, definitely the AMOLED version: that PenTile matrix is clear) harder to read in direct sunlight than whatever my iPhone 5s has. But maybe it's just not automatically adjusting brightness or something? Better not rule out user error.
Re: sounds like @big_D
Ford at least were rumoured to be sticking with intelligence in the car but via QNX rather than the Microsoft product. I can understand the push towards devolving processing to the phone — apart from making it replaceable I think it also eases several regulatory hurdles since there's one testing standard for normal consumer electronics and another for electronics in cars — but betting on one or two specific handsets now when the car will probably still be in use by somebody in 15 years is silly.
Standards have always been a useful tool for companies to use in certain situations. Look at web standards: useful for Microsoft to ignore so it could gain a monopoly, then useful for Mozilla et al to push so that the competition would all be aiming for the same target, now useful for everyone because nobody has overall control. This is much better for the market but every step along the line has been motivated by individual interests.
In this case the combination of standards you'd need to implement would be significantly more onerous than just supporting (pretty much) a single device and I guess Mercedes et al are more interested in beating each other to market. They'd probably also argue that most of the other similar in-car systems can't connect to your mobile at all for things like GPS, music with the menus up on the dash, contacts, calendar, etc, so it's all a bonus, right?
I hope standards will be forthcoming.
I guess it gets you rich text with an established scripting language and platform-neutral API which doesn't enforce many assumptions about specific typography?
Though I'm not very enthused if the first laundry list item is a "file system browser". My OS already has one of those. Yours does too. They're probably different. Why would either of us want to learn a third?
Just use whatever's already in the OS, please. I don't care how boring it is.
Re: This wasn't an SSL weakness as such (@AC "Looks more like a merge issue than anything else")
The linked article states:
"A test case could have caught this, but it's difficult because it's so deep into the handshake. One needs to write a completely separate TLS stack, with lots of options for sending invalid handshakes."
So rather than the absence of proper unit testing I'd say it was the absence of exceptional unit testing, but it plays into the usual narrative around Apple's attitude towards security versus the more commerce-oriented firms.
But these same issues must be concerns for Firefox et al. How many variations of ostensibly valid but actually invalid SSL certificate can there be, and has nobody set up test servers that automatically vend those? Writing a unit test that connects to each of those, with and without a data fuzzer, doesn't sound too hard.
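To put a shape on the sort of test I mean: a minimal Python sketch, assuming a fleet of deliberately broken test servers exists somewhere (the hostnames below are hypothetical placeholders, not real services):

```python
import socket
import ssl

def cert_is_trusted(host: str, port: int = 443, timeout: float = 5.0) -> bool:
    """True if the TLS handshake verifies the server's certificate chain and
    hostname against the system trust store; False if it is rejected."""
    ctx = ssl.create_default_context()  # CERT_REQUIRED plus hostname checking
    try:
        with socket.create_connection((host, port), timeout=timeout) as sock:
            with ctx.wrap_socket(sock, server_hostname=host):
                return True
    except ssl.SSLError:
        return False

# Hypothetical test servers, each vending one flavour of bad certificate:
BAD_HOSTS = ["expired.example.test", "self-signed.example.test",
             "wrong-host.example.test"]
# A browser-grade client should refuse every single one:
#   assert not any(cert_is_trusted(h) for h in BAD_HOSTS)
```

The hard part, per the quoted article, is the server side — producing every flavour of invalid handshake — not the client loop, which is about this trivial.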
Re: No thanks
Mercedes had a self-driving car take itself from Germany to Denmark back in the mid-'90s so are probably used to those tech upstarts duplicating their work by now.
Re: We all remember what happened last time... @Adam JC
At $60,000 the Tesla is a budget car, silly!
Are we expecting "businesses [to be] excited about Glass"?
Hey, businesses, wouldn't it be great if your staff were more distracted? Hey, businesses, wouldn't it be great if all your commercial practices — whether genuinely dodgy or just ordinarily proprietary — were more widely and more easily recorded? Etc.
Re: Good name
Phwoar! Look at the foraminifera on that!
... or Great Britain? Oh, sorry, it's still optional, right? I'll ask again in a few years.
He was happy to buy a licence to post on Google Play so I suspect he's fine with licensing. Or maybe it's just that his budget was $25 rather than $99/year?
Re: I know an iCar gets a lot of stick... @TRT
I thought Apple's recent track record on mapping was quite encouraging: they've been slowly and methodically fixing the problems without calling a press conference every three months to announce they're "Revolutionising maps. Again." or whatever. Much better behaviour than I think any of us might have predicted.
Re: You want to loose weight?
Wouldn't that just be one step towards no longer gaining weight? My understanding is that the most recommended way to lose weight is to eat a healthy mix of food in a minimum safe quantity and exercise to create a moderate daily energy deficit.
If I dare challenge the received image: Sega makes most of its money from the manufacture and distribution of coin-operated entertainments — not just video games but a bunch of things. It's always had a very profitable business in that. It had a successful foray into home entertainment but was losing money every year by the time the Dreamcast was on sale. It was therefore smart and pragmatic to scale the home entertainment side down sharply while continuing to enjoy the coin-op profits. It's impressive that such a company went for broke with the Dreamcast rather than hedging on the escape strategy, but the latter was always a safe option.
No oblivion, none coming. Just a gamble that didn't pay off.
Re: The suit was obviously without merit @AC
You appear not to have understood the case.
If Ford said its feature worked "each and every morning" and it didn't then you would have a case.
The court looked at the specific adverts that Apple actually used and determined that there was no problem _specifically because_ Apple never promised it would work "each and every" time.
It's all in the article.
That's exactly what a Commodore owner would say.
Re: What the hell did they expect?
For the record, I find OS X's Launchpad to be stupid without a touch screen. But it was added as an extra: nothing else was taken away. If anything OS X has become more accommodating over time to those of us that keep regular use applications directly on the dock and the /Applications folder over on the right for start-menu like access to everything else, as Apple has introduced the speech bubble style folder that makes a better show of most /Applications folders.
To be fair to Rovio, playing and playing again in Angry Birds was similarly speedy back when it was a paid standalone app. The clutter of advertising has accumulated only after success. So I'm sure it's a classic tale of most of the team understanding the benefits but the marketing team having different ideas.
As for Flappy Bird? I can see the appeal: if you fail then it's unambiguously always your fault, the gameplay doesn't actually progress so there's no having-to-repeat-yourself disincentive to hitting the play button again, and it requires just enough attention to occupy you. So you end up hitting the play button repeatedly and losing track of the time. Meanwhile all it does for revenue is display a small advertising banner at the top of the game over screen but not during gameplay, which is actually quite smart because it's a contextually justified way to get a lot of impressions and doesn't annoy the user.
There are a lot of theories that Nguyen is some sort of genius — e.g. the rate button was also on the game over screen in early versions and would appear suspiciously close to where most people tap to fly. Meanwhile Apple's App Store uses recent positive reviews to weight its overall rankings as they attempt to quantify popularity by as many measurements as possible. So that may have helped give the app early momentum, whether intentional or not.
There's no such thing as CRT pixels in general; per the original black-and-white spec, scan lines are entirely analogue, as is the display mechanism, and even with colour it's more complicated than that, as there's the dot pitch and the type of separation to take into account: a low-pass filter responds differently to a comb filter, etc.
Given that, why not just use the full screen at any old number of pixels? To comply with the PAL standard, the horizontal sync pulse needs to be between 4.6 and 4.8 microseconds. So you need a clock speed that aligns well with that. But you also don't want to use too much RAM and you possibly want to hit a standard column count, like 80 in the case of the CPC. If you're a machine that shares memory but semi-intelligently like the Spectrum then more pixels would mean slower processing in the affected areas. You possibly also want a sufficiently trivial way to determine the start addresses for a line of pixels. And I'm pretty sure the Spectrum at least used video fetch as RAM refresh, so there were additional timing requirements there about hitting certain rows of RAM.
But the CPC, like the BBC and at least EGA and VGA video cards, uses a Motorola 6845 CRTC — cathode ray tube controller. It's programmer-configurable to provide any line timings and pixel areas you want. So it's the developer's choice, subject to the constraint that if they're not careful while developing then they might ruin a screen or two. The CPC also switches some of the address lines around to give linear memory along scan lines, rather than a BBC-style character-centric layout, which introduced additional considerations.
Aside: in classic micro style, values you write to it take effect immediately so it's the mechanism by which later special effects were achieved: tell it to start horizontal sync and it'll reload the start address, but jump in at the last minute and tell it not to do so and it'll start doing pixels again in the same frame from a different address. So that's good for panels, split screen scrolling, etc. Stuff they'd eventually call 'Mode X' when someone else eventually spotted it on the VGA cards.
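To make the PAL constraint concrete, here's a quick back-of-envelope check (Python for readability) that the CPC's usual 6845 values line up with the standard. The register values are the commonly documented firmware defaults as I remember them, so treat them as illustrative rather than gospel:

```python
# The CPC clocks its 6845 with a 1 MHz character clock: one character = 1 us.
CHAR_CLOCK_HZ = 1_000_000

# Commonly documented CPC firmware defaults (illustrative, from memory):
R0_H_TOTAL    = 63  # horizontal total minus one -> 64 characters per line
R4_V_TOTAL    = 38  # vertical total minus one   -> 39 character rows
R9_MAX_RASTER = 7   # rasters per row minus one  -> 8 scan lines per row
R5_V_ADJUST   = 0   # extra scan lines appended to the frame

def frame_timing():
    line_us = (R0_H_TOTAL + 1) * 1_000_000 / CHAR_CLOCK_HZ
    lines_per_frame = (R4_V_TOTAL + 1) * (R9_MAX_RASTER + 1) + R5_V_ADJUST
    frame_hz = CHAR_CLOCK_HZ / ((R0_H_TOTAL + 1) * lines_per_frame)
    return line_us, lines_per_frame, frame_hz
```

That works out to 64 microseconds per scan line and 312 lines per frame, i.e. a touch over 50 Hz — bang on the PAL line and field rates. Fiddle with R0 carelessly and the sync drifts out of spec, hence the risk of ruining a screen or two.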
Gosh I'm late to the party, but:
The C64 and the Oric both use a 6502. The 6502 runs internally on a two-phase clock, like most chips from immediately before it, but is advanced enough to require only a single-phase clock input, which it doubles. As a result, e.g. the stated clock speed of a C64's 6502 is 1MHz but if you compare access cycles and wait times, the work it's doing is broadly similar to a 2MHz purely single-phase CPU like the Z80. Check out the memory access timing diagrams on a 6502 data sheet, then check them out on a Z80 data sheet. Check out the cycle timings for things like an 8-bit add.
So, what can you do with 4MHz RAM? You could connect it to the Oric or the C64's CPU and it would be running at four times the speed. You could connect it to the CPC's CPU and it would be running at the same speed. What you're getting in the CPC versus the other two machines is better described as: RAM that's twice as fast plus a CPU that's twice as fast (but a little more haphazard in its access patterns).
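One way to see the like-for-like comparison: take an 8-bit add-immediate from each data sheet and work out the wall-clock time. The cycle counts below are from memory, so double-check them against the sheets:

```python
# Cycle counts for an 8-bit add-immediate, per the respective data sheets
# (from memory - verify before quoting):
ADC_IMM_6502_CYCLES = 2   # 6502: ADC #nn
ADD_A_N_Z80_TSTATES = 7   # Z80:  ADD A,n

def op_time_us(cycles: int, clock_hz: int) -> float:
    """Wall-clock time for one instruction, in microseconds."""
    return cycles / clock_hz * 1e6

c64_us = op_time_us(ADC_IMM_6502_CYCLES, 1_000_000)  # 1 MHz 6502 -> 2.0 us
cpc_us = op_time_us(ADD_A_N_Z80_TSTATES, 4_000_000)  # 4 MHz Z80 -> 1.75 us
```

So quadrupling the advertised clock number buys the Z80 nowhere near four times the throughput on this op — which is exactly the point about per-cycle work not being comparable across the two designs.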
If we're citing game examples, look on YouTube for C64 Chase HQ versus CPC Chase HQ. Look at Hard Drivin'. Look at Carrier Command. Even if you just want to see how the C64 cut corners on the processor, compare the BBC Revs to the C64.
You beat me to it. But, yeah, if memory serves then the AY is three channels, each of which may be tone and/or noise whereas the SN is three tone channels plus one noise channel. Also the AY has a small number of fixed volume envelopes — timed patterns of volume ramps that it will repeat over and over again on a channel.
The AY is also marginally better for PCM output because both tone and noise are 1-bit signals and are mixed by logical OR. So I think you can rig it to give you a static non-zero wave for the CPU to throw volume levels at. On an SN you'd just ramp up the frequency beyond the audible range and let the natural low-pass filter of your ears discern the volume gymnastics.
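If I'm reading the data sheet right, the per-channel mixer is strictly speaking "disabled-or-high" terms ANDed together, which is what makes the PCM trick work: disable both tone and noise and the channel output sits permanently high, leaving the 4-bit volume register free to act as a DAC. A sketch of that logic as I understand it:

```python
def ay_channel_high(tone_bit: bool, noise_bit: bool,
                    tone_disabled: bool, noise_disabled: bool) -> bool:
    """AY-3-891x per-channel mixer, as I read the data sheet: the channel
    output is high when (tone disabled OR tone high) AND
    (noise disabled OR noise high)."""
    return (tone_disabled or tone_bit) and (noise_disabled or noise_bit)

# The PCM trick: with both sources disabled the output is stuck high,
# regardless of what the tone and noise generators are doing...
always_high = all(ay_channel_high(t, n, True, True)
                  for t in (False, True) for n in (False, True))
# ...so the CPU can bang sample values straight into the volume register.
```

With only one source enabled, its 1-bit signal passes through directly, which is the static-wave rig described above.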
This action is likely to be very successful
... in helping Rand to secure his front-runner status for the 2016 nomination, especially with Christie's Bridgegate woes. You know, if the first name isn't enough.
One wonders whether Apple's much ballyhooed attempts to manufacture these things in the US (national pride and all that) might be causing problems as presumably they've had to tool up a brand new factory and train a brand new work force. They could very well be building a lot fewer than they'd wanted to.
Re: Sounds like he's having an "In MY day" moment.
I agree entirely. I think computers had a positive effect on children in the '80s because they booted into BASIC and therefore were an effective educational tool; also games weren't yet sophisticated enough routinely to swallow a large amount of any given day.
But then I ask myself: why do I think that? Yeah, it's because I was a child in the '80s. So my opinion is probably biased rubbish.
There's no 'report a problem' button, so...
It's Der Spiegel, not De Spiegel. As in "the mirror" but at a completely different end of the market from The Mirror.
Re: Shock horror!
I don't think he does have a point: Apple has almost no expertise in being just one of many suppliers of anything. Everything Apple knows — and knows how to sell — is about tight vertical integration, with Apple being behind every part of the widget. Macs ship with OS X. iPhones ship with iOS. iPods link only to iTunes. Etc.
So while Apple would have some advantages in trying to sell an Android phone over Samsung, HTC, etc (for emphasis: _some_ benefits, i.e. they'd likely capture _some_ of the market) it'd be a riskier proposition than continuing as they are now, with no obvious benefit even if they prevail.
I therefore don't think it's correct for Apple to start selling Android mobiles. It'd be great for us, the market, but that's neither here nor there.
Re: NeXTSTEP (@Mage)
Modern OS X uses, effectively, Display PDF. PDF is the output of a PostScript program. Apple's Core Graphics is all the same primitives, fill modes, etc as PostScript without the PostScript interpreter. So the two have both solved the same problem in the same way.
This actually turns out to be a pretty good idea: that's why Apple's text looks like printed text, using classic printed fonts like Helvetica, and Microsoft have had to commission their own custom fonts like Calibri that are designed around their idiosyncratic ideas about typography just so that the aggressive hinting, lack of pair kerning, etc, won't look quite so retro.
So what else did NextStep do that's interesting?
It learnt the Xerox Smalltalk lesson — that fully object-oriented, dynamically typed languages are a great match for UI work — but adapted the language so that it's compiled, not interpreted, and can link directly to the C libraries that were otherwise industry standard. That's Objective-C. It's just as happy talking with C++ nowadays, of course. The language and the framework are why the web was first developed on NextStep, why Doom was mostly developed on NextStep, etc.
It swept aside all the nonsense with application installers by introducing the application bundle. The application doesn't just look like a single icon in the Finder, it looks like one on disk too. Dragging it to the trash genuinely is an elemental file operation, not something that someone has hacked in as a special case. (aside: RISC OS did more or less the same thing at more or less the same time, as well as the dock and a focus on proper typography; all coincidence, apparently)
It introduced the fully compositing window manager. Consider where Windows was up to and including XP: preemptive multitasking, protected memory. So it doesn't affect the wider system if an individual app hangs or flips out, right? The answer is: only if you don't care whether the screen is painted properly.
File associations are handled by metadata, not as an exercise in string matching. If I want .doc to associate with Pages by default but have a few that render incorrectly and should be opened with full-on Word, I can set those to open with Word while leaving the rest alone. This becomes a property of the file and goes wherever the file goes. It is not a hack someone added into the Finder.
It was the first graphical environment with system-wide scripting. It was designed from day one to be architecture agnostic, supporting fat binaries. It beat OS/2 to the punch on both of these things.
Beyond that the big wins are really in the frameworks themselves. Pervasive rich text, system-wide spell checking, a system-wide encrypted store for passwords, etc.
So none of those is individually a massive leap (though it depends what you compare it to; if it's only commercial competitors then Objective-C would count, as someone finally realised what Xerox had pioneered under the hood) but I'd agree that NextStep was a decade ahead in the '80s based on the combination of technologies.
Though, yeah, then they decided to price it beyond any sense and predicate the machines on a dodgy media format. I guess Jobs learnt how to price things for optimum profits by reeling in from the far end.
Google is as Apple does
Reverse engineering forbidden? The right to block apps without prior notice at their sole discretion?
Re: Avoiding a takedown notice?
The Apple resolution procedure usually involves a complainant contacting Apple, then Apple contacting the developer, giving the two a chance to come to an agreement before it takes action.
So if the copyright holders had raised legal issues with Apple, Apple would have forwarded them to Elite and waited for further instructions. At that point Elite could easily "voluntarily" pull the apps.
Re: Nobody remembers Bill Gates saved Apple
Jobs wasn't booed by Apple employees, he was both booed and cheered — the video is easy to find — by a crowd of convention goers. Imagine the reaction Michael Foot would have received from a satellite link-up with Mrs. T at the Labour Party conference, then divide by about a hundred million. Chris Christie got a lot worse than Jobs did for embracing Obama. Though if I have to cite the Republican Party to make something else look reasonable, maybe I've already lost the argument?
Re: So ghastly...
Conversely, I found it to be speedy*, neither well nor poorly configured, and with a mouse so badly designed it made me want to go out and kill someone. The machine was all but unusable until third parties finally started making USB mice.
* in the same way that Pentium IIs felt at the time; not as the advertised massive leap forwards.
Re: Fourteen Devs issue joint statement
Can someone explain to those authors that they should complain to Apple? There's a resolution process in place for this sort of thing — they can get the apps withdrawn unless or until Wilcox starts paying them.
Re: If there was any doubt
Judging by u-turns to date, stupid. Here are just 45 of those that the coalition made in its first 42 months in power, albeit not from the most sympathetic source: http://www.theguardian.com/politics/2013/nov/28/coalition-u-turn-list-full
Re: Carphone Warehouse?
Copying the Apple Store plan also seems to be working brilliantly for Microsoft. The company itself has declined to give any specifics; the only analyst's guess I can find on Google puts them at a quarter as profitable per square metre.
At least if Samsung are doing this with Carphone Warehouse we can be certain they're not going for anything particularly ambitious.
Re: Enjoyable watch
It's possible Wozniak preferred that. It's also possible he's speaking with hindsight but per e.g. http://www.businessinsider.com/steve-wozniak-thought-the-first-macintosh-was-a-lousy-computer-that-failed-2013-6 he wasn't a big fan of the original machine.
His most specific gripe: not enough memory to get anything done, leading to endless disc swapping. I guess he wasn't the only one who thought so, as the motherboard, despite using soldered RAM, contained the logic to drive 512KB (four times the 128KB it came with) and the Mac started shipping with 512KB within a year or so of launch.
Re: iTunes @ Tomh (@rm -rf /)
What you're referring to is slightly more specific than that: it's music the user has ripped themselves, and which they don't want to use a music locker service for. You can even use Google's if you want — it's free and it works across Android, iOS and the web.
So if you bought the music through iTunes you can download it again from Apple directly onto the device. If you ripped your own library to iTunes then you picked iTunes in the first place but you can just grab the MP3/AACs and take them elsewhere if you want. If you ripped it elsewhere you can use iTunes or you can use a music locker. Google's is free.