1832 posts • joined 18 Jun 2009
Re: You want to loose weight?
Wouldn't that just be one step towards no longer gaining weight? My understanding is that the most recommended way to lose weight is to eat a healthy mix of food in a minimum safe quantity and exercise to create a moderate daily energy deficit.
If I dare challenge the received image: Sega makes most of its money from the manufacture and distribution of coin-operated entertainments — not just video games but a bunch of things. It's always had a very profitable business in that. It had a successful foray into home entertainment but was losing money every year by the time the Dreamcast was on sale. It was therefore smart and pragmatic to scale down the home entertainment side severely while continuing to enjoy the coin-op profits. It's impressive that such a company went for broke with the Dreamcast rather than hedging on the escape strategy, but the latter was always a safe option.
No oblivion, none coming. Just a gamble that didn't pay off.
Re: The suit was obviously without merit @AC
You appear not to have understood the case.
If Ford said its feature worked "each and every morning" and it didn't then you would have a case.
The court looked at the specific adverts that Apple actually used and determined that there was no problem _specifically because_ Apple never promised it would work "each and every" time.
It's all in the article.
That's exactly what a Commodore owner would say.
Re: What the hell did they expect?
For the record, I find OS X's Launchpad to be stupid without a touch screen. But it was added as an extra: nothing else was taken away. If anything, OS X has become more accommodating over time to those of us who keep regular-use applications directly on the dock and the /Applications folder over on the right for start-menu-like access to everything else, as Apple has introduced the speech-bubble-style folder that makes a better show of most /Applications folders.
To be fair to Rovio, playing and playing again in Angry Birds was similarly speedy back when it was a paid standalone app. The clutter of advertising has accumulated only after success. So I'm sure it's a classic tale of most of the team understanding the benefits but the marketing team having different ideas.
As for Flappy Bird? I can see the appeal: if you fail then it's unambiguously always your fault, the gameplay doesn't actually progress so there's no repeat-yourself disincentive to hitting the play button again, and it requires just enough attention to occupy you. So you end up hitting the play button repeatedly and losing track of the time. Meanwhile, all it does for revenue is display a small advertising banner at the top of the game-over screen but not during gameplay, which is actually quite smart: it's a contextually justified way to get a lot of impressions and doesn't annoy the user.
There are a lot of theories that Nguyen is some sort of genius — e.g. the rate button was also on the game over screen in early versions and would appear suspiciously close to where most people tap to fly. Meanwhile Apple's App Store uses recent positive reviews to weight its overall rankings as they attempt to quantify popularity by as many measurements as possible. So that may have helped give the app early momentum, whether intentional or not.
There's no such thing as CRT pixels in general; per the original black-and-white spec, scan lines are entirely analogue, as is the display mechanism. Even with colour it's more complicated than that, as there's the dot pitch and the type of separation to take into account: a low-pass filter responds differently from a comb filter, etc.
Given that, why not just use the full screen at any old number of pixels? To comply with the PAL standard, the horizontal sync pulse needs to be between 4.6 and 4.8 microseconds, so you need a clock speed that aligns well with that. But you also don't want to use too much RAM, and you possibly want to hit a standard column count, like 80 in the case of the CPC. If you're a machine that shares memory but semi-intelligently, like the Spectrum, then more pixels would mean slower processing in the affected areas. You possibly also want a sufficiently trivial way to determine the start address for a line of pixels. And I'm pretty sure the Spectrum at least used video fetch as RAM refresh, so there were additional timing requirements there about hitting certain rows of RAM.
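To put rough numbers on that — and this is my own back-of-envelope sketch, not anything from the machines' documentation — the PAL line period is fixed at 64µs, so a candidate pixel clock determines how many pixel periods fit in a scan line, and you want a tidy whole number with room left over for sync and borders:

```python
# Sketch of aligning a pixel clock to PAL line timing. The 7MHz figure
# is illustrative (it's the ballpark the Spectrum's video hardware uses).

PAL_LINE_US = 64.0                    # one full PAL scan line, microseconds
SYNC_MIN_US, SYNC_MAX_US = 4.6, 4.8   # permitted line sync pulse width

def pixels_per_line(pixel_clock_hz):
    """Total pixel periods in one PAL scan line at this dot clock."""
    return pixel_clock_hz * PAL_LINE_US / 1e6

def sync_pixels(pixel_clock_hz, sync_us=4.7):
    """Pixel periods consumed by a nominal 4.7us sync pulse."""
    return pixel_clock_hz * sync_us / 1e6

# A 7MHz dot clock divides the line into exactly 448 pixel periods:
# 256 visible on a Spectrum, the rest spent on border and sync.
assert pixels_per_line(7_000_000) == 448.0
```

Pick an awkward clock instead and the line doesn't divide into a whole number of pixel periods, which is exactly the alignment headache the paragraph above is describing.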
But the CPC, like the BBC and at least the EGA and VGA video cards, uses a Motorola 6845 CRTC — cathode ray tube controller. It's programmer-configurable to provide any line timings and pixel areas you want. So it's the developer's choice, subject to the constraint that if they're not careful while developing then they might ruin a screen or two. The CPC also switches some of the address lines around to give linear memory along scan lines, rather than a BBC-style character-centric layout, which introduced additional considerations.
Aside: in classic micro style, values you write to it take effect immediately, so it's the mechanism by which later special effects were achieved: tell it to start horizontal sync and it'll reload the start address, but jump in at the last minute and tell it not to do so and it'll start doing pixels again in the same frame from a different address. So that's good for panels, split-screen scrolling, etc — the stuff they'd later call 'Mode X' when someone spotted it on VGA cards.
Gosh I'm late to the party, but:
The C64 and the Oric both use a 6502. The 6502 runs internally on a two-phase clock, like most chips from immediately before it, but is advanced enough to require only a single-phase clock input, from which it derives the two phases. As a result, e.g. the stated clock speed of a C64's 6502 is 1MHz, but if you compare access cycles and wait times, the work it's doing is broadly similar to a 2MHz purely single-phase CPU like the Z80. Check out the memory access timing diagrams on a 6502 data sheet, then check them out on a Z80 data sheet. Check out the cycle timings for things like an 8-bit add.
So, what can you do with 4MHz RAM? You could connect it to the Oric's or the C64's CPU and it would be running at four times the speed. You could connect it to the CPC's CPU and it would be running at the same speed. What you're getting in the CPC versus the other two machines is better described as: RAM that's twice as fast plus a CPU that's twice as fast (but a little more haphazard in its access patterns).
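To make the "1MHz 6502 ≈ 2MHz Z80" claim concrete, here's a quick sketch using published data-sheet timings for one 8-bit add from memory. The choice of this particular instruction pairing is my own illustration; other pairings give slightly different ratios but the same broad picture:

```python
# Data-sheet timings for an 8-bit add from memory:
#   6502 ADC zero-page  = 3 clock cycles
#   Z80  ADD A,(HL)     = 7 T-states

def microseconds_per_op(cycles, clock_mhz):
    """Wall-clock time for one instruction at a given clock speed."""
    return cycles / clock_mhz

t_6502_1mhz = microseconds_per_op(3, 1.0)   # C64/Oric-style 1MHz 6502
t_z80_2mhz  = microseconds_per_op(7, 2.0)   # hypothetical 2MHz Z80

# 3.0us versus 3.5us: broadly the same work rate, despite the "1MHz"
# versus "2MHz" labels on the respective boxes.
assert t_6502_1mhz == 3.0 and t_z80_2mhz == 3.5
```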
If we're citing game examples, look on YouTube for C64 Chase HQ versus CPC Chase HQ. Look at Hard Drivin'. Look at Carrier Command. Even if you just want to see how the C64 cut corners on the processor, compare the BBC Revs to the C64.
You beat me to it. But, yeah, if memory serves then the AY is three channels, each of which may be tone and/or noise whereas the SN is three tone channels plus one noise channel. Also the AY has a small number of fixed volume envelopes — timed patterns of volume ramps that it will repeat over and over again on a channel.
The AY is also marginally better for PCM output because both tone and noise are 1-bit signals and are mixed by logical OR. So I think you can rig it to give you a static non-zero wave for the CPU to throw volume levels at. On an SN you'd just ramp up the frequency beyond the audible range and let the natural low-pass filter of your ears discern the volume gymnastics.
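A toy model of that trick — mine, and deliberately not chip-accurate — looks like this: tone and noise are 1-bit signals ORed together, the result scaled by the channel's 4-bit volume, so if the mixed bit is held at a constant 1 the CPU can "play" samples just by hammering the volume register:

```python
# Toy model of one AY channel's output path: (tone OR noise) * volume.
# Real hardware uses a logarithmic volume ladder; linear is fine for
# illustrating the mixing.

def ay_channel_sample(tone_bit, noise_bit, volume_4bit):
    """One output sample for one channel."""
    mixed = tone_bit | noise_bit          # 1-bit OR mix
    return mixed * (volume_4bit & 0x0F)   # 4-bit volume

# Hold the mixed bit at 1 and stream PCM via rapid volume writes:
pcm = [ay_channel_sample(1, 0, v) for v in (0, 4, 8, 15, 8, 4, 0)]
assert pcm == [0, 4, 8, 15, 8, 4, 0]
```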
This action is likely to be very successful
... in helping Rand to secure his front-runner status for the 2016 nomination, especially with Christie's Bridgegate woes. You know, if the first name isn't enough.
One wonders whether Apple's much ballyhooed attempts to manufacture these things in the US (national pride and all that) might be causing problems as presumably they've had to tool up a brand new factory and train a brand new work force. They could very well be building a lot fewer than they'd wanted to.
Re: Sounds like he's having an "In MY day" moment.
I agree entirely. I think computers had a positive effect on children in the '80s because they booted into BASIC and therefore were an effective educational tool; also, games weren't yet sophisticated enough routinely to swallow a large amount of any given day.
But then I ask myself: why do I think that? Yeah, it's because I was a child in the '80s. So my opinion is probably biased rubbish.
There's no 'report a problem' button, so...
It's Der Spiegel, not De Spiegel. As in "the mirror" but at a completely different end of the market from The Mirror.
Re: Shock horror!
I don't think he does have a point: Apple has almost no expertise in being just one of many suppliers of anything. Everything Apple knows — and knows how to sell — is about tight vertical integration, with Apple being behind every part of the widget. Macs ship with OS X. iPhones ship with iOS. iPods link only to iTunes. Etc.
So while Apple would have some advantages in trying to sell an Android phone over Samsung, HTC, etc (for emphasis: _some_ advantages, i.e. they'd likely capture _some_ of the market), it'd be a riskier proposition than continuing as they are now, with no obvious benefit even if they prevail.
I therefore don't think it's correct for Apple to start selling Android mobiles. It'd be great for us, the market, but that's neither here nor there.
Re: NeXTSTEP (@Mage)
Modern OS X uses, effectively, Display PDF. PDF is the output of a PostScript program. Apple's Core Graphics is all the same primitives, fill modes, etc as PostScript without the PostScript interpreter. So the two have both solved the same problem in the same way.
This actually turns out to be a pretty good idea: that's why Apple's text looks like printed text, using classic printed fonts like Helvetica, and Microsoft have had to commission their own custom fonts like Calibri that are designed around their idiosyncratic ideas about typography just so that the aggressive hinting, lack of pair kerning, etc, won't look quite so retro.
So what else did NextStep do that's interesting?
It learnt the Xerox Smalltalk lesson — that fully object-oriented, dynamically typed languages are a great match for UI work — but adapted the language so that it's compiled, not interpreted, and can link directly to the C libraries that were otherwise industry standard. That's Objective-C. It's just as happy talking with C++ nowadays, of course. The language and the framework are why the web was first developed on NextStep, why Doom was mostly developed on NextStep, etc.
It swept aside all the nonsense with application installers by introducing the application bundle. The application doesn't just look like a single icon in the Finder, it looks like one on disk too. Dragging it to the trash genuinely is an elemental file operation, not something that someone has hacked in as a special case. (aside: RISC OS did more or less the same thing at more or less the same time, as well as the dock and a focus on proper typography; all coincidence, apparently)
It introduced the fully compositing window manager. Consider where Windows was up to and including XP: preemptive multitasking, protected memory. So it doesn't affect the wider system if an individual app hangs or flips out, right? The answer is: only if you don't care whether the screen is painted properly.
File associations are handled by metadata, not as an exercise in string matching. If I want .doc to associate with Pages by default but have a few that render incorrectly and should be opened with full-on Word, I can set those to open with Word while leaving the rest alone. This becomes a property of the file and goes wherever the file goes. It is not a hack someone added into the Finder.
It was the first graphical environment with system-wide scripting. It was designed from day one to be architecture agnostic, supporting fat binaries. It beat OS/2 to the punch on both of these things.
Beyond that the big wins are really in the frameworks themselves. Pervasive rich text, system-wide spell checking, a system-wide encrypted store for passwords, etc.
So none of those is individually a massive leap (though it depends what you compare it to; if it's only commercial competitors then Objective-C would count, as someone finally realised what Xerox had pioneered under the hood) but I'd agree that NextStep was a decade ahead in the '80s based on the combination of technologies.
Though, yeah, then they decided to price it beyond any sense and predicate the machines on a dodgy media format. I guess Jobs learnt how to price things for optimum profits by reeling in from the far end.
Google is as Apple does
Reverse engineering forbidden? The right to block apps without prior notice at their sole discretion?
Re: Avoiding a takedown notice?
The Apple resolution procedure usually involves a complainant contacting Apple, then Apple contacting the developer, giving the two a chance to come to an agreement before it takes action.
So if the copyright holders had raised legal issues with Apple, Apple would have forwarded them to Elite and waited for further instructions. At that point Elite could easily "voluntarily" pull the apps.
Re: Nobody remembers Bill Gates saved Apple
Jobs wasn't booed by Apple employees, he was both booed and cheered — the video is easy to find — by a crowd of convention goers. Imagine the reaction Michael Foot would have received from a satellite link-up with Mrs. T at the Labour Party conference, then divide by about a hundred million. Chris Christie got a lot worse than Jobs did for embracing Obama. Though if I have to cite the Republican Party to make something else look reasonable, maybe I've already lost the argument?
Re: So ghastly...
Conversely, I found it to be speedy*, neither well nor poorly configured, and with a mouse so badly designed it made me want to go out and kill someone. The machine was all but unusable until third parties finally started making USB mice.
* in the same way that Pentium IIs felt at the time; not as the advertised massive leap forwards.
Re: Fourteen Devs issue joint statement
Can someone explain to those authors that they should complain to Apple? There's a resolution process in place for this sort of thing — they can get the apps withdrawn unless or until Wilcox starts paying them.
Re: If there was any doubt
Judging by u-turns to date, stupid. Here are just 45 of those that the coalition made in its first 42 months in power, albeit not from the most sympathetic source: http://www.theguardian.com/politics/2013/nov/28/coalition-u-turn-list-full
Re: Carphone Warehouse?
Copying the Apple Store plan also seems to be working brilliantly for Microsoft. The company itself has declined to give any specifics; the only analyst's guess I can find on Google puts them at a quarter as profitable per square metre.
At least if Samsung are doing this with Carphone Warehouse we can be certain they're not going for anything particularly ambitious.
Re: Enjoyable watch
It's possible Wozniak preferred that. It's also possible he's speaking with hindsight but per e.g. http://www.businessinsider.com/steve-wozniak-thought-the-first-macintosh-was-a-lousy-computer-that-failed-2013-6 he wasn't a big fan of the original machine.
His most specific gripe: not enough memory to get anything done, leading to endless disc swapping. I guess he wasn't the only one who thought so, as the motherboard, despite using soldered RAM, contained the logic to drive 512KB (four times the 128KB it came with) and the Mac started shipping with 512KB within a year or so of launch.
Re: iTunes @ Tomh (@rm -rf /)
What you're referring to is slightly more specific than that: it's music the user has ripped themselves, and which they don't want to use a music locker service for. You can even use Google's if you want — it's free and it works across Android, iOS and the web.
So if you bought the music through iTunes you can download it again from Apple directly onto the device. If you ripped your own library to iTunes then you picked iTunes in the first place but you can just grab the MP3/AACs and take them elsewhere if you want. If you ripped it elsewhere you can use iTunes or you can use a music locker. Google's is free.
Can you name one feature that iTunes gives a mobile device that the device doesn't inherently have in and of itself? You're obviously not thinking of iOS devices since they don't require iTunes for anything. I think the second part of your gripe may not be entirely up to date.
So it's a shame the first part remains so valid. iTunes on Windows immediately came to mind when Jobs wrote about the perils of cross-platform development in his Thoughts on Flash.
Re: OS X on multiple platforms?
What endian problems? Third-party software hadn't been built for both platforms, obviously. Everything inside the OS worked fine.
See the documentation for the byte-order utilities — https://developer.apple.com/library/mac/documentation/CoreFoundation/Reference/CFByteOrderUtils/Reference/reference.html — search for the text "Available in OS X v10.0 and later" and you'll see it's every single function in the document. Including "CFByteOrderGetCurrent" that "Returns the byte order of the current computer." and a whole bunch of other functions that do things like "[convert] a 32-bit integer from big-endian format to the host’s native byte order." or "[convert] a 32-bit integer from little-endian format to the host’s native byte order." (all of which compile as no-ops if your host architecture is the type you describe).
That's the C stuff. The Objective-C classes like NSNumber required no special handling because their storage is opaque anyway.
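For anyone who hasn't dealt with it, the underlying idea is simple enough to sketch in a few lines of Python (my illustration, not Apple's code): you declare the byte order of the stored data and let the library do any swap the host needs.

```python
# Equivalent of the CFByteOrder idea via Python's struct module:
# ">" means "interpret these bytes as big-endian", "<" little-endian,
# regardless of what the host CPU is.
import struct

def swap_int32_big_to_host(raw_bytes):
    """Read 4 bytes as a big-endian 32-bit unsigned int, host-independently."""
    (value,) = struct.unpack(">I", raw_bytes)
    return value

# 0x00000001 stored big-endian reads back as 1...
assert swap_int32_big_to_host(b"\x00\x00\x00\x01") == 1
# ...whereas the same four bytes read little-endian would be 0x01000000.
assert struct.unpack("<I", b"\x00\x00\x00\x01")[0] == 0x01000000
```

On a big-endian host the big-to-host conversion is a no-op, which is exactly why the CF functions compile away to nothing in that case.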
Re: How about two sizes? (@John Baily)
Don't forget blingtastic "champagne".
Some have argued that the 5C is a backdoor way of bifurcating the product line. It's not unimaginable that the 5C equivalent (6C?) would get a slightly larger screen and the 5S a much larger one.
What would be surprising is if anything happens before June at the earliest, or more likely September. I expect the same people who are taking a strong interest in the next iPhone now plan to start their Christmas celebrations this year in March.
Re: I hope
Is it because he'd probably switch Microsoft to being an OS/2 redistributor, then leave for IBM as soon as the money dried up?
> I don't use Adblock on The Reg
I didn't until they started sliding things all over the screen. Hopefully I've managed to target just that one thing.
I think the issue is more that what should be just a civil wrong was met by several government agents, apparently under the control of the MPAA, with the man being detained and questioned for an hour before they could be bothered to do even the most cursory inspection of the evidence.
Re: Going to be a painful future
Agreed in principle. What American*, no matter how frightened of terrorism, actually wants their tax dollars spent on getting Homeland Security to rough up people in cinemas? No politician of any party is going to stand up and defend this.
That said, supposing the man had started filming as security approached him then the likely outcome would have been that (i) since he has now taken video footage without permission in a cinema, obviously he's a terrorist and can go straight to jail; and (ii) his device would have been confiscated before he had a chance to send footage anywhere.
(* or anyone else, anywhere else — this just happens to be an American story)
Re: PR campaign?
I don't mind letting a security firm raise its profile if it helps to create the narrative that smart appliances have more negative qualities than positive.
Re: No constitution, remember...? @BongoJoe
My understanding of the legal position is this:
Per Factortame, some acts are of constitutional significance. The European Communities Act is the one that case is about but it's far from being the only one. Such acts are special because they are not subject to implicit repeal. If a later act wants to contradict a constitutional act then it has to do so explicitly.
Subsequent acts like the Human Rights Act (which incorporates the European Convention on Human Rights) have adopted some of the logic of this line of thinking: all acts are to be interpreted compatibly with the HRA unless they state explicitly that they're incompatible. Were there no recognition of the idea of a set of elevated acts, such a provision would be void since there's an underlying rule that no parliament can dictate which laws a future parliament may make or unmake.
So if a barrister could find a suitably significant act then he or she could argue that the subsequent one is not to be applied literally as written. Similarly judges could try to finagle some sort of unintended meaning out of the literal words if they really put their minds to it.
But in England and Wales courts cannot strike down legislation in any broad sense. This is something the Americans explicitly did differently, as a check and balance to try better to ensure ongoing separation of powers. See e.g. the role of the Lord Chief Justice prior to 2005 for an idea of how much the British system has ever been bothered about technical separation of powers. We like strong competing interests but have historically not been especially bothered about whether powers technically may flow from one body to another.
Re: They are now Windows-compatible
[Mobile] Safari is the same, for the record. Zooms never reflow. I can't help feeling Google is throwing away a usability advantage. At worst they should, in classic Android style, make it optional.
Re: make less than $1,250 a day...
I make less than $1,250 a day. I'm living the dream!
Re: More rushed to market nightmares
The facts don't show "that his products have been involved in a number of fires that should not be expected nor tolerated for these vehicles" at all.
The facts are that three Tesla vehicle fires have been reported, all following a collision. So the [US] National Highway Traffic Safety Administration is investigating whether there's a problem. It may find that there is and it may find that there isn't.
In most cases it'll mean recoding and recompiling to take full advantage. Probably only OpenCL apps will just work more quickly, and theoretically DirectCompute apps but Microsoft's inexplicable decision to bury that in DirectX makes it somewhat obtuse and hence obscure.
Right now Nvidia seems to dominate the market for compute languages with its proprietary CUDA, which isn't going to work on an AMD product — but that and OpenCL aren't even as different as, say, C++ and Java, as they're built on fundamentally the same concepts. Think more like mid-'90s C++ and Ada95.
They might not buy it in preference to other options, but if it were de-Apple'd then, other than the price, there's nothing particularly to dissuade them.
The HP Z420 has the same RAM, processor, SSD size and essentially the same GPU but with twice as much GPU memory. It costs 13% more.
The Lenovo ThinkStation S30 has the same RAM and processor, half the SSD, 50% more GPU RAM but an Nvidia rather than an AMD (fantastic if you're a CUDA person but it benchmarks more slowly for OpenGL stuff), and costs almost 35% more.
That said, both of those are traditional desktops so they have the internal slots and drive bays. Also they are both computers that are already on the market versus the Mac Pro which isn't quite yet, so their pricing will have been set when components were more expensive than they are now.
Wait a few months and somebody will be undercutting Apple if they're not already, if only because Apple adjusts price and spec only maybe once a year and prices to be profitable from day one.
Re: Ditched the floppy without supplying a practical replacement (@Dave 126)
Sony launched a MiniDisc data drive but those geniuses decided to make the drives incompatible with the normal discs, creating specific MD Data media — identical to a normal MiniDisc except that the plastic case has an angled corner in the top-right. Presumably the record label had a word about music piracy and thereby lost Sony however many billions over the course of the '90s.
Re: Bring back Aero too
I'd prefer XP with a compositing window manager and a smarter start bar. Or Windows 7 as I like to think of it.
I actually think I'd probably be fine with Windows 8 as I've noticed that I tend to launch a few applications at the start of the day and then just use those, but there seems to be no benefit in upgrading. It's a shame there's no obvious commercial model that just makes the OS updates free, without locking down the hardware and introducing planned obsolescence.
Can we expect a Sam Coupé article in five years?
I can't be the only person to have had one of those?
Also, as a niggle: I'd argue that the 68008 is either 8-bit or 32-bit, as it's an 8-bit bus with a 32-bit instruction set architecture. Which I guess means it's an 8-bit machine in context, given that the hardware engineering seems to have been the primary goal of the project, Sinclair being a company that made money through selling hardware.
On the contrary, ARM is a proven architecture in a way Itanium definitely wasn't. Itanium bet on very long instruction words (VLIW): each instruction was a compound built of the instruction for every individual unit in the CPU.
Famously the Pentium has separate integer and floating point units. If it had used VLIW then each instruction would have specified both what the integer unit should do and what the floating point unit should do to satisfy that operation.
So the bet was that compilers are better placed to decide scheduling between multiple units in advance than a traditional superscalar scheduler is as the program runs. The job of organising superscalar dispatches has become quite complicated as CPUs have gained extra execution pipes so why not do it in advance?
The idea of VLIW had been popular in academia but had never succeeded in the market. So: the Itanium architecture was unproven in a way that ARM most definitely isn't.
In the real world the solution to the problem VLIW tackles has become to cap single cores at a certain amount of complexity and just put multiple cores onto each die.
Re: highly dubious (@Immenseness)
I heard that if you can give it 1.21 gigawatts then it can go almost three full days without needing to be plugged back in. It's Apple's version of wireless charging.
I think 4K is here to stay on the basis that it's very useful for computers and economies of scale make it more efficient to manufacture one type of panel for all uses.
I know you might think that antialiasing solves the resolution problem because you can no longer see the pixels, but it's effectively a low-pass filter. You lose detail and things don't look sharp. Compare any of the first-generation tablets to the modern lot, even at a distance.
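A tiny illustration of the point (my own, not from the post): downsample a one-pixel black/white checker with a simple averaging filter and it collapses to flat grey, while broad features survive — the fine, high-frequency detail is exactly what the averaging throws away.

```python
# Crude 2:1 "antialiased" downsample: average each adjacent pair of
# pixels. Averaging is a low-pass operation, so it keeps broad features
# and destroys fine detail.

def downsample_box2(pixels):
    """Average adjacent pixel pairs (assumes an even-length list)."""
    return [(a + b) / 2 for a, b in zip(pixels[::2], pixels[1::2])]

sharp = [0, 255] * 4                         # 1-pixel checker: maximum detail
assert downsample_box2(sharp) == [127.5] * 4  # all of it flattened to grey

broad = [10, 10, 200, 200]                    # 2-pixel-wide features
assert downsample_box2(broad) == [10.0, 200.0]  # these survive intact
```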
I think it's because chose and choose versus lose and loose tends to confuse people. If you were a bad speller, which of those would you think should rhyme?
You can also imagine quite a bit more of Nintendo's work getting past the censors
Admittedly I own none of the modern consoles but to an outsider the overwhelming impression is that Sony and Microsoft will sell you games where army men direct you where to stand on screen in order to see the next story-boarded explosion whereas Nintendo prefers to promote the 25th iteration of Mario Kart.
I'd dare imagine Samsung can spend a little less on advertising now that it's accelerated away from the pack. It now just needs to make sure it isn't being outdone. I also don't think the "just another Android handset" necessarily applies as people show strong brand loyalty following positive experiences even in markets where the products are essentially interchangeable — think of things like food, clothing, etc.
I feel quite sad for HTC. The One should have been the hit of last year but ended up not even reaching HTC's expectations.
I don't think it's safe to assume Apple will respond, as they're happy to ignore so many other segments that would be an easy segue — think the upgradeable desktop, the phablet, the netbook and more. They also often don't do very well when they do fill an apparently obvious incremental need, so I suspect that sucks some of the motivation out of the room. See the market performance of the headless Mac Mini.
The evolution of the iPhone screen is also a relevant case study. They quadrupled the pixels. They switched from 35mm-film style 3:2 to cinematic 16:9. Those are the two changes in seven models.
Sounds like a fringe use case?
Good on Samsung for offering a range of devices for a range of consumers but surely it's going to be overly bulky? I think laptops get away with it because you put them on a surface to use and then ordinarily sit at arm's length away.
On the plus side, the weight doesn't sound that bad. It's about 750g, apparently, which although 57% heavier than this year's iPad is just 13% heavier than last year's. It's 158% heavier than a Nexus 7 though.
Re: @ThomH Qt quick (@Richard 12)
"That's utter tosh."
No, it was poorly phrased. By "full-fat Qt was considered too much hassle on a handset" I meant "the full Qt API was considered to add too many unnecessary complexities when developing for a handset". No performance issue — just a question of how precisely you tailor an API for use in a specific domain to the exclusion of other uses.
"You're moaning about an API you've clearly never tried based on a brief description of an early Alpha that explained how to create a custom "button class", and ignoring the features of the beta and released."
I'm moaning about an API based on the description of it given to me by people Nokia paid as pitchmen to do exactly that, presented a week before they abandoned Symbian, Maemo and Qt.
If you're saying I should instead judge a later version then my judgment is this: it's completely useless because Nokia phones use Windows Phone.