The iPad similarly doesn't have Calculator. I wouldn't mind if it didn't make me so painfully aware that I'm out of practice with mental arithmetic.
Based on the number of people that seem capable of operating phones based on the Linux kernel, I'd say the tailored user interface obviates any concerns about the ability of the man on the street to use the bash/csh/X11/etc/whatever stack usually associated with 'Linux'.
That being said, I suspect Windows was chosen because it comes with good security support most of the time, with the cessation of support having been hand-waved away. The Linux guys are very good at updating the kernel, but then who's responsible for pushing that to the machines? And the XP support period has beaten any of the commercial Linuxes by quite a stretch.
So we're accusing Apple of breaking third-party Lightning cables without any evidence that they have, comparing them to Microsoft despite that not really being something Microsoft would do, then recommending Amazon cables as still working even though we still have no evidence that others have stopped working or, therefore, that Amazon cables still do?
The cat doesn't eat?
I don't care enough to compare the specs directly but your Samsung is a different proposition because it's an internal drive with no Thunderbolt port.
However, the two other Thunderbolt SSDs available on Amazon are:
• a 256GB Lacie for £254 (which, like the reviewed product, also has a USB3 port); and
• a 256GB MiniPro for £238 (with no additional USB3).
So the Lacie is more than 40% cheaper but appears to be from a year and a half ago so is likely slower. However the MiniPro is not yet a year old and boasts of being "capable" of read speeds "in excess of 500MB/sec" and therefore would be faster if the marketing puff were reliable. So it does sound like a better deal.
Kicking it old school here, I find my AMOLED Nexus One (yes, definitely the AMOLED version: that PenTile matrix is unmistakable) harder to read in direct sunlight than whatever my iPhone 5s has. But maybe it's just not automatically adjusting brightness or something? Better not rule out user error.
Ford at least were rumoured to be sticking with intelligence in the car but via QNX rather than the Microsoft product. I can understand the push towards devolving processing to the phone — apart from making it replaceable I think it also eases several regulatory hurdles since there's one testing standard for normal consumer electronics and another for electronics in cars — but betting on one or two specific handsets now when the car will probably still be in use by somebody in 15 years is silly.
Standards have always been a useful tool for companies to use in certain situations. Look at web standards: useful for Microsoft to ignore so it could gain a monopoly, then useful for Mozilla et al to push so that the competition would all be aiming for the same target, now useful for everyone because nobody has overall control. This is much better for the market but every step along the line has been motivated by individual interests.
In this case the combination of standards you'd need to implement would be significantly more onerous than just supporting (pretty much) a single device and I guess Mercedes et al are more interested in beating each other to market. They'd probably also argue that most of the other similar in-car systems can't connect to your mobile at all for things like GPS, music with the menus up on the dash, contacts, calendar, etc, so it's all a bonus, right?
I hope standards will be forthcoming.
I guess it gets you rich text with an established scripting language and platform-neutral API which doesn't enforce many assumptions about specific typography?
Though I'm not very enthused if the first laundry list item is a "file system browser". My OS already has one of those. Yours does too. They're probably different. Why would either of us want to learn a third?
Just use whatever's already in the OS, please. I don't care how boring it is.
The linked article states:
"A test case could have caught this, but it's difficult because it's so deep into the handshake. One needs to write a completely separate TLS stack, with lots of options for sending invalid handshakes."
So rather than the absence of proper unit testing I'd say it was the absence of exceptional unit testing, but it plays into the usual narrative around Apple's attitude towards security versus the more commerce-oriented firms.
But these same issues must be concerns for Firefox et al. How many variations of ostensibly valid but actually invalid SSL certificate can there be, and has nobody set up test servers that automatically vend those? Writing a unit test that connects to each of those, with and without a data fuzzer, doesn't sound too hard.
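Assuming the flaw being discussed is the widely reported duplicated `goto fail;` in Apple's SecureTransport, here's a minimal sketch of the control-flow mistake (the function and its parameters are my illustration, not Apple's code):

```c
/* Minimal sketch of the reported SecureTransport bug: a duplicated
   `goto fail;` runs unconditionally, so the final check is skipped
   while `err` still holds the success value from the check before it.
   verify() and its parameters are illustrative, not Apple's code. */
static int verify(int hash_ok, int sig_ok)
{
    int err = 0;
    if (!hash_ok) { err = -1; goto fail; }
        goto fail;                          /* the duplicated line */
    if (!sig_ok)  { err = -1; goto fail; }  /* never reached */
fail:
    return err;                             /* 0 == success */
}
```

So verify(1, 0) reports success even though the signature check never ran, which is exactly why only a deliberately malformed handshake, deep into the protocol, would have caught it.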
Mercedes had a self-driving car take itself from Germany to Denmark back in the mid-'90s so are probably used to those tech upstarts duplicating their work by now.
At $60,000 the Tesla is a budget car, silly!
Hey, businesses, wouldn't it be great if your staff were more distracted? Hey, businesses, wouldn't it be great if all your commercial practices — whether genuinely dodgy or just ordinarily proprietary — were more widely and more easily recorded? Etc.
Phwoar! Look at the foraminifera on that!
... or Great Britain? Oh, sorry, it's still optional, right? I'll ask again in a few years.
He was happy to buy a licence to post on Google Play so I suspect he's fine with licensing. Or maybe it's just that his budget was $25 rather than $99/year?
I thought Apple's recent track record on mapping was quite encouraging: they've been slowly and methodically fixing the problems without calling a press conference every three months to announce they're "Revolutionising maps. Again." or whatever. Much better behaviour than I think any of us might have predicted.
Wouldn't that just be one step towards no longer gaining weight? My understanding is that the most recommended way to lose weight is to eat a healthy mix of food in a minimum safe quantity and exercise to create a moderate daily energy deficit.
If I dare challenge the received image: Sega makes most of its money from manufacture and distribution of coin-operated entertainments — not just video games but a bunch of things. It's always had a very profitable business in that. It had a successful foray into home entertainment but was losing money every year by the time the Dreamcast was on sale. It was therefore smart and pragmatic severely to scale down the home entertainment side while continuing to enjoy the coin-op profits. It's impressive that such a company went for broke with the Dreamcast rather than hedging on the escape strategy, but the latter was always a safe option.
No oblivion, none coming. Just a gamble that didn't pay off.
You appear not to have understood the case.
If Ford said its feature worked "each and every morning" and it didn't then you would have a case.
The court looked at the specific adverts that Apple actually used and determined that there was no problem _specifically because_ Apple never promised it would work "each and every" time.
It's all in the article.
That's exactly what a Commodore owner would say.
For the record, I find OS X's Launchpad to be stupid without a touch screen. But it was added as an extra: nothing else was taken away. If anything OS X has become more accommodating over time to those of us that keep regular-use applications directly on the Dock and the /Applications folder over on the right for Start-menu-like access to everything else, as Apple has introduced the speech-bubble-style folder view that makes a better show of most /Applications folders.
To be fair to Rovio, playing and playing again in Angry Birds was similarly speedy back when it was a paid standalone app. The clutter of advertising has accumulated only after success. So I'm sure it's a classic tale of most of the team understanding the benefits but the marketing team having different ideas.
As for Flappy Bird? I can see the appeal: if you fail then it's unambiguously always your fault, the gameplay doesn't actually progress so there's no having-to-repeat-yourself disincentive to hitting the play button again, and it requires just enough attention to occupy you. So you end up hitting the play button repeatedly and losing track of the time. Meanwhile all it does for revenue is display a small advertising banner at the top of the game-over screen but not during gameplay, which is actually quite smart because it's a contextually justified way to get a lot of impressions and doesn't annoy the user.
There are a lot of theories that Nguyen is some sort of genius — e.g. the rate button was also on the game-over screen in early versions and would appear suspiciously close to where most people tap to fly. Meanwhile Apple's App Store uses recent positive reviews to weight its overall rankings as it attempts to quantify popularity by as many measurements as possible. So that may have helped give the app early momentum, whether intentional or not.
There's no such thing as CRT pixels in general; per the original black-and-white spec, scan lines are entirely analogue, as is the display mechanism, and even with colour it's more complicated than that as there's the dot pitch and the type of luma/chroma separation to take into account: a low-pass filter responds differently to a comb filter, etc.
Given that, why not just use the full screen at any old number of pixels? To comply with the PAL standard, the horizontal sync pulse needs to last about 4.7 microseconds, so you need a clock speed that aligns well with that. But you also don't want to use too much RAM, and you possibly want to hit a standard column count, like 80 in the case of the CPC. If you're a machine that shares memory but semi-intelligently, like the Spectrum, then more pixels would mean slower processing in the affected areas. You possibly also want a sufficiently trivial way to determine the start address for each line of pixels. And I'm pretty sure the Spectrum at least used video fetch as RAM refresh, so there were additional timing requirements there about hitting certain rows of RAM.
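As a rough sketch of the pixel budget: take the 64µs PAL line, knock off roughly 12µs for sync and porches, and apply a Spectrum-style 7MHz pixel clock (the 12µs blanking figure and the instruction to treat 7MHz as representative are my approximations):

```c
/* Rough PAL pixel-budget arithmetic; the 12 us blanking figure and
   the 7 MHz Spectrum-style pixel clock are approximations of mine. */

/* visible microseconds in one 64 us PAL line */
static double pal_active_us(void)
{
    return 64.0 - 12.0;   /* line period minus sync and porches */
}

/* maximum pixels per line at a given pixel clock in MHz */
static double max_pixels(double clock_mhz)
{
    return pal_active_us() * clock_mhz;
}
```

At 7MHz that's a ceiling of roughly 364 pixels per line, which is why 256 paper pixels plus a generous border was a comfortable, timing-tolerant choice.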
But the CPC, like the BBC and at least EGA and VGA video cards, uses a Motorola 6845 CRTC — cathode ray tube controller. It's programmer-configurable to provide any line timings and pixel areas you want. So it's the developer's choice, subject to the constraint that if they're not careful while developing then they might ruin a screen or two. The CPC also switches some of the address lines around to give linear memory along scan lines, rather than a BBC-style character-centric layout, which introduced additional considerations.
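To make the 6845 point concrete: the usual CPC power-on register values (figures from public CPC documentation; treat them as illustrative) reproduce the PAL raster exactly, because the CPC clocks the CRTC at 1MHz so one character equals one microsecond:

```c
/* 6845 timing from the commonly documented CPC defaults:
   R0 = horizontal total - 1, R4 = vertical total (char rows) - 1,
   R9 = scanlines per character row - 1. The CPC's 1 MHz character
   clock means one horizontal character = 1 us. */
static int line_us(int r0)
{
    return r0 + 1;                 /* microseconds per scan line */
}

static int lines_per_frame(int r4, int r9)
{
    return (r4 + 1) * (r9 + 1);    /* total scan lines per field */
}
```

R0=63 gives the 64µs PAL line and R4=38 with R9=7 gives 312 lines, i.e. one PAL field. Change the registers and you change the raster — exactly the freedom, and the risk to your monitor, described above.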
Aside: in classic micro style, values you write to it take effect immediately, so it's the mechanism by which later special effects were achieved: tell it to start horizontal sync and it'll reload the start address, but jump in at the last minute and tell it not to do so and it'll start doing pixels again in the same frame from a different address. So that's good for panels, split-screen scrolling, etc. Stuff they'd call 'Mode X' when someone eventually spotted it on the VGA cards.
Gosh I'm late to the party, but:
The C64 and the Oric both use a 6502. The 6502 runs internally on a two-phase clock, like most chips from immediately before it, but is advanced enough to require only a single-phase clock input, from which it derives the two phases internally. As a result, e.g. the stated clock speed of a C64's 6502 is 1MHz, but if you compare access cycles and wait times, the work it's doing is broadly similar to a 2MHz purely single-phase CPU like the Z80. Check out the memory access timing diagrams on a 6502 data sheet, then check them out on a Z80 data sheet. Check out the cycle timings for things like an 8-bit add.
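A quick back-of-envelope from the datasheets (the choice of instruction is mine): an 8-bit immediate add is 2 cycles on a 6502 (ADC #imm) and 7 T-states on a Z80 (ADD A,n), so a 1MHz 6502 and a 3.5MHz Spectrum-style Z80 land in the same place:

```c
/* Time in microseconds for one instruction, given its cycle count
   and the CPU clock. Datasheet figures: 6502 ADC #imm = 2 cycles;
   Z80 ADD A,n = 7 T-states. */
static double add_us(double cycles, double clock_mhz)
{
    return cycles / clock_mhz;
}
```

2µs either way: the Z80's higher clock rate mostly pays for its multi-state memory accesses rather than buying extra work per second.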
So, what can you do with 4MHz RAM? You could connect it to the Oric's or the C64's CPU and it would be running at four times the speed. You could connect it to the CPC's CPU and it would be running at the same speed. What you're getting in the CPC versus the other two machines is better described as: RAM that's twice as fast plus a CPU that's twice as fast (but a little more haphazard in its access patterns).
If we're citing game examples, look on YouTube for C64 Chase HQ versus CPC Chase HQ. Look at Hard Drivin'. Look at Carrier Command. Even if you just want to see how the C64 cut corners on the processor, compare the BBC Revs to the C64.
You beat me to it. But, yeah, if memory serves then the AY is three channels, each of which may be tone and/or noise whereas the SN is three tone channels plus one noise channel. Also the AY has a small number of fixed volume envelopes — timed patterns of volume ramps that it will repeat over and over again on a channel.
The AY is also marginally better for PCM output because both tone and noise are 1-bit signals gated per channel, with a disabled source reading as high. So I think you can rig it to give you a static non-zero wave for the CPU to throw volume levels at. On an SN you'd just ramp up the frequency beyond the audible range and let the natural low-pass filter of your ears discern the volume gymnastics.
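A toy model of one AY channel's output, using the gating formula commonly found in emulators (a disabled source reads as high, courtesy of the active-low mixer enable bits); the names are mine:

```c
/* Simplified model of one AY channel. tone/noise are the 1-bit
   source signals; the *_off flags mirror the active-low mixer
   enable bits. Per the formula commonly used in emulators, a
   disabled source reads as 1 and the two gates combine, so the
   channel emits its volume whenever the combined gate is high. */
static int ay_level(int tone, int noise, int tone_off, int noise_off,
                    int volume)
{
    int gate = (tone_off | tone) & (noise_off | noise);
    return gate ? volume : 0;
}
```

With both sources disabled the gate is stuck high, so hammering the channel's volume register from the CPU turns it into a crude 4-bit DAC — that's the PCM trick.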
... in helping Rand to secure his front-runner status for the 2016 nomination, especially with Christie's Bridgegate woes. You know, if the first name isn't enough.
One wonders whether Apple's much ballyhooed attempts to manufacture these things in the US (national pride and all that) might be causing problems as presumably they've had to tool up a brand new factory and train a brand new work force. They could very well be building a lot fewer than they'd wanted to.
I agree entirely. I think computers had a positive effect on children in the '80s because they booted into BASIC and therefore were an effective educational tool; also games weren't yet sophisticated enough routinely to swallow a large amount of any given day.
But then I ask myself: why do I think that? Yeah, it's because I was a child in the '80s. So my opinion is probably biased rubbish.
It's Der Spiegel, not De Spiegel. As in "the mirror" but at a completely different end of the market from The Mirror.
I don't think he does have a point: Apple has almost no expertise in being just one of many suppliers of anything. Everything Apple knows — and knows how to sell — is about tight vertical integration, with Apple being behind every part of the widget. Macs ship with OS X. iPhones ship with iOS. iPods link only to iTunes. Etc.
So while Apple would have some advantages in trying to sell an Android phone over Samsung, HTC, etc (for emphasis: _some_ advantages, i.e. they'd likely capture _some_ of the market) it'd be a riskier proposition than continuing as they are now, with no obvious benefit even if they prevail.
I therefore don't think it's correct for Apple to start selling Android mobiles. It'd be great for us, the market, but that's neither here nor there.
Modern OS X uses, effectively, Display PDF. PDF is the output of a PostScript program. Apple's Core Graphics is all the same primitives, fill modes, etc as PostScript without the PostScript interpreter. So the two have both solved the same problem in the same way.
This actually turns out to be a pretty good idea: that's why Apple's text looks like printed text, using classic printed fonts like Helvetica, and Microsoft have had to commission their own custom fonts like Calibri that are designed around their idiosyncratic ideas about typography just so that the aggressive hinting, lack of pair kerning, etc, won't look quite so retro.
So what else did NextStep do that's interesting?
It learnt the Xerox Smalltalk lesson — that fully object-oriented, dynamically typed languages are a great match for UI work — but adapted the language so that it's compiled, not interpreted, and can link directly to the C libraries that were otherwise industry standard. That's Objective-C. It's just as happy talking to C++ nowadays, of course. The language and the framework are why the web was first developed on NextStep, why Doom was mostly developed on NextStep, etc.
It swept aside all the nonsense with application installers by introducing the application bundle. The application doesn't just look like a single icon in the Finder, it looks like one on disk too. Dragging it to the trash genuinely is an elemental file operation, not something that someone has hacked in as a special case. (aside: RISC OS did more or less the same thing at more or less the same time, as well as the dock and a focus on proper typography; all coincidence, apparently)
It introduced the fully compositing window manager. Consider where Windows was up to and including XP: preemptive multitasking, protected memory. So it doesn't affect the wider system if an individual app hangs or flips out, right? The answer is: only if you don't care whether the screen is painted properly.
File associations are handled by metadata, not as an exercise in string matching. If I want .doc to associate with Pages by default but have a few that render incorrectly and should be opened with full-on Word, I can set those to open with Word while leaving the rest alone. This becomes a property of the file and goes wherever the file goes. It is not a hack someone added into the Finder.
It was the first graphical environment with system-wide scripting. It was designed from day one to be architecture agnostic, supporting fat binaries. It beat OS/2 to the punch on both of these things.
Beyond that the big wins are really in the frameworks themselves. Pervasive rich text, system-wide spell checking, a system-wide encrypted store for passwords, etc.
So none of those is individually a massive leap (though it depends what you compare it to; if it's only commercial competitors then Objective-C would count, as someone finally realised what Xerox had pioneered under the hood) but I'd agree that NextStep was a decade ahead in the '80s based on the combination of technologies.
Though, yeah, then they decided to price it beyond any sense and predicate the machines on a dodgy media format. I guess Jobs learnt how to price things for optimum profits by reeling in from the far end.
Reverse engineering forbidden? The right to block apps without prior notice at their sole discretion?
The Apple resolution procedure usually involves a complainant contacting Apple, then Apple contacting the developer, giving the two a chance to come to an agreement before it takes action.
So if the copyright holders had raised legal issues with Apple, Apple would have forwarded them to Elite and waited for further instructions. At that point Elite could easily "voluntarily" pull the apps.
Jobs wasn't booed by Apple employees, he was both booed and cheered — the video is easy to find — by a crowd of convention goers. Imagine the reaction Michael Foot would have received from a satellite link-up with Mrs. T at the Labour Party conference, then divide by about a hundred million. Chris Christie got a lot worse than Jobs did for embracing Obama. Though if I have to cite the Republican Party to make something else look reasonable, maybe I've already lost the argument?
Conversely, I found it to be speedy*, neither well nor poorly configured, and with a mouse so badly designed it made me want to go out and kill someone. The machine was all but unusable until third parties finally started making USB mice.
* in the same way that Pentium IIs felt at the time; not as the advertised massive leap forwards.
Can someone explain to those authors that they should complain to Apple? There's a resolution process in place for this sort of thing — they can get the apps withdrawn unless or until Wilcox starts paying them.
Judging by u-turns to date, stupid. Here's just 45 of those that the coalition made in its first 42 months in power, albeit not from the most sympathetic source: http://www.theguardian.com/politics/2013/nov/28/coalition-u-turn-list-full
Copying the Apple Store plan also seems to be working brilliantly for Microsoft. The company itself has declined to give any specifics; the only analyst's guess I can find on Google puts them at a quarter as profitable per square metre.
At least if Samsung are doing this with Carphone Warehouse we can be certain they're not going for anything particularly ambitious.
It's possible Wozniak preferred that. It's also possible he's speaking with hindsight but per e.g. http://www.businessinsider.com/steve-wozniak-thought-the-first-macintosh-was-a-lousy-computer-that-failed-2013-6 he wasn't a big fan of the original machine.
His most specific gripe: not enough memory to get anything done, leading to endless disc swapping. I guess he wasn't the only one who thought so, as the motherboard, despite using soldered RAM, contained the logic to drive 512KB (four times the 128KB it came with) and the Mac started shipping with 512KB within a year or so of launch.
What you're referring to is slightly more specific than that: it's music the user has ripped themselves, and which they don't want to use a music locker service for. You can even use Google's if you want — it's free and it works across Android, iOS and the web.
So if you bought the music through iTunes you can download it again from Apple directly onto the device. If you ripped your own library to iTunes then you picked iTunes in the first place but you can just grab the MP3/AACs and take them elsewhere if you want. If you ripped it elsewhere you can use iTunes or you can use a music locker. Google's is free.
Can you name one feature that iTunes gives a mobile device that the device doesn't inherently have in and of itself? You're obviously not thinking of iOS devices since they don't require iTunes for anything. I think the second part of your gripe may not be entirely up to date.
So it's a shame the first part remains so valid. iTunes on Windows immediately came to mind when Jobs wrote about the perils of cross-platform development in his Thoughts on Flash.
What endian problems? Third-party software hadn't been built for both platforms, obviously. Everything inside the OS worked fine.
See the documentation for the byte-order utilities — https://developer.apple.com/library/mac/documentation/CoreFoundation/Reference/CFByteOrderUtils/Reference/reference.html — search for the text "Available in OS X v10.0 and later" and you'll see it's every single function in the document. Including "CFByteOrderGetCurrent" that "Returns the byte order of the current computer." and a whole bunch of other functions that do things like "[convert] a 32-bit integer from big-endian format to the host’s native byte order." or "[convert] a 32-bit integer from little-endian format to the host’s native byte order." (all of which compile as no-ops if your host architecture is the type you describe).
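Those CF functions are thin wrappers over an unconditional byte swap plus a host-order check; here's a portable sketch of what something like CFSwapInt32BigToHost boils down to (the names are mine, not Apple's):

```c
#include <stdint.h>

/* unconditional 32-bit byte reversal */
static uint32_t swap32(uint32_t v)
{
    return (v >> 24) | ((v >> 8) & 0x0000FF00u) |
           ((v << 8) & 0x00FF0000u) | (v << 24);
}

/* runtime check analogous to CFByteOrderGetCurrent */
static int host_is_little(void)
{
    uint32_t one = 1;
    return *(const unsigned char *)&one == 1;
}

/* the shape of CFSwapInt32BigToHost: swap on little-endian hosts,
   no-op on big-endian ones -- which is why these calls compile
   away entirely when the host already matches the data */
static uint32_t big_to_host(uint32_t v)
{
    return host_is_little() ? swap32(v) : v;
}
```

Applying it twice round-trips the value on any host, which is the property that let the same source serve both PowerPC and Intel builds.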
That's the C stuff. The Objective-C classes like NSNumber required no special handling because their storage is opaque anyway.
Don't forget blingtastic "champagne".
Some have argued that the 5C is a backdoor way of bifurcating the product line. It's not unimaginable that the 5C equivalent (6C?) would get a slightly larger screen and the 5S a much larger one.
What would be surprising is if anything happens before June at the earliest, or more likely September. I expect the same people who are taking a strong interest in the next iPhone now plan to start their Christmas celebrations this year in March.
Is it because he'd probably switch Microsoft to being an OS/2 redistributor, then leave for IBM as soon as the money dried up?
> I don't use Adblock on The Reg
I didn't until they started sliding things all over the screen. Hopefully I've managed to target just that one thing.
I think the issue is more that what should be just a civil wrong was met by several government agents, apparently under the control of the MPAA, with the man being detained and questioned for an hour before they could be bothered to do even the most cursory inspection of the evidence.
Agreed in principle. What American*, no matter how frightened of terrorism, actually wants their tax dollars spent on getting Homeland Security to rough up people in cinemas? No politician of any party is going to stand up and defend this.
That said, supposing the man had started filming as security approached him then the likely outcome would have been that (i) since he has now taken video footage without permission in a cinema, obviously he's a terrorist and can go straight to jail; and (ii) his device would have been confiscated before he had a chance to send footage anywhere.
(* or anyone else, anywhere else — this just happens to be an American story)
I don't mind letting a security firm raise its profile if it helps to create the narrative that smart appliances have more negative qualities than positive.
My understanding of the legal position is this:
Per Factortame, some acts are of constitutional significance. The European Communities Act is the one that case is about but it's far from being the only one. Such acts are special because they are not subject to implicit repeal. If a later act wants to contradict a constitutional act then it has to do so explicitly.
Subsequent acts like the Human Rights Act (which incorporates the European Convention on Human Rights) have adopted some of the logic of this line of thinking: all acts are to be interpreted compatibly with the HRA unless they state explicitly that they're incompatible. Were there no recognition of the idea of a set of elevated acts, such a provision would be void since there's an underlying rule that no parliament can dictate which laws a future parliament may make or unmake.
So if a barrister could find a suitably significant act then he or she could argue that the subsequent one is not to be applied literally as written. Similarly judges could try to finagle some sort of unintended meaning out of the literal words if they really put their minds to it.
But in England and Wales courts cannot strike down legislation in any broad sense. This is something the Americans explicitly did differently as a check and balance to try better to ensure ongoing separation of powers. See e.g. the Lord Chief Justice prior to 2005 for an idea of how much the British system has ever been bothered about technical separation of powers. We like strong competing interests but have historically not been especially bothered about whether powers technically may flow from one body to another.