Re: While I was rummaging through a box...
Grab some UEFs from stairwaytohell.com then use something like uefreader.sourceforge.net to output as audio...
The 2GB RAM was the turn-off for me. If I'm basically going to be unable to use the desktop then I might as well buy an ordinary tablet — it'll almost certainly get me a much higher pixel density for the browsing, media consumption, etc, that you're pretty much limited to anyway.
And don't tell Microsoft, but other office suites are available. Including ports of OpenOffice for Android — install AndrOpen Office and you'll actually feel like you're using a Windows desktop application.
It's unclear where the $299 claim comes from; the device is $408.60 on Amazon US (before sales tax) and £339.99 on Amazon UK (with VAT). Take the VAT off the UK price and you get £283.325 (sic). Convert that to dollars and you get $463.83.
So the pre-tax difference is that UK residents pay an extra $55.23 — about £34, which is almost exactly 10% of the price, but nothing like as bad as $299 versus £350.
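For the spreadsheet-averse, here's that arithmetic spelled out as a quick Python sketch (UK VAT at 20% is fact; the exchange rate is simply backed out of the $463.83 figure above, roughly the going rate at the time):

    uk_inc_vat = 339.99
    uk_ex_vat = uk_inc_vat / 1.20             # UK VAT is 20% -> £283.325
    usd_per_gbp = 463.83 / uk_ex_vat          # ~1.637, inferred from the figures above
    us_price = 408.60
    premium_usd = uk_ex_vat * usd_per_gbp - us_price
    print(round(uk_ex_vat, 3), round(premium_usd, 2), round(premium_usd / usd_per_gbp, 2))
    # -> 283.325, 55.23, 33.74 — the extra $55.23 / ~£34 quoted above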
The equivalently-priced Surface isn't more flexible or less restricted than the iPad — all apps have to come from Microsoft's storefront with Microsoft's blessing.
Based on price, Apple's competitor to the Surface Pro is the MacBook Air, which again is pretty much exactly as flexible and unrestricted.
... that should be enough to meet the demand consumers have so far shown for smart watches.
The big iPad would be interesting if they were to merge iOS into OS X (ie, make both sets of API available) but that feels exceedingly unlikely. It also suggests I've learnt nothing whatsoever from Windows 8.
I think you've got a very peculiar definition of stolen. E.g. the BSD authors want other people to reuse their code in any way that it proves useful. They're probably very happy that Apple uses their code.
To put it another way: if Apple "stole" the mouse despite it being implemented widely by others earlier then it follows that Samsung did "steal" multitouch, etc from Apple. If what Samsung did was reasonable then Apple's use of the mouse was also reasonable. You can't have it both ways.
It is, at worst, hypocrisy rather than irony.
To be fair, Apple's claims to be an innovator are made primarily outside court. Inside court it merely establishes that it holds design patents and argues that the defendant infringed them — which the defendant usually has, because the patent system allows ridiculously broad design patents. Apple then throws in that the patents were copied maliciously, specifically to usurp the iPhone, because that works in its favour for the quantification of damages.
You can dislike the people at Apple for the smugness of their advertising, and you can dislike them for the cynicism of their legal manoeuvres, but you're disliking them for two separate reasons.
If you're anything like me you can even dislike them for those things but still rank them in the top half of the tech industry, as Apple's sins have a much more diffuse effect on the market and on individual consumers than those of the classic villains.
This is Bill & Melinda Gates' other contribution to world health.
... well-engineered and extremely popular as it was, its users' obsession with skins and customisation was the first thing that made me feel I was something different from the typical self-declared "technical" user. I just want the music to play and the thing that does that to be as invisible as possible.
It's sad to see WinAMP go though. It ushered in the modern world of music consumption.
I thought they became acceptable when enough of what consumers use computers for had moved into the browser that, with native Microsoft Office also available, the software gap no longer mattered. Apple's transition to a competent OS in the years immediately preceding helped too. The switch to Intel was just the icing on the cake.
As for BlackBerry, I have to agree with the other commenters that a difference here is that Google probably isn't going to go on an anti-competitive market fixing spree. I also don't see how it matters that much to BlackBerry if people end up thinking of their OS as just a weird version of Android — from where they are now, anything that makes money is a win.
I wouldn't read too much into it — BBC 3 is evidence that plenty of people at the BBC hate the public in general.
While I agree with you, I wouldn't rule out an attempt by Apple. I think they might use their "retina" sales tag to try to grab some cash during the great up-sell to 4K. With a built-in AppleTV and the assumption of Wi-Fi it'll play well with Apple's traditional sales pitch of simplicity — just plug it into the electricity.
Use their Hollywood clout to talk the networks into live streaming through a unified platform and they might even have a compelling sales pitch. For the Americans, whose networks they'd actually bother with, anyway.
It'll need to happen very soon though.
It used to be the Microsoft dictum that they would always prefer to purchase rather than license — e.g. that's why they foisted the Helvetica-alike Arial on the world rather than using the real thing. But I guess that was back when there was virtually unbounded room for growth. They were probably more cautious about a peripheral for a video game console with known sales figures.
With this device I can rule 13% of the smartphone-buying world!
The new iPhone launched 11 days before the end of Q3, and even then only in its first ten countries. There are a lot of reasons to think Apple's stubbornness is costing it huge amounts of marketshare but the Q3 year-on-year sales report isn't one of them.
My feeling is that Apple succeeds when it is technologically ahead. It slowly withers when it tries to rely on being cool. With the Mac they eventually found a way to make it sustainable but they've never managed to turn product lines around and reclaim dominance in a market they've previously lost. So I'm not optimistic for iOS, especially if the 5C is all we're getting as a price reduction.
I think it was more the bit where, if he had read it in the book, then it had to happen (as per the stuff with River's wrist) than the trouble of getting into NY. The book said "final farewell".
So, yeah: convoluted plot contrivances, hand waving, and brief mentions of causality.
Between this and PNaCl it sounds like Google has two broadly similar irons in the fire. Buyer beware.
BlackBerry's problem was that it wanted to ship a modern OS in a touchscreen mobile. It didn't have the technology ready. So it threw away its existing customers in order to spend a few years developing a me-too product.
Apple's problems are entirely distinct. It's comfortable servicing one segment of the market and that segment isn't growing much any more. It seems vaguely interested in other segments (per the 5C) but doesn't really seem to have much enthusiasm for pursuing them.
The two are in very different positions. It's likely that what happens to them from now on will be very different.
See also: what Windows did to Commodore and Atari versus what Windows did to Apple.
That programme stuck with me for years before I once again found out what it was — but I guess the plot line of everybody in a school being given a free computer was an easy sell to a child such as I.
I think he was just comparing Nexus 4 sales to Apple sales — one supplier versus one supplier rather than every Android supplier versus the one iOS supplier.
That said, who would really expect the Nexus 4 to outsell the iPad Mini? It's an unsubsidised phone available only via mail order. That's just not how most people buy their phones. I'll bet the Nexus amounts to a tiny segment of Android sales.
My quick search has revealed that rapidonline.com will ship overseas, but if you don't want the Pi then there seems to be no way to obtain the SD card. After postage has been added, it's about $278 for the bundle containing everything.
I would do, but Anonymous has been active for several years, so there's quite a lot of evidence about their hive mind. And since this article is about — amongst other things — one lot of them trying to change the global surveillance infrastructure by defacing a dry cleaning company, the criteria you mention would falsely exclude a lot of Anonymous members.
Or maybe you know something we don't? Maybe the dry cleaning company wasn't falsely classified based on no evidence? Maybe the Australian government does all its spy work via cleaning companies?
People they can rule out immediately:
• those with a decent grasp of the English language;
• those who have ever contributed anything constructive to society;
• anti-bullying campaigners and their sympathisers;
• anybody that thinks, on reflection, that joining a baying mob is a bad idea.
You mean the slit-scan photography, as most famously used by Who in the Tom Baker title sequence? That's actually a fairly easy idea for most computer scientists to grasp. You know how parallax scrolling works in video games? You know how by the early 1990s they were routinely doing every single scan line individually to give apparent perspective to floors and ceilings?
Imagine you created that effect visually by cutting a horizontal slice in a piece of card and suspending that a distance above a colourful 2d image. Then light the image and make everything else pitch black. Take a single frame of film by pointing the camera at the slit and moving it directly towards or away from the surface.
The effect is that the only part of the film that's exposed is the part that can see through the slit. As the camera moves, the slit moves within the frame to expose a different part of the 2d image, and you're now closer to the image so it's larger. So you get an apparent 3d transformation of the 2d image, built up with a continuous equivalent of the parallax floor.
Move the background image a little and repeat for the next frame. And again. And so on. Then you've got the pattern apparently coming towards you.
Then cut a more interesting shape than a horizontal line and add Tom Baker's face on top.
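For the programmers in the audience, here's a crude Python rendition of that whole process (all names and parameters are mine, purely illustrative; numpy assumed). Each row of the output frame samples the artwork at a slightly larger zoom than the row above — that's the slit creeping down the frame as the camera closes in — and the artwork drifts a little between frames:

    import numpy as np

    def slit_scan_frame(source, frame_index, rows=240, cols=320, speed=0.004, drift=3):
        # Row r is exposed with the camera closer than row r-1 was, so the
        # zoom grows down the frame; the artwork drifts between frames.
        h, w = source.shape[:2]
        frame = np.empty((rows, cols) + source.shape[2:], dtype=source.dtype)
        for r in range(rows):
            zoom = 1.0 + speed * r                      # closer camera => larger image
            xs = (np.arange(cols) - cols / 2) / zoom + w / 2 + frame_index * drift
            ys = (r - rows / 2) / zoom + h / 2
            frame[r] = source[int(ys) % h, xs.astype(int) % w]  # wrap to keep it endless
        return frame

    # e.g. run it over a tiling pattern and you get the tunnel effect:
    pattern = ((np.indices((64, 64)).sum(axis=0) % 16) * 16).astype(np.uint8)
    clip = [slit_scan_frame(pattern, i) for i in range(50)]

Cut the slit into a more interesting shape and you'd vary which source rows each output row samples rather than marching straight down the frame.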
Android allows both Dalvik and native apps to coexist as first-class citizens; Dalvik is oriented towards a completely independent implementation of Java. Android sits upon the Linux kernel but in modern terms the kernel is just a tiny fragment of an operating system. The libraries, widgets, etc are just as much what people think of when they think of an OS.
This story has one upside, at least.
Some sort of side-view real time strategy game, I think, clearly for the BBC Micro.
You should have a word with Microsoft. They seem so sure that the Intel-bearing Surface can't be made cheaply that they endured a US$900m write-down halfheartedly trying to push an ARM version.
The Surface Pro 128GB list price was £900; presumably the Haswell-powered Pro 2 will be the same. Chuck Apple an extra £200 for the entry level 13" Retina MacBook Pro and they'll give you a larger, higher resolution screen and a 50% faster CPU.
They don't say that. Only an AC could make such a ridiculous straw man argument, etc, etc, etc.
I attended university at the turn of the millennium; I was a young child during the '80s and packed my teenage years entirely into the '90s.
My experience, shortened to the interesting bits: I received an obsolete micro from the classifieds somewhere in the early '90s. Left to figure things out on my own — by that stage the machine had no magazines or commercial support — I achieved some things I'm very proud of but was remarkably naive in other areas.
In the late '90s we got a PC and the Internet. So suddenly I had access to unending reams of documentation and properly technical people to discuss things with. My abilities took a huge leap forward. I progressed much faster than I probably would have if I'd continued in independent study or muddling through with a single book or two.
As a result, just as others above think the most educational environment was having limited choices and needing to figure everything out for themselves, I think the most educational environment was taking a bit of time to get the absolute fundamentals down and then being exposed to the breadth of everything available. Probably people a decade younger than me think the best way to learn is to be dropped in immediately amongst the breadth.
It'll be interesting to see what the second article advocates but too many of the commenters seem to be confusing causation and correlation so as to jump from perceiving an experience to be common to suggesting that it's a good idea.
It runs slightly contrary to the rose-tinted nostalgia of some of the other posters, but many of the worst coders I've worked with are those who start with incremental coding and debugging and proceed to the conclusion that the correct way to figure out how libraries work is by empirical investigation. Reading documentation just takes time, right? And if nobody's going to read it, why write it in the first place?
If I were asked to come up with a related rant immediately it would be about people who think that date handling is easy, so they wrote it all themselves, and mysteriously enough their code gets the length of a day wrong twice a year. But that's okay because their 200 lines of date handling "was a quicker solution" than five lines of API calls that would require you actually to have learnt about what's already provided. If you wanted a rant tomorrow? Probably something else.
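Since you asked for it today, here's that rant as a minimal Python sketch (date and timezone picked by me purely to land on a DST change):

    from datetime import datetime, timedelta
    from zoneinfo import ZoneInfo

    tz = ZoneInfo("Europe/London")
    before = datetime(2024, 3, 30, 12, 0, tzinfo=tz)   # clocks go forward that night

    # the few-lines-of-API-calls version: calendar-aware, correct
    same_time_next_day = before + timedelta(days=1)
    print(same_time_next_day)              # 2024-03-31 12:00:00+01:00
    print(same_time_next_day - before)     # 23:00:00 — that "day" was 23 hours long

    # the hand-rolled "a day is 86,400 seconds" version: an hour adrift
    hand_rolled = datetime.fromtimestamp(before.timestamp() + 86400, tz)
    print(hand_rolled)                     # 2024-03-31 13:00:00+01:00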
If memory serves, the Ark set is used to bookend the series — the Cyberman story at the end returns to and reuses parts of it. That was presumably some sort of budgetary jiu-jitsu?
In this case they didn't — it's a standard buffer overflow attack.
But if you're asking who would? Well, early-1990s Microsoft did in WMF. They closed that hole back in 2006 but you should be able to find ample reporting from then.
I have a Western Digital 'My Passport' USB 3 1TB device. It's just for Time Machine so I plug it in and ignore it. Everything is working without issue.
Naturally I haven't installed any of WD's custom software because I'm about as enthusiastic about that as I am about consumer printer drivers — one just assumes it'll inexplicably install a 2GB boot-time kernel extension in order to be able to say in a booming voice "You have 20GB of space left; visit www.westerndigital.com to buy another hard disk" every ten minutes. And probably require a network connection.
The shell companies comment makes it pretty clear who AC2 thinks is most culpable. It's also unnecessary since we know exactly which company is taking legal action and which other companies are behind it.
Of the various company pairs:
• Nokia was first to sue Apple;
• Apple was first to sue Samsung and HTC;
• Motorola was first to sue Apple;
• Microsoft extracts royalty payments from all of those except Apple, and has made FRAND claims against Motorola (leading to a countersuit, in layman's terms at least);
• Samsung was first to sue Ericsson;
• Sony's main involvement has been via a holding company with Nokia, which has sued Apple;
• Google has been involved primarily through support for HTC in a countersuit and, of course, through its acquisition of Motorola;
• and BlackBerry (/RIM) appears to have suffered only at the hands of companies other than all of those.
Nobody's hands are clean.
@AC1: that's bound to be what he's reacting to, and he's right. The market would be much more advanced if the focus was exclusively on how the things consumers like can be made even better, not who has the best legal protection on those things. Pleasing consumers should be paramount in a functioning market but the way that inventor/creator protections have been so massively distorted tends to subjugate them somewhat.
@AC2: the Rockstar Consortium is who is suing. Apple deserves to be attributed with responsibility for some of that. So do Microsoft, BlackBerry, Sony and Ericsson.
The iPad 3 came out last March with a big press event and lots of publicity about its 'retina' screen.
The iPad 4 came out eight months later in November, having been mentioned almost in passing at the iPhone launch with "oh, and we've added the new connector to the iPad too".
The iPad Air has had the big keynote treatment again.
So, yeah, 4 versus Air is technically accurate if you're asking about Apple's year-on-year health, but 3 versus Air would be more realistic, I think. Otherwise this is more like trying to determine how much Apple benefits from its usual method of product introduction.
Blaming Apple for anything here is absurd — video games consoles have been a walled garden since the NES's lockout chip.
I think the difference is that BlackBerry were overtaken by technology — everyone else's products were much better while they ran around buying QNX and desperately trying to get BlackBerry 10 ready. The same isn't true of Apple. A majority of people prefer alternatives and there are a bunch of things Apple consciously disallows but you don't get the sense that they're technologically behind. They're just control freaks with very specific ideas of what a device should be.
But have you heard about the really amazing new bread they're getting in next Wednesday?
The Apple shop employees are paid to clap and cheer like morons at every product launch. They'll even insist on giving you high fives on your way, though they are at least doing it in order to provide themselves with shelter, warmth, food, etc, rather than because they honestly think that shaving 180g off an already successful product is really exciting.
I guess the issue on Apple's side, when launching modest annual evolutions, is whether to risk a damp squib of a launch event like this, or to forego one for the first time since the iPad was announced. Damp squib it is.
It was set in San Francisco — specifically all of those flat parts of it that look spookily like Vancouver.
Per http://www.w3counter.com/globalstats.php the global web browsing market share for every version of iOS in the top ten (ie, 6 and 7) is 8.64%. The equivalent share for Android (which includes versions 2 and 4) is 5.22%.
From that we can conclude that, for web browsing, iOS is used about 165% as often as Android. But that doesn't differentiate tablets from phones, so the standard comments about pay-as-you-go contracts apply — because Android devices are more affordable, they're much more likely to be bought by people who don't intend to spend a lot on data. That doesn't mean they're not being used for Temple Run.
And to put the whole thing in perspective, Windows (XP + 7 + Vista) is used about 745% as often as iOS.
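A quick sanity check on those ratios, using only the shares quoted above from the w3counter page:

    ios, android = 8.64, 5.22
    print(round(100 * ios / android))   # 166 — the "about 165%" figure
    # and the 745% Windows comparison implies a combined XP + 7 + Vista share of:
    print(round(7.45 * ios, 1))         # 64.4 (%)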
Apple of the 1990s is why Apple no longer thinks it is safe to try to fence off one corner of one market.
Given that the company is very good at product launches, it has usually followed a strategy of expanding into neighbouring markets rather than expanding its range within a market. From computer to MP3 player, from MP3 player to phone, from phone to tablet. It probably helps that it's so bad at the other idea: witness the cheap Mac (starting at only US$599!) and the cheap iPhone (only US$99 on a contract!).
They'll probably be healthy with 10% of the high end of three markets. If they want to expand they'll find some other market to try to muscle into.
H.264 is getting on a bit and if the MPEG formats remain the industry standards then the MPEG members are more likely to make some serious money on HEVC/H.265.
Cisco makes a lot of money from telepresence equipment, including software and the physical stuff you kit your boardroom out with. If it's automatically compatible with Android and iOS and everything else that comes with a browser but has no plug-in architecture at all, or for which maintaining a plug-in would be a major hassle, then that's a big plus.
My 4S was the same, though if memory serves it could do only barely more than a day when new, so battery decline isn't so much of an issue as the battery life being bad in the first place — albeit along with most of the rest of the industry.
I've considered all the possibilities and I think: we'd have missed out on Clippy.
Snide remarks aside, respects to Bill Lowe. How many of us can claim to be so sure in our belief about where the future lies as to overcome the inertia of a company like IBM and shape it ourselves?
There's no evidence whatsoever to suggest that "The decision to remove CD drives is also cost driven". It further strikes me as at least an order of magnitude more likely that the decision was motivated by the desire to make the machines thinner and lighter (it's one of Apple's favourite boasts) and to boost the battery life (both by eliminating the last moving part and by making more space available for those custom-shaped batteries they love to glue in).
There's a difference in evaluation between a premium product and a budget product. Whether you think Apple produces a premium product or not is immaterial — some do, and they believe they're paying for (i) the hardware; (ii) the design; and (iii) to enter or remain in the ecosystem. They don't evaluate on component mark-ups alone because they believe that (ii) costs money and (iii) adds value beyond the components.
So they ask themselves "do I want the machine with the 5 megapixel screen that runs below 20 decibels for 12 hours between charges and weighs less than 2kg?" not "which machine is the cheapest for the performance level I've deemed adequate?"
If you think most people don't agree that Macs are worth the money then you're right. That's why at least 85% of buyers pick something else, even according to the most optimistic estimate of marketshare that Google could find me.
The new MacBook Pros are cheaper than the old. The 15" isn't a like-for-like comparison because the new entry level omits the discrete GPU where the old didn't, but last year's 13" was $1,499 at the time of discontinuation and the new one is $1,299. That's a bit more than 13% cheaper.
I also think we're already several years past the point where people who like to upgrade their computers would consider a Mac.