Re: Another application?
Morse code? That doesn't even have the prefix property. Maybe a Huffman coding on the alphabet with frequencies dictated by your language?
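To illustrate what I mean, here's a minimal Huffman sketch in Python. The letter frequencies are made-up illustrative values; a real code would be driven by corpus statistics for your language.

```python
import heapq

def huffman_codes(freqs):
    """Build a prefix-free code from a {symbol: weight} map."""
    # Heap entries: (total weight, tiebreak id, {symbol: code so far}).
    # The tiebreak id stops heapq ever comparing the dicts.
    heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        w1, _, left = heapq.heappop(heap)
        w2, _, right = heapq.heappop(heap)
        # Merging two subtrees prepends 0 to one side's codes, 1 to the other's.
        merged = {s: "0" + c for s, c in left.items()}
        merged.update({s: "1" + c for s, c in right.items()})
        heapq.heappush(heap, (w1 + w2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

# Illustrative frequencies only; note how rare letters get long codes.
print(huffman_codes({"e": 12.7, "t": 9.1, "a": 8.2, "o": 7.5, "q": 0.1}))
```

The point being that the result has the prefix property by construction, which Morse only gets by cheating with inter-letter gaps.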
That's what the MagicCode product mentioned in the article deals with — executing ARM code on MIPS. So they've got the native code problem solved.
Apparently not obviously enough for you.
DrStrangeLug's comment parodies the way that Apple often acts and/or is treated as though it invented things like the MP3 player, smartphone, etc, rather than merely launching commercially and critically successful versions. JDX's comments parody the image of an Apple fan as genuinely believing the invention myth, and of Apple announcing each new iteration of a product as revolutionary when it's merely a minor step forwards.
The clues were: DrStrangeLug's abstract description of a desktop computer, and JDX's singling out of 1998's iMac — the first big Jobs launch that did a lot to create the modern Apple — as the original computer.
For extra amusement, lurker appears to think JDX is being serious and, apparently, so do you. I further found it amusing that, given the level of debate around here, it's unclear whether the down votes are from anti-Apple people not getting the joke or pro-Apple people taking things too reverently.
I haven't specifically checked the schedules but there are probably some sitcoms on BBC3/CBS (delete as per your side of the Atlantic) this week that will cater to those who need this level of explanation.
What do we think on the down votes here? Is it the blindly pro- or the blindly anti-Apple failing to get the joke?
The lens flare wasn't fake, at least in the sense of being added on later: it was created by the actual set shot through the actual lenses.
... which should probably teach us that the difference between something being immersive and being distracting is not solely related to the extent to which digital tools were used.
If you'd used FUNC+K then think how much time you could have saved!
It was always Starship Command for me, and some Repton 2. Oh, and Gauntlet, the Defender clone.
The Slogger has a really neat trick — if the program counter goes beyond D000 (if memory serves) then writes automatically go to the video page. Otherwise they go to the shadow page. Why? Because that's where the ROM routines for graphics output are, and the original machine ROM is used unmodified.
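A toy model of the routing rule as I remember it; both the D000 threshold and the screen base below are from memory, so treat the constants as illustrative.

```python
def route_write(pc, addr):
    """Toy model of the Slogger board's write routing.

    Writes to screen memory made while the CPU is executing from the
    OS ROM (program counter at or above 0xD000, if memory serves) land
    in the real video page; writes from user code land in the shadow
    page instead.
    """
    SCREEN_BASE = 0x3000  # illustrative; the real base is mode dependent
    if addr < SCREEN_BASE:
        return "ordinary RAM"
    return "video page" if pc >= 0xD000 else "shadow page"

# A ROM plot routine writing to the screen hits real video memory...
print(route_write(pc=0xD123, addr=0x5800))  # video page
# ...while user code writing the same address hits the shadow page.
print(route_write(pc=0x2000, addr=0x5800))  # shadow page
```

Which is exactly why the unmodified ROM still works: its graphics routines all execute from above the threshold.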
You take "Apple is preparing to train its shop staff to use Microsoft Windows in a bid to flog more Macs to business users reared on PCs" to mean "users who don't want to use OS X"? That's like saying that if shop staff are trained to demonstrate a browser then that means they're targeting customers who don't want to use desktop applications. It's a false dichotomy.
VirtualBox is available and just as good as elsewhere. Giving everyone the benefit of the doubt and assuming all decisions were made purely on merit, I imagine that Apple might prefer to push Parallels because it does a lot of work tightly integrating the two environments. For example, file associations work in both directions between the Finder and the Windows desktop, the Start menu is added to the dock, your Windows applications appear in /Applications and hence in Launchpad, etc.
It's actually why I stopped using Parallels and started using VirtualBox. I needed Windows for a particular application only and otherwise to be contained in its box — I actively didn't want Parallels to start messing about with my whole system.
There's really no basis to conclude that Apple is "trying to sell systems to people that don't want to use their operating system". Much more likely they're trying to sell systems to people that do want to use the Apple operating system but falsely believe that they can't because a vital piece of software isn't available in a native port.
Otherwise they'd be demoing Boot Camp rather than Parallels.
I've seen architects with Macs, but they've all been running Boot Camp. I suspect Autodesk noticed the same thing, which is presumably why AutoCAD is available in a Mac-native port again after a nearly two-decade absence.
This movie hasn't sold. Deadline's summary of the weekend US box office results was: "Oprah’s PR Blitz Helps ‘The Butler’ Open #1 With $25M: Soft Box Office As ‘Kick Ass 2′ Falls, ‘Jobs’ Biopic Dies, ‘Paranoia’ Bombs"
See the original — http://www.deadline.com/2013/08/did-oprah-affect-box-office-the-butler-tops-box-office-friday/ — for the full text, but highlights are:
Flopping in wide release (2,381 theaters) was ... Jobs [which] came in only #7 with a meager $6.7M despite a plethora of TV ad buys. ... Rotten Tomatoes critics only gave it 24% positive reviews because of its superficial made-for-TV depiction of a complex creative and business icon. Still it’s surprising how many Apple devotees stayed away despite the marketing’s psychographic targeting to them.
If they'll be shipping it with only 1GB of RAM, and given that iOS doesn't page process memory out to disk, why would Apple want to transition? The only use under iOS as currently designed would be to allow larger sections of the disk to be memory mapped (a virtual-memory use Apple does permit), which is not exactly a limitation developers often run up against.
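To be concrete about that virtual-memory use: mapping a file means addressing its contents directly, and the total you can map is bounded by the process's address space, which is the bit 64-bit would relax. A minimal sketch of the concept, in Python rather than anything iOS-specific; "data.bin" is a made-up file name.

```python
import mmap

# Map a whole file into the address space and read it like memory.
# A 32-bit process can map only what fits in its roughly 4GB address
# space (much less in practice); 64-bit removes that ceiling.
with open("data.bin", "rb") as f:
    with mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as m:
        header = m[:16]           # pages are faulted in as touched,
        middle = m[len(m) // 2]   # not read up front
```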
I could understand it if the risk were ignoring the next step until it's too late but the 64-bit ARM architecture is ready to go and Apple controls both the tools and the channel of distribution so it can force very quick changes in those.
As Apple doesn't usually implement technology until there's a pressing business need to do so, this rumour doesn't sound all that likely to me.
While it's possible Apple's next release will shake things up, after years of Intel's next processor always being the one to unseat ARM, the next Windows Phone being the one to storm the market, next year being the year of the Linux desktop, etc, I've come to the conclusion that tech forum commenters have a very skewed idea of how quickly these things change.
General trends will probably continue.
It'd be wrongful too, given that we give employees a right to notice. So it wouldn't happen in the UK for at least two reasons.
On the contrary, quite a lot of the fundamental employment legislation — e.g. the Equal Pay Act 1970 — was brought in only either to meet EU obligations or to bring UK law into line with European expectations with an eye towards future membership.
The way EU law works in the UK is that it has to be explicitly enacted by our parliament. Parliament has an accelerated process for a lot of the piecemeal regulations, but whenever something big comes along it often writes a full-blown domestic bill for it.
They're very similar to the colours of many cases. And they're made out of plastic so that you don't have to surround your anodised aluminium in a plastic case.
I own an iPad 3 and a new Nexus 7.
The 7's user experience still isn't quite as smooth as the iPad's, for technical reasons that are fairly easy to understand: iOS has Core Animation, which handles 60fps user-interface transitions in a separate thread with all the tricky main-thread interaction handled for you, and has essentially programmatic layout where writers need to optimise for only three screen layouts. Android layout is declarative and people can't reasonably optimise for every device out there.
If there's a reason the iPads feel more responsive then it's probably that, but it's a very shallow impression. If I actually do things like time how long it takes an application to launch, how long it takes a web page to appear, etc, then the Nexus wins — and it does so for half the price.
Are you Michael Dell?
I'm curious about this too, but with the caveat that the PowerPC architecture obviously has some advantages for some demographics — the first-generation Xbox was x86-based but the 360 switched to PowerPC, even at the cost of Microsoft having to buy a company to supply emulation software for backwards compatibility. Similarly you'll find PowerPC-derived parts in the PS3 and every Nintendo since the GameCube.
Then again, both Sony and Microsoft have switched to essentially the same x86 architecture for the next generation so maybe that demonstrates nothing.
Some Android titles are built using the native development kit, which builds directly for the ARM and whatever else the developer chooses. If an app is compiled for native execution then usually it's a game — it's especially likely with ported titles.
Besides customisability, amongst ARM's other strengths are code density (in thumb mode, anyway) and power efficiency.
I guess in England & Wales they could try to argue passing off — that the second company is trying to acquire goodwill by naming their product similarly to the first company's, and in so doing is damaging the first company's standing. But it requires a proper misrepresentation whereby Bang With Friends would actually be meant to be perceived as a Zynga product rather than just having a bit of fun with the name so, yeah, it feels unlikely.
Per GeekBench numbers the iPads really divide into three groups:
• the iPad 1, with a score of 454;
• the iPad 2, iPad 3 and iPad Mini, with scores of 781, 791 and 745 respectively; and
• the iPad 4, with a score of 1756.
So if you bought an iPad 2 then you've got a device a shade faster than one of the current generation models. Naturally app makers are therefore still going to optimise for that level of performance.
Conversely, if you went out and bought a Motorola Xoom or one of the other early Android tablets then not only did your specific device fail to capture enough of the market to be interesting in itself, but also the devices that have powered the subsequent explosion in Android share are noticeably faster (yes, even the Kindle Fire). So app makers are unlikely to spend much time optimising for that level of performance.
Did you two read the same article as I did? The third paragraph says:
"... 'pads, including white box devices, grew 43 per cent to 51.7 million standalone products."
The overall tablet market is very healthy; the story is simply that Apple's shipments are down 14% and that in a quickly inflating market that has translated to a drop in market share of almost 20 points. So there's a bit of a changing of the guard going on, with various flavours of Android (ie, I'm counting Amazon's fork) in the ascendant.
I would imagine Samsung's impact on Apple is very limited compared to that of Asus/Google and Amazon.
Depends how they implement it; the (in my personal experience, quite frequent) home button failures are to do with moving parts whereas a fingerprint scanner — at least the sort you usually see on laptops — doesn't have any moving parts. So I guess it depends on what's going to happen to the button overall.
I don't think they're selling very well at all. The Apple third quarter results press release (http://www.apple.com/pr/library/2013/07/23Apple-Reports-Third-Quarter-Results.html) says this:
The Company sold 31.2 million iPhones, a record for the June quarter, compared to 26 million in the year-ago quarter. Apple also sold 14.6 million iPads during the quarter, compared to 17 million in the year-ago quarter. The Company sold 3.8 million Macs, compared to 4 million in the year-ago quarter.
The Apple TV isn't mentioned anywhere in the press release summary; one can therefore assume it's not significant to the larger report.
... and as someone aware of OS/2 and exposed to the standard array of Acorn machines that filled UK schools in the late 80s through to the early 90s, I am amused to see Amiga owners moaning that they had pre-emptive multitasking a decade before 90% of the market given that they didn't have (i) built-in buttons, menus or any other widgets; (ii) protected memory; or (iii) any significant inter-app communications.
It was a diverse mix back then — no single system ticked every box.
I think this is Google's attempt to sweep away DLNA and AirPlay in favour of building programmability into the receiver based on emerging web standards. Assuming Google's "everything must be approved by us before deployment" stance is temporary, that will remove the gatekeeper from the process while allowing parties with content to control development and deployment cycles.
Although a lot of TVs have very similar programmability built in, few of them are mutually compatible and most of them have an app store in the middle. If you're Amazon or whomever, you can't just write your mobile client and your web app and have the former instruct the TV to move to the latter.
I think Nintendo's problem is that its brand is perceived as amusement for normal people. That helped a lot when the only competition was Sony and Microsoft's ever-better-visualised games in which army men tell you to walk to a certain place. Now, though, Nintendo is competing with Google and Apple, and the target Nintendo consumer no longer thinks games are worth £40.
Ummm, the Macintosh was introduced and sold very quickly, then sold very slowly. Tommy Cooper died live on stage. The GCSEs were introduced. If just 1,900 voters had felt differently, Reagan would have been the first candidate to carry every single US state (though Nixon's reelection was more successful if you discount the electoral college). Frankie Goes to Hollywood were massive. Jet Set Willy was released.
Since coming to the US I've been required to submit fingerprints: (i) in order to apply for a visa; (ii) every time I wish to enter the country; and (iii) in order to get a driving licence. The last one applies equally to US citizens, and is the general norm for official agencies.
In my native UK I've been required to submit fingerprints: never.
So I suspect Americans have already come to terms with giving their fingerprints away.
I think this may be what is happening:
When the MacBook has better battery life, people who want better battery life buy it. They say they like it because it has better battery life.
When the MacBook is lighter, people who want a lighter laptop buy it. They say it is better because it is lighter.
When people imagine there's some sort of pro-Apple cheerleading-squad conspiracy, their minds just assume that any pro-Apple comment must have an ulterior motive.
Per CNET, Apple wanted $1/port circa 1999. Licensing is now handled by everyone's other favourite, the MPEG LA, seemingly for more like $0.25/device but I can't find any confirmation of when it was handed over and therefore when prices became more reasonable.
AnandTech reports that there's no per-port fee for Thunderbolt so it looks like the cost is just whatever the actual chips cost from Intel.
So a fairly obvious prediction: Thunderbolt will take off if and when Intel incorporates it into whatever other silicon so that it's effectively free if you buy whatever the modern equivalent of a Centrino is.
No, I stand corrected. Based on a component costing over at Tech Insights, the key physical components in an iPhone 5 are:
• the A6 processor, probably fabricated by TSMC or Samsung;
• RAM from Elpida;
• antenna switch and wifi/bluetooth wireless chips from Murata (incorporating a Broadcom component);
• a Triquint power amplifier;
• an Avago chip for LTE/UMTS/CDMA;
• four separate parts from Skyworks for power amplification;
• a Dialog Semiconductor power manager;
• a Cirrus Logic audio codec;
• accelerometer and gyroscope from STMicroelectronics;
• further Broadcom components for the touchscreen controller;
• Qualcomm chips for further LTE stuff, DC-HSPA+, EV-DO, GSM, CDMA, etc and power management thereof;
• a Texas Instruments touchscreen controller;
• SanDisk flash; and
• an RF Micro Devices antenna.
How many of those companies are suing Apple, or being sued by it?
"Most of the technology" was supplied by FingerWorks, ARM and PowerVR.
I'm aware of the precedent — people completely failed to sell any tablet computers for more than a decade before that market really opened up — but the Sony smart watches are mature, inexpensive at around $100, and they work with Android phones so they do email alerts, Twitter, Facebook, whatever else your phone does. Yet nobody is buying them.
KHTML: began 1998
WebKit: forked from KHTML, 2002
Blink: forked from WebKit, 2013, more cleanly to incorporate patches accumulated since 2008
I make that four years before Apple, six further years before Google, five years since Google. That'll do for dividing credit, surely?
As mentioned above, it's unclear what proportion of the market the sub-$300 bracket accounts for, and therefore not technically clear that Chrome OS is gaining traction. It could easily be that there were basically no sub-$300 laptops and now there are some, but most of those on offer run Chrome OS.
That being said, if you couple this with the statistic that more than 90% of laptops sold for more than $1,000 are Macs, it does start to look like Microsoft may be about to suffer a bit of a pincer movement.
Based on the unwieldy 808 I expected the phone to fall uncomfortably between two stools — lacking both the convenience of any other phone and the image quality of a full camera*. It seems they've resolved the first issue. Good work, Nokia.
(*) having been asked to cite sources on my previous assertion, here's one: http://www.dxomark.com/index.php/Mobiles/Column-right/Mobile-rating — "However, the PureView’s small exposure troubles in high contrast images, and its very slight color casting and color shading flaws in outdoor settings illustrate that the smartphone is not perfect – cellphone and tablet cameras still have a lot to accomplish to meet the quality of the best DSCs."
I own a flat in London, Zone 2. BT still isn't able to supply fibre-to-the-cabinet to it. I therefore do not advocate that just moving to the south east will make you important in their eyes.
"What's the correct phone shape? A 5.5" Galaxy Note?"
Candybars seem to be the preferred shape. Candybars with bits poking out not so much.
DPReview, DxOMark, pretty much all the objective reviews. It's the best camera-in-a-phone on the market by a healthy margin but it was found lacking compared to modern compact cameras. It's a small sensor and a tiny lens. Nokia can't beat physics.
"First you're saying it's not as good, now you're saying it is better but no one cares - which is it?"
You failed to comprehend the point I attempted to make. It's worse than a dedicated camera. It's better than other cameraphones. But it's less convenient than other cameraphones, and the lesson of MP3s is that convenience trumps quality.
But! They've paid all that money to be able to print 'Carl Zeiss' on their lenses! So that makes them better, right?
You don't gain large files; the large number of megapixels is processed down to an ordinary-resolution output. So what the device primarily gains over an ordinary 5MP sensor is colour resolution, for better post-processing.
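As a sketch of the oversampling idea (numpy, with made-up dimensions; the real pipeline is far more sophisticated):

```python
import numpy as np

# Toy oversampling: average each 3x3 block of sensor pixels down to one
# output pixel. Noise partly cancels across the block, so each output
# pixel is cleaner than any single input pixel would have been.
sensor = np.random.rand(3000, 3000)   # stand-in for a big noisy sensor
h, w = sensor.shape
binned = sensor.reshape(h // 3, 3, w // 3, 3).mean(axis=(1, 3))
print(binned.shape)                   # (1000, 1000): ordinary output size
```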
I guess it's easier to make photocells more sensitive (as is necessary to make them smaller because they'll receive less light) than to give them a greater overall range?
So it's not really conveniently shaped as a phone and its photos aren't as good as a real camera's? I can't help feeling this is like producing an MP3 player that can output 400kHz audio or something like that — yes, most people could tell the difference in a blind test but, no, nobody is sufficiently bothered about it to put up with the inconvenience.
So I guess it's just supposed to help attach better-than-usual photography to the Nokia brand and not actually to sell?
This is a risk with all lithium-ion batteries; Apple uses the same chemistry as everyone else.
Assuming external chargers are available then technically this is a reason to pick the Samsung over the Apple — the S3 (and indeed the S4) has a removable battery, so you could keep the phone next to your bed in case of emergencies on one battery while the other charges in some other part of the house. Charging tends to heat batteries and is therefore an occasion when the risk of combustion very slightly increases.
What are you talking about? Apple is clearly taking over... all of BlackBerry's former marketshare.
"Um, unless you have a number of cameras arranged spherically around a given point of interest capturing the subject simultaneously, the image captured - whether using a lightfield camera or not - will always be from the point of view of the camera."
An ideal light field camera captures every ray of light hitting a surface as direction + colour. An ideal screen would reproduce those rays. So the hypothetical reproduced image is the same as looking through a window at the original scene.
I don't know about you, but I would consider the view through my window to be 3d, even though it doesn't project anything into my room and I can't move it.
Lightfield displays* have been invented for true 3d images; they're just beyond realistic mass-market manufacture and there's no content whatsoever.
* if implemented in an artificially ideal capacity they would capture every ray of light that hits a plane — recorded as colour plus direction — and reproduce those rays later. So it's the rays of light that are reproduced, not a projection (or two projections) of them; you're free to focus for yourself, to tilt your head however you want, etc. They achieve that with an array of micro-lenses in front of an image plane, so at each point that would be a single pixel on a traditional screen there is instead a lens, and which of the inner pixels you see depends on the angle between your eye and the lens. But that means multiplying up the panel resolution by a very large number and then producing and aligning the micro-lenses with sufficient quality.
The Lytro camera does this in reverse, which is why its photos can be refocussed after the fact and also why they're so low resolution compared to the output of ordinary 2d sensors.
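For anyone curious how that works: after-the-fact refocusing is essentially shift-and-add across the directional samples. A crude numpy sketch with toy dimensions; the linear shift model and the names are mine, not Lytro's.

```python
import numpy as np

def refocus(lightfield, alpha):
    """Shift-and-add refocus of a 4D light field.

    lightfield has shape (U, V, H, W): a U x V grid of sub-aperture
    views, each an H x W image. Shifting each view in proportion to its
    (u, v) offset and averaging simulates focusing at a different depth.
    """
    U, V, H, W = lightfield.shape
    out = np.zeros((H, W))
    for u in range(U):
        for v in range(V):
            du = int(round(alpha * (u - U // 2)))
            dv = int(round(alpha * (v - V // 2)))
            out += np.roll(lightfield[u, v], (du, dv), axis=(0, 1))
    return out / (U * V)

# Toy 5 x 5 grid of 64 x 64 views; alpha selects the synthetic focal plane.
lf = np.random.rand(5, 5, 64, 64)
print(refocus(lf, alpha=1.5).shape)  # (64, 64)
```

Note the trade the sketch makes explicit: all those directional samples come out of one sensor, which is why the final 2d output is so much lower resolution than the sensor itself.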
On the contrary, it confirms that playground name calling isn't always well received. Especially when you have to shoehorn in the topic.