> Personal goodliness is not correlative with software development ability.
It's not a 'must have' feature, but I can see it being useful for music and graphics applications. Still, the secure CPU side of it is interesting, especially in light of Intel's security flaws and hidden 'management engines'.
It seems that Apple have been canny or lucky in not trusting Intel with users' biometric data:
And one assumes that those vendors with capable iPad apps won't have much trouble rolling out an ARM Macbook application - it'd largely just be changes to the UI.
If you read AO's sentence in context, you'll see that is what he meant and what he wrote.
In any case, creativity stems from waste and redundancy - it's more fun to have an excess of processor power than vice versa!
Genuine questions for a more technical bear than me:
Is it feasible to have OSX run on ARM but offload some tasks to an x86 processor as required? Or:
Is it feasible for OSX to run on ARM, and then Xwindows (or whatever the equivalent is) into an x86 instance of OSX that is spun up when required?
Either way, the ARM chip is primary, but the computer contains an x86 chip to be used occasionally.
It seems to me that the applications that people use 80% of the time (Safari, email, LightRoom etc) are those that can be quickly compiled for ARM by Apple or others, leaving the x86 chip for legacy applications on occasion.
It's a question of how to roll it out to the public. You could imagine an ARM Macbook and x86 MacBook Pro range, for example... or maybe not. I really don't know. Apple like to keep their message as simple as possible.
> Some famous engineer is reported to have told his disciples: "Simplify, and add lightness". Damn right. That's the kind of rethink I'd love to see.
Unfortunately, he didn't add safety - a few of his racing drivers died in crashes.
When Steve Jobs returned to Apple and killed off the clones, he wanted to make an exception for Sony VAIO computers, though in the end Sony were too far down the path of building x86 Windows machines. The first x86 builds of OSX were demonstrated to Apple management on VAIO laptops, and it only took a small team a few days - but then NeXTstep was always designed to run on a variety of CPUs.
The Touchbar on the MacBook Pro has been made to run Doom*... it's not Crysis but ya gotta start somewhere!
* What hasn't been made to run Doom?
The hybrid Intel AMD chips are possible because of:
> Intel announced its EMIB technology over the last twelve months, with the core theme being the ability to put multiple and different silicon dies onto the same package at a much higher bandwidth than a standard multi-chip package but at a much lower cost than using a silicon interposer.
No technical reason Intel can't combine an Intel CPU with an Apple GPU, if that's something Apple want.
And of course OSX was made from NeXTstep, which itself was designed to be ported across CPU architectures.
Oreo is more modular than previous versions, in theory allowing updates to be created without ODM binary blobs, so it should hopefully result in updates being supplied for longer.
See: Project Treble
If you don't think there is a problem, then why the hell do you think Google have tried so hard to rectify it, with Silver edition phones, the Nexus and Pixel lines, and the long overdue modularity of Project Treble in Android Oreo?
I'm not going by what I've read - the inherent architecture of Android that has traditionally made updating reliant on a multi-partner process of binary blobs and roll-outs means that I have Android tablets stuck on versions (3.x, 4.2) that demonstrably do not run the apps I want them to.
For an illustration of a phone OS that had *many* but not *all* apps available to it, look no further than Windows Phone. Even just the chance that your new phone might not run [favourite niche app] might put a prospective buyer off.
Samsung do have their own phone OS, but they don't have full replacements for all Google's services. Samsung phones ship with their own version of many Google apps - eg a Samsung email client, map app, app store etc. - which gave the impression they were at least considering dumping Android as a contingency plan (or using the threat of doing so as a bargaining chip).
Why would he be - he's not responsible for the situation.
The creator of Apple's Swift language went to Google a little while back.
Normally yeah, but in this case Rama has evidently decided we're not worth braking for!
Google struggle to get their more recent Android versions out to users. Even the OnePlus phone reviewed on the Reg today isn't running the latest Android version yet. Any new Android feature takes longer to get to a decent percentage of users' phones.
Apple's competent marketing department has its work made easier by other decisions Apple make.
Hehe, it's like what have the Romans ever done for us! "Yes okay, but besides the low audio latency, rich app developer ecosystem, strong 3rd party peripheral support and regular and sustained updates, what have the Romans ever done for us?!"
- sent from my Nexus
There's nothing wrong with an honest acknowledgement that one can't see a 'killer app'. And hey, if you did, would you be daft enough to post yours online? :)
I might suggest that there's no one killer app for the current near-ubiquitous touch screen smartphone... but the smartphone does lots of things well enough. From an alarm clock to a podcast player, a camera and a flashlight, an atlas and a calculator. Who's to say that more environment-aware phones won't follow this same 'lots of little things' pattern?
Me, I have a hunch that online retail and the DIY markets will both benefit. But that's just my speculation.
30% of the price of the app, just as now.
It's been rumoured already:
Apparently they're looking at using a time-of-flight laser system, as opposed to their current distorted-IR-grid method.
Qualcomm's IR sensors - being touted to Android OEMs - are intended to be mounted on the rear of phones.
You left out that episode of Community where students award each other Meow Meow Beans; the college quickly ends up resembling a shiny sci-fi dystopia in the style of Logan's Run :)
> I presume Apple will come up with some sort of headgear though, because there is a great gaming opportunity there. Will they come up with some novel controller as well though?
Apple historically haven't jumped at gaming opportunities. I mean, they've never even bothered making a physical gamepad reference design for iPhones in an effort to eat Nintendo's lunch.
On the desktop computer side, catering to gamers doesn't play to Apple's strengths as games favour commodity hardware to drive up frame rates and drive down latency. Those Mac users I know who game just do it on a PlayStation on a big TV.
There was that time Steve Jobs introduced Halo, but Microsoft bought the game studio before it was released on Mac to make it an XBOX exclusive.
> It has never occurred to me to apply Aesops Fables to computing.
The book Gödel, Escher, Bach uses Aesop-like fables as an introduction to each chapter about information theory, though the author gives the nod to logician Charles Dodgson - better known to us as Lewis Carroll - for his stories containing strange talking animals.
Thanks FIA. Perhaps it wasn't clear to @cambsukguy that I was only using Google Translate or Night Sky as an example of an existing environment-aware* app that readers here might already be familiar with. An example merely to illustrate how a screen-based AR app can be used without specialist eyewear. That's it. No more, no less.
Whoever was first to create such a system is a moot point. (Just as moot as me pointing out that I'm not going to use an app as an example if its platform has such little market share that most readers won't have encountered it. Even if I do like some aspects of the WinPho UI :))
*Obviously 'environment aware' in a limited way. Translation apps just deal with a 2D image, night sky apps just location, attitude (gyro) and direction (compass).
The *other* Google effort in this realm is their Project Tango, but it's only available on two consumer phones - not nearly enough to excite 3rd party developers, which is the point this article is making. It uses two cameras to create a 3D depth map of a room, and the reviews suggest it taxes the phone's RAM. It does produce proper point-clouds though.
The *other other* Google effort is ARCore, which uses software on data from a single camera. As such it can in theory run on any phone with enough processing grunt. There are limits to what a single camera can do, though one assumes this system takes advantage of the fact that people don't hold their phones stationary.
And whilst not a Google effort per se, Qualcomm (whose SoCs already power a lot of Android phones) are pushing both passive and active IR scanning chip/sensor packages to Android OEMs for release next year. Their demo shows a real-time moving 3D scan of a pianist's hands and piano keys.
In a workshop, safety goggles are already worn, so there's no extra discomfort in donning eyewear. Also, it's your damned workshop, so there are no privacy concerns for the public if you're wearing a head-mounted camera. Being able to measure dimensions on your workpiece on the fly, hands-free, whilst working with a CNC router would really ease the workflow. Plus, tape measures are buggers for hiding, and I don't like having metal rulers near spinning blades.
However, goggles would be for niche use cases - this article is about placing environment-aware tech into millions of phones, to the initial indifference of most end users. Of course, the sensor and silicon tech in phones and goggles has much in common.
When I've used CNC machines I've had to design on one computer and initiate the job from a dedicated control computer. There's scope for streamlining the whole process. It would be akin to the ease with which we can now send a photograph of something to a colleague on the spot with only a few taps, compared to the faff of only a dozen years ago, when we would have had to return to a PC after taking a pic and swap the SD card over.
Heck, it would be bloody handy for the CNC router to have machine vision, so it can detect (and correct) if it has strayed from its zeroes or has broken its bit. There are already 3D printers which do this.
Rule 34 of the internet
As your maths teacher once scrawled on your exercise book: Please show your working! :)
Absolutely. If one is trying to understand how a technology will change the market as a whole, it is blinkered to the point of cretinous to only look at one's own individual behaviour.
I was once told by an associate of Felix Dennis that he became so rich because he would know what people wanted before they did. It mirrors Henry Ford's observation that if he asked people what they wanted, they would ask for a faster horse. Sony traditionally don't use focus groups, and nor do Apple.
Adding 3D capability to a TV set costs almost nothing because the refresh rate is already high (120Hz) for sports and video games (active-type 3D). Passive-type 3D sets just require the pixels to be polarised.
You can't be said to have 'bought into' something if the cost is nothing.
It'd be a brave man who asks his wife if he can don some goggles in the marital bed to make her look like a pron starlet. The goggles leave enough of his face exposed for her to slap him hard.
Which is why MS have been canny to aim the Hololens at corporate and engineering environments - where you don't upset the privacy of members of the public, as you would by wearing a Google Glass in a cafe.
AR doesn't usually equate to wearing goggles a la Google Glass - which I agree would be a barrier to use in social situations.
It can be more akin to Google's Translation app (where you hold your phone over French text and see the English translation on the screen in context) or the Night Sky app. Like that, but for 3D physical environments and objects.
AR can become a replacement for a measuring tape for the casual user, in the same way as surveyors now use laser distance finders instead of sticks. Useful stuff, not gimmicks.
AR doesn't require goggles. For many use cases it's sufficient to look through the phone screen. A current example would be Google's translation app - if you hold the phone above some French text, the phone will display an English translation in context. Another example is the Night Sky app, which identifies constellations and planets from your point of view (location, direction).
The whole thrust is that it doesn't matter if you see the point or not: it'll come to enough handsets that maybe some app developer will find a killer use-case that you've overlooked.
It's more akin to having GPS on a phone than a 3D television. Whilst it was immediately clear that live navigation would be quite useful, it was harder to foresee what it would enable - Uber and taxi driver protests.
Have a look at the history of internet use by demographic, sector, gender etc, have a think about online retail, and ponder how a 3D scanning environment-aware phone might plug into that.
It's not just Apple - Qualcomm are touting similar sensors and co-processors for release early next year.
Curious. Your comment caused me to look it up, and yes, the OnePlus 3 only used half its RAM in order to save on battery life - though OnePlus did distribute the kernel files required for modders to roll their own ROMs.
I can't find a direct answer to your question, but other reviews have noted that it has a handy feature to stop apps from being backgrounded (handy for stopping streaming apps from losing their buffered content, for example, or for stopping games from returning to their start screens) which does sound like the sort of thing you'd need plenty of RAM for.
When phones only had 4 or 8GB of storage, an SD card would be a 'must have' feature. However, in these days of phones with 128GB storage, it isn't as crucial to as many people - their music file sizes haven't grown drastically, nor has the amount of time they spend away from their home server.
Some apps will by default store to an SD card, but if the card is removed (to pop into another device for a bit), the phone app used, and then the SD card reinserted, the data on internal storage and SD card isn't always consolidated. In WhatsApp this makes some data permanently unavailable (effectively lost).
>" a perverse embargo schedule imposed by OnePlus forbids us from telling you more until next week. Bear that in mind as you peruse "Hands On" features; they are withholding some quite interesting information."
Sorry AO, I read that as suggesting there was a fly in the ointment that you weren't yet allowed to disclose.
In his preliminary Hands On review of the phone, published before the embargo was lifted, Andrew suggested that one should wait for the full review, hinting there was a nasty surprise... yet I didn't spot one.
I guess we have a pretty good idea of the composition of Mars dirt by now... plus there's naff all water to make the soil claggy and sticky. We can knock up some fake Martian soil here and do extensive testing. I suspect that the flexible wheels would soon shed any bits of grit that might stick in the holes.
Lunar dust is an absolute pain in the arse though - the lack of atmosphere allows micrometeorite impacts to abrade and fuse it. It's like tiny fractals of razor-sharp glass.
Your braking distance will be drastically increased, but at least you'll give off some cool-looking bright white sparks as you skid down the highway!
Indeed, regardless of its heat-activated memory properties, Nickel Titanium can be bent far further than steel and still return to its original shape. Anyone who has played with a pair of titanium spectacle frames knows this.
If it ends up in landfill then you're recycling it wrong. I do believe that end-of-life products should be taken back by the manufacturer though, so that they gain efficiencies of scale in dismantling many examples of the same device en masse.
If you spend £500 on it rather than £100 then it's likely to be a faster, better constructed device that you'll look after more and be less inclined to change in a year.
Whether the roughly £1/day price difference twixt a budget and high end phone is worth it for any individual is a function of how much they use the phone and for what, and their bank balance of course.
I used to use my laptop so much that I bought an £80 mouse, and I have never regretted it (Logitech Darkfield). The extra comfort and convenience it provided was multiplied by the hours of use.
The article itself was useful enough, but yeah the headline was just odd for a tech blog - for the last thirty years us computer and gadget buyers have known a faster/cheaper/better device will arrive on the market a week after we've bought our new toy.
Embargoed reviews are only an issue if you want to buy on launch day, surely? Or am I missing something?
It seems no hardship to wait a week or so for a more in depth review.
You can unlock the 5T whilst it's lying flat on a desk by using facial recognition, if you're willing to take the hit on security (it isn't a depth sensing system like Apple's).
What would be good is if this facial recognition system was only enabled in more trusted places, e.g. within range of your office or home WiFi networks.
I agree with Phil though - Bluetooth unlocking doesn't seem a smart idea for the reasons he's outlined.
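The trusted-places idea above could be sketched in a few lines. This is purely hypothetical illustration - the function, method names and SSIDs are all made up, and no real Android or OnePlus API is being shown:

```python
# Hypothetical sketch: only offer the weaker face unlock when the phone
# is connected to a user-nominated WiFi network, falling back to
# stronger methods everywhere else. All names here are invented.

TRUSTED_SSIDS = {"HomeNet", "OfficeNet"}  # user-configured safe networks

def allowed_unlock_methods(connected_ssid):
    """Return the unlock methods to offer for a given connected network."""
    methods = ["pin", "fingerprint"]  # always acceptable
    if connected_ssid in TRUSTED_SSIDS:
        methods.append("face")  # convenience unlock only in trusted places
    return methods

print(allowed_unlock_methods("HomeNet"))     # ['pin', 'fingerprint', 'face']
print(allowed_unlock_methods("CoffeeShop"))  # ['pin', 'fingerprint']
```

Of course, WiFi SSIDs can be spoofed, so this would only be a convenience layer, not a security boundary.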
@Mr Hartley - thanks for the tip re. micro USB > type C converters, I'll look into it.
I've found the gauge and length of USB cables have a bearing on how quickly my phone charges, and I've heard not all USB-C cables are created equal. When I've found some well-reviewed/tested ones, buying a good dollop of 'em would be a good idea.
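For the curious, the gauge-and-length effect is just resistive voltage drop along the cable's power pair. A rough back-of-the-envelope sketch - the gauge figures are standard AWG approximations, and the scenario is illustrative, not a measurement of any particular cable:

```python
# Rough sketch: I*R voltage drop along a USB cable's power conductors.
# Thin or long copper sags the nominal 5V supply at the phone end, and
# phones typically throttle charge current when VBUS droops.

RHO_COPPER = 1.68e-8  # resistivity of copper, ohm-metres

# Approximate conductor cross-sections for common cable gauges (mm^2)
AWG_AREA_MM2 = {28: 0.081, 24: 0.205, 20: 0.518}

def voltage_drop(awg, length_m, current_a):
    """Round-trip I*R drop: current flows out on VBUS and back on GND."""
    area_m2 = AWG_AREA_MM2[awg] * 1e-6
    resistance = RHO_COPPER * length_m / area_m2  # one conductor
    return current_a * resistance * 2  # two conductors in the loop

# A thin 2m 28 AWG cable vs a chunky 1m 20 AWG one, charging at 2A:
print(f"28 AWG, 2m: {voltage_drop(28, 2.0, 2.0):.2f} V drop")  # ~1.66 V - plenty to slow charging
print(f"20 AWG, 1m: {voltage_drop(20, 1.0, 2.0):.2f} V drop")  # ~0.13 V
```

Which is why a cheap thin 28 AWG charging cable can halve your charge rate while a short, fat one barely registers.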
If I do go the iPhone route, my observations of friends' Lightning cables are such that I'll reinforce the ends with self-amalgamating tape or polyurethane mastic before using them.
Colonel Kurtz in Apocalypse Now:
"They train young men to rain fire upon people but they won't allow them to write 'fuck' on their airplanes because it is obscene"