Re: He's in good company...
Wasn't it Richard Feynman who cheerfully admitted to attending strip clubs?
It's not indestructible, but the only time you really need to worry about setting it down on a surface is if you are in a workshop or a building which is having work done - dust from diamond cutting disks can damage it.
That's how I've put two very light scratches on the face of my watch. By contrast, the stainless steel bezel that surrounds it is covered in little scratches from over a decade of carelessness on the part of its owner.
I've also heard of diamond jewellery damaging sapphire-faced watches, if you or your SO are into bling.
>Saying students shouldn't use computers so much is like saying (back in my day) that kids shouldn't watch so much television.
When he has spoken about this before, Ive's point was that it was usually better for the designer to shape a piece of clay or foam by hand than to use CAD and a 3D printer. By spending time working with the shape and holding it in their hand, the designer can get a better feel for the object.
There is a lot to be said for being able to sketch well by hand - still a very fluent way to get 'hard copy' out of your head. CAD is absolutely essential to bringing a mass-produced product to market, but not at all stages of the design process. CAD is designed to be easy to use. CAD is designed to hold your hand through established manufacturing processes.
There are analogies to other fields; Keith Richards saying that he always judges a new song on an acoustic guitar before plugging in, or those authors who still use a typewriter... spell checkers, squiggly green lines and talking-feckin-paper-clips are not essential.
When I was a child, my piano teacher had a Yamaha DX-7. I had more fun exploring the funny noises it made than actually learning how to express music through my fingers. As an adult, some more musical ability would bring me more joy than a grasp of FM synthesis.
Maybe. At least it gives you some tactile feedback so you don't obscure the lens with your finger when you take a picture. True, it does protrude, and therefore might cause an issue when putting it in your pocket. Perhaps.
You could say exactly the same of a phone with big raised volume buttons - "Great!" you think, if you live in cold climes and regularly wear gloves. "Nasty!" you think, as the phone's buttons snag on the pockets of your linen trousers on the way to the beach.
Product Design, like Engineering, is often about making decisions - compromise is inevitable.
>Surely good design should stick in your memory?
Not necessarily. A traditional 'double diamond' bicycle frame is a piece of good design, but it isn't memorable.
Often the best pieces of design are those that we don't notice, because they haven't annoyed us.
We notice some impenetrable plastic blister packaging because it annoys us. Doubly so if the product it contains is a pair of scissors.
We notice a piece of furniture with sharp corners because we have just bashed our shin on it.
We notice a car stereo because it has, for some stupid reason, two buttons to change the volume instead of a simple knob.
If you are going to notice a well designed object, the best thing you can think of it is "Ah, that's simple - why didn't I think of it?", or "Why aren't they all like that?". I had a scroll-wheel music player before the iPod was introduced - a Sharp MD722 MiniDisc player. I then had an iRiver H320 - technologically superior to its rival, the 3rd gen. iPod, in every way - but it would have been much nicer to use if it had a scroll-wheel to navigate long lists of albums.
In the early nineties, I visited Czechoslovakia as it was then. Nearly everywhere had single-handled mixer taps (up/down = on/off, left/right = hot/cold) which we hadn't yet seen in the UK. The advantage was clear - easy to use with soapy hands, or even with your wrist if your hands were covered in the very muck you wished to wash off. The advantages easily outweighed the extra costs associated with what must be a more complicated mechanism than a traditional tap. We were left with the thought "Why are these everywhere here, but unknown in the UK?"
>"I would draw the similarity between Architects and civil engineers. Clothes designers and tailors, Car designers and production engineers. In all cases the former has the freedom to think without the constraint of what's possible or practical The latter professions then go. How the fuck do we build this.. and at that price, by next wednesday."
I think you've just identified the difference between an Industrial Designer and a Product Designer - Ive is of the latter camp.
Here's an example: The designer of the first Sony Playstation insisted on a vent design that required the injection mold tooling to pull away in two directions when the case was being formed, even though the engineers had wanted a simpler, cheaper perf pattern. The designer, a Mr Teiyu Goto, knew things that the engineers didn't - he knew where Sony were aiming to pitch the product in the market against the incumbents Sega and Nintendo (Sony had been working with Nintendo on a 'Playstation' in 1990, but Nintendo pulled out). The strategy extended to the marketing of the Playstation, as well as the games commissioned for it (such as the clubber-friendly WipEout) - creating its image as a console for young adults, not just children. The Playstation hardware and software accounted for nearly 25% of Sony's profits for 1997, so Mr Goto was vindicated. He went on to design the first VAIO desktop and laptop.
A Product Designer will understand the production techniques, and will be able to make an informed decision on whether a design decision is worth the extra production cost - or other costs, such as battery life vs weight. The engineers won't have all the information to make these decisions - they won't know, for example, projections of how many units will be manufactured over the next 12 months. The key points here are teamwork and communication between experts in different fields, and for that to work the designer (or, as Dieter Rams puts it, the 'Form Engineer') needs at least enough knowledge to converse with these experts. They need to understand the experts' input, and to communicate their own views back.
An Industrial Designer just makes a pretty case to stick over the box that the engineers have already made. As William Gibson will tell you, the first Industrial Designers were recruited from Broadway, as they were theatre set dressers.
- Digital Dreams - The Work of the Sony Design Centre, 1999, ISBN 9780789302625
>"We can only conclude from this that Jonny Ive is cunt of the first order"
Er? So, your logic:
Ive is offended by badly designed objects. On the basis that there are far, far worse things in this world, this makes him a "cunt of the first order" according to you. Um... so what order of cunt do you consider the people who are committing the murders and torturing etc, since the very idea of downgrading their offensiveness offends you?
>I'm assuming he talks about the exterior design.
He's talking about the whole phone. The design work might be divided into teams, but ultimately the user will use the phone as a single undivided object, so the design process should bear that in mind. At a very simple level, the software should work with the hardware human input devices.
>However it does have some serious industrial design issues.
He's not an industrial designer, he's a product designer.
>Less obvious is the lack of a keyboard causing you to have to resort to soft keyboard.
Resort to a soft keyboard... or use any number of hard Bluetooth keyboards, from the tiny to the full-size, so you can choose one that suits you.
You might even wish to support the young lad who used an Arduino to make a 'chorded typing' key case for phones.
>The most obvious is that the display glass extends over the whole surface of the device. That makes it very likely to break when it falls down
This is true of many touch-screen phones. Again, it credits the user with enough wit to customise it to their specific situation.
er... Jony Ive uses many things other than computers, so therefore we can assume he uses many things that aren't made by Apple. Toothbrushes, cars, shoes, ovens, pencils, knives, whatever.
Shit, when he started at Apple, he didn't even use an Apple-branded computer for his design work because that kind of CAD software wasn't available for Mac OS (it may well have been still on the mainframe).
I don't own any Apple kit, and I'm still offended by bad design - in hardware or in software. Sometimes when using a product you just get the sense that the designer doesn't use this thing themselves - because if they did, they wouldn't have made it so annoying to use.
Bad design is the standard British light switch - a 100mm x 100mm square of plastic, of which the only useful bit is 12mm x 25mm, and only a 12mm x 10mm area will actually do anything - and even then it has sharp, uncomfortable edges and requires some force to actuate. The French have switches where the entire 100mm x 100mm surface is a switch, so the lights can be turned on with your elbow as you enter a room whilst carrying a tea tray.
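To put a number on that rant, here's a toy calculation using only the dimensions quoted above (the dimensions themselves are the post's rough estimates, not measured values):

```python
# How much of a standard UK light-switch faceplate actually does anything?
# Dimensions are the approximate ones quoted in the post above.

plate_mm2 = 100 * 100    # whole faceplate
rocker_mm2 = 12 * 25     # the visible rocker ("the only useful bit")
active_mm2 = 12 * 10     # the part that actually toggles the switch

print(f"rocker: {rocker_mm2 / plate_mm2:.1%} of the plate")   # 3.0%
print(f"active: {active_mm2 / plate_mm2:.1%} of the plate")   # 1.2%
```

So under 2% of the surface you reach for in the dark actually works - versus 100% for the French full-face switch.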
>Yeah, and they [phones] were far more usable back then, with the stylus, than they are now.
Comparing multiple fingers to a single stylus is akin to comparing oranges to a single banana. The stylus gives you more point accuracy and perhaps pressure readings, but the use of multiple fingers gives modifiers. Horses for courses: a drawing app will be better with a stylus, whereas an app that simulates an audio mixer may work better with fingers. Sometimes you use a mouse, sometimes you use a joystick.
It is clear to anyone with a stopwatch that the act of removing a stylus from a phone (or from your pocket, or from behind your ear) incurs a time penalty... looking for the damned thing when you've dropped it even more so. In product design, this is known as an 'offline' issue - an aspect of a product's design that affects the user when they are not actively using it.
Now, some older phones are more efficient for some tasks - one could literally navigate an old Nokia blindfolded, because the menus were numbered, so [Menu] * would bring up the voice recorder before you'd brought the phone to your lips.
* These are made up numbers, so don't blame me if you accidentally set your old Nokia to Japanese.
There are a few definitions of 'Workstation'. In CAD, for example, it would mean that the exact-same system had been tested by the vendors of your CAD software, and would likely mean a 'Pro' graphics card (even if it was the same silicon as a much cheaper gaming card, its drivers would be different - but I believe the silicon is a bit different these days). Mission-critical simulations would require ECC RAM and a compatible CPU, too.
Most of the time consumer parts (GeForce or Radeon instead of Quadro or FirePro, i7 instead of Xeon) will work well enough, but there is a business case for spending more to make sure problems don't arise at the worst possible moment.
The article asked us commentards for ideas for a possible podcast about system building. The discussion above between CADmonkey and others suggests that the Reg could have an article/thread about the pros and cons of buying a system versus 'rolling your own'.
I haven't used Scan myself, but I remember that they used to advertise heavily in the MCad magazine (dead tree).
The last time I helped buy all the components for a 'roll your own' system, we took ideas (and bought some parts from) QuietPC.com. Again, I can't vouch for their systems, but I have no reason to believe that they are not competent. Their systems are guaranteed.
The machine was 100% silent. Lovely. SSDs mean no disk noise. An i7 3770 S has a lower TDP than the K variant that overclockers like (and well within the 95W rating of the NoFan CR-95C fanless cooler we used - it's heavy and pricey). As it was a machine for audio and music, the Intel HD Graphics 4000 was more than good enough for purpose - so no discrete GPU to cool.
Here's the thing - the researching, choosing, ordering and building of the components took some time, as did seeking out and installing the latest drivers, plus a little bit of troubleshooting. There is a fair chance that QuietPC's (or whoever's) complete-system markup is good value compared to your time. (Though I enjoyed the project, and working with my friend.)
>So maybe the fruity types are working night and day to get it to run on ARM architecture, enabling a hybrid iOSX.
It's been done already in its Darwin form: http://www.theregister.co.uk/2012/02/07/mac_osx_on_arm/
If Apple were to try an OSX tablet, it would be for a reason - perhaps by focusing on the productivity software that might benefit from a touch / stylus interface. This might include Apple's own music and video productivity suites, or 3rd-party software such as Photoshop.
>That strategy worked out beautifully for Microsoft.
It could be said that the iPad itself was treading where MS had been before (WinXP Tablet Edition) - suggesting that the devil is in the implementation. Apple haven't made any radical changes to OSX (a la Windows 'Metro', or MS's war on menus) but instead they have gently introduced some iOS features such as 'pinch to zoom'.
>If it's more powerful than a three year old MacBook then why does it run a cut down phone OS instead of OSX?
The iPhone is powerful enough to run OSX, but it wouldn't be an optimal experience for the user. The underlying OS would work, but the UI wouldn't. A good number of Android phones are powerful enough to run OSX, too.
Apple will have their own business reasons for not making an OSX tablet or whatever, but I would be surprised if they haven't compiled OSX for ARM as an experiment - as they always did for OSX on x86 before they left PowerPC.
> they are not mainstream beause Android is useful as a touch UI. iOS is useful as a touch UI. they don't translate back well, and vice versa for traditional OSs.
Absolutely. Wasn't there a once a Windows laptop that could switch to Android (for quickly checking one's email inbox without draining the battery)? I seem to recall the software attempted to make it easier to swap documents between the two OSs. Or maybe I ate too much cheese before bedtime.
Apple's approach is to keep one UI per device, and to use iCloud and 'Continuity' to allow a person to start writing an email on an iPhone, and finish it on their Mac - without digging into the 'drafts' folder.
The idea in the article - use an Apple TV to let an iPhone ape a Mac - is amusing because the first AppleTVs could be made to run OSX, essentially making them low-powered MacMinis. http://www.appletvhacks.net/2007/04/01/mac-os-x-running-on-apple-tv/#.VGNBdvmsXQo
With things like Intel's NUC form factor, we essentially have headless laptops that could be slung in a bag and plumbed-up to the nearest TV.
Another thing that tickles me is that nobody in this thread has cited the 'netbook' form factor - it's as if they never existed! You wouldn't want to write a novel on one, or even browse the web for long, but you could if you had to - for a device that would fit in a big jacket pocket. Tablets and 'ultrabooks' largely stole their lunch, but netbooks are handy for connecting external peripheral devices, and cheaper than 'ultrabooks'.
>Perfect reply to this load of dingo kidneys.
It is not a bad reply, but perfect? Personally, I feel that the discussion below that this article has prompted is useful, not least because we have different understandings of what 'tablet' and 'laptop' entail.
>I still consider them as PCs though.
I'm sure you're not wrong localzuk.
My point is just that some define a PC by its OS, some by its form factor, some by its architecture. All valid.
Ultimately, banks and call centres care about the ergonomic working position of their workers for fear of lawsuits. They are serious (or at least serious enough to tick the check boxes on a form to cover their asses) about chairs, desk heights, monitor heights and glasses for those staff who need them - it is just cheaper to look after staff in this way than it is to compensate them for RSI.
How that is achieved - local ARM or x86, thin client, full office suite, web form or proprietary software, whatever - doesn't really matter. The example given in the article - a phone connected to a projector - is no different in concept to a phone connected to a VDU and a keyboard. A phone without a screen is just a little ARM box. A monitor with a computer built-in is an 'All-in-one'. A tablet on a stand is just an ARM-based 'All-in-one' PC with a funny OS. A 'desktop' is just a VDU, a box and a cable! However, people are just as valid if they choose their definitions based on pragmatic considerations such as: Does it run this application? Can I connect that device to it?
The definitions are a little fuzzy, is all.
>So you are saying that a tablet would be more suitable if a few more alterations were made to it to make it more inline with a traditional OS?
Kind of. What I was getting at is that the words we are using are poorly defined, and might lead us to argue when really we are in broad agreement. There is also a difference between the concept and the current executions.
For example, it is only convention that makes us assume a 'tablet' is ARM-based, running a touch UI on top of an OS that hasn't been designed around local storage and peripheral devices. However, there have been x86 XP tablets around for years, and there are ARM / x86-based laptops that don't really do local storage (Chromebooks). I have even seen x86 laptops running a Linux distro specifically to connect to a VPN and prevent local storage (for security reasons).
A keyboard and a tablet (placed at the correct height) is, from an ergonomic typing perspective, closer to a desktop than a laptop. However, this solution is sub-optimal when it comes to transporting it (it takes a little longer to stow away than just closing the lid and picking it up).
Maybe one solution is to have ARM tablets that can act as monitors for grunty x86 CPUs housed in keyboards (return of the Amiga/Spectrum/Atari ST form factor!). Who knows?
Talking about the death of the PC is a bit silly - things tend to evolve rather than go extinct - but it has made us think!
With respect, a tablet with proper software and enough grunt could be just as good at photo-editing as a laptop, if not better. See the Cintiq Hybrid, or the Modbook.
The issue is with the current crop of software, not the hardware form-factor itself.
>I hate to break it to you, but your local bank won't be doing all its work on tablets any time soon. They might augment certain roles with them, but the bulk will still be a traditional PC or similar. The local call centre? That'll continue to use something PC-like for a while yet.
It looks like you have just identified two environments where thin clients would work rather well. I've worked in a call centre, and the rows of XP-based Dells were doing not much more than running legacy I.E. forms. There was certainly nothing they did that a low-powered ARM device couldn't handle. Keyboard plus mouse plus monitor plus modest processor is all that's needed.
Banks similarly, especially when you consider their data-protection obligations.
>"I *could* use a bluetooth keyboard and a tablet for all my ssh and text writing needs. I probably will if there is an emergency and I have that handy. I won't, because a laptop is so much better for doing that when I am about. I am writing this on a laptop keyboard because I can type, and hopefully proof-read, this so much easier than a tablet. If I could type as well on a tablet, that would be *in-spite* of being on a tablet. Not because it is the best tool for the job."
There is no inherent reason why a tablet plus keyboard couldn't be a better solution than a laptop for your use case. At the moment, a tablet isn't suitable for you because of the touch-centric nature of its UI and applications. If these were fixed, then a tablet plus keyboard would offer you:
- Your choice of keyboard... chiclet, mechanical, number keypad, whatever you want
- Screen placed independently of the keyboard for a better typing position. (Current laptops are already an ergonomic compromise compared to desktops)
- Being able to proof-read away from your desk. A lot of us currently don't proof-read on a monitor, but print out hardcopy and grab a coffee. A tablet or e-reader could emulate a 'red crayon'.
I'm not for a moment saying you ditch your laptop now, but only that some things aren't written in stone.
> A similar-spec desktop costs less than a notebook, and much faster desktops are available if the goal is raw processing power.
Here's the thing: You are unlikely to be using all that extra processing power all the time.
If you want more power to save time, you either build a CPU/GPU-cluster (again, to get value for money you want to be running it all the time) or you take advantage of a cluster that someone else has already built - i.e you rent processing power from the cloud.
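The buy-versus-rent trade-off above comes down to utilisation, which a back-of-envelope sketch makes concrete. All the figures here are invented placeholders, not real hardware or cloud prices:

```python
# Toy break-even sketch for buying a fast desktop vs renting cloud compute.
# Every number below is a made-up placeholder for illustration only.

workstation_cost = 3000.0   # hypothetical price of a grunty desktop (GBP)
cloud_rate = 1.50           # hypothetical rented cost per compute-hour (GBP)

# Total heavy compute-hours at which renting has cost as much as buying:
breakeven_hours = workstation_cost / cloud_rate
print(f"Break-even at {breakeven_hours:.0f} compute-hours")

# At, say, 20 genuinely heavy hours per month, that break-even point
# is a long way off:
months = breakeven_hours / 20
print(f"... which is {months:.0f} months at 20 heavy hours/month")
```

The point stands either way: idle cycles on a machine you own are money already spent, whereas rented cycles only cost when you use them.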
>I like the term 'workstation' as it defines where these devices still provide for a strong need - work (ie, content creation)
Bus stations are where buses stop.
Train stations are where trains stop.
On my desk I have a workstation...
>Hardcore gamers. (There strife for power is never ending)
>CadCam (was mentioned the article)
>Video Editing ( was mentioned the article)
I'd be interested to see some rough breakdown of how desktops are used... my guess is that most of them are fairly low end for general office tasks, followed by enthusiast gaming machines, followed by intensive productivity workstations (CAD, video-editing).
One trend worth noting is that for the last few years, new Intel CPUs have been geared towards energy efficiency instead of raw grunt - they are already fast enough for most tasks.
Gamers do push the limits of GPUs, but some may trade that against size and noise to drive a 1080p TV in their front room (SteamBox). The other drivers here are ultra HD displays, multiple monitors and maybe the Oculus Rift... Most modern games don't benefit from any CPU faster than an i5.
CAD can be run on a laptop, and the big number-crunching - rendering and simulation - can be farmed out to an array of GPUs or even the cloud. Hell, some CAD can be used over the cloud - there are some advantages (pay per use, easier security administration, cheaper local machine, team collaboration tools).
... I might need a diagram or two:
The university explains that circulators work by breaking the symmetry in a wave transmission between two points.
For sure, it's very hard to get excited by terms like "QD backlit LCDs", but they are only marketing shorthand for tangible benefits that can be expressed in numbers, such as:
I'd be interested in the technologies that can provide a huge dynamic range in the displayed brightness - but to get the benefit then the content would have to be shot, processed and delivered on HDR kit.
For now, just reflect on how cheap room-filling televisions are these days. The communal experience is still valid, be it for a good movie, co-op video games or a thought- and discussion-provoking documentary.
>I'll throw my hat in the ring of What? What is wrong with the article?
I believe Destroy All Monsters was making a point about the Giger sculptures being based on his work for Scott's 1979 film Alien, and not Scott's 2012 film Prometheus. The newer film does contain 'Space Jockeys', and alien spacecraft of the same design, but the jockeys aren't in the seat as shown, and the alien craft isn't the 'Derelict'. Whilst Alien is generally considered to be a very good film, Prometheus was not liked by everybody - to put it kindly. Indeed, some fans of the 1979 film consider Prometheus to be dreadful.
Giger had been working on an adaptation of Herbert's 'Dune' before 'Alien', as had Chris Foss (famous for the airbrushed cover art of many an Asimov paperback, and for the black and white illustrations to The Joy of Sex) and Dan O'Bannon. After that project collapsed, the three of them worked on Alien.
> Obviously I'm not a Giger habitual fan
Friendly note - if you research him further, the images may be NSFW.
Apparently it's water resistant, not waterproof, so contact with water is fine, but it doesn't like any head of pressure (from either being submerged or subjected to a jet of water as in a shower).
I'm not in the Apple eco-system, but if I was I would wait for MK II. Personally, I wouldn't require a connected watch to do as much as the iWatch (an 'iWatch Nano' would be a better fit), but for the point it occupies in the features/size/battery space, it largely appears well designed, software- and hardware-wise.
>Who wears a watch these days?
Anyone who doesn't spend their days in rooms with clocks, evidently. I thought you were into horseriding, or is that just Mrs jake?
The same site as AC has linked to also has fixes for E.T, if you want to roll up your sleeves and play with a HEX editor:
- It highlights the location, Terminator-style, of that 6mm Allen key that is hiding amongst other tools.
- It lets me order parts there and then, before I forget.
- It 'draws' lines and points on surfaces - centre of face, midpoint between edges, etc. - for cutting and drilling.
- It flashes when someone enters the room - often they can't be heard over the sound of machinery.
... if they were under £50.
- Being able to refer to instruction manuals and data sheets hands-free
- Record a sequence of 'which screw came from where' during disassembly
- Replay the above in reverse when I put stuff together again
- Protect my eyeballs whilst operating power tools, etc.
- Make voice notes of key measurements, to be used in CAD later on. Or better yet, make it a 'mini-Kinect' system, to assist in 3D scanning and measurement.
I wouldn't wear them in public though, any more than I would wear my boiler-suit down the pub.
Bicycles were originally the playthings of the rich.
After a while, they became affordable and allowed people who could never have afforded a horse to make trips to neighbouring towns and back in a day. This led to marriages between people who otherwise would never have met, with effects on the British gene-pool. Decades later, the image of thousands of workers commuting by bicycle became almost as big a symbol of communist China as Chairman Mao.
Okay, there are some big gaps in my analogy, but attacking private space travel merely on the grounds of 'rich man's playthings' doesn't hold.
>"Sorry, what was the point you're trying to make because as far as I can see you failed miserably since you can't seem to differentiate between human sexuality and legal statutes"
And yet Boltar, I made the same argument but without invoking the law, and you had no response for me.
An adult individual using a vulnerable individual for selfish ends, and likely damaging them in the process, is wrong. Where (gay) men differ is that they are past their formative years, and are in a position to make a judgement about whether they will enjoy and benefit from whatever is proposed to them.
There are many other differences, too... paedophiles attracted to young children have had issues with the development of their brain... blunt head injuries before puberty are not uncommon in their case histories. This is not true of homosexual men.
>But being "gay" is a lifestyle choice.
No, it isn't. Being gay is just how someone feels about who they fancy - they don't 'choose' it, any more than I choose to find some women attractive. You might be able to make a weak argument for calling gay sexual relations a lifestyle choice, but only if you say that having heterosexual relations outside of marriage is a 'lifestyle choice' - the latter is also disapproved of by some parts of society. You don't choose how you feel; you choose what you do.
Generally speaking, I'm not always fond of people being overtly sexual in public - be they straight or gay - but that is just my taste... just as I might not find a brash or loud person to my taste. But hey, that's just the way they are. I don't like people bragging. I also don't like people forgetting to indicate at roundabouts, or playing music on their mobile phones in pubs.
>"It's as repellent to disapprove of someone because of their sexuality"
>>Really? How come we lock up paedos then?
It is simply a matter of being able to give informed consent. Children cannot do so. Adult men and adult women can.
Therefore what adult men do in private is their own business. If children are being abused by adults, then society has a duty to intervene, since we should protect the vulnerable.
True, us adults can hurt each other, but we are beyond our most formative years. As adults, broken hearts (and carpet burns) will heal over time.
>We got much bigger problems in the world. You know, a raging closet homosexual with nukes who is invading countries.
That reminds me of an old Stephen Fry column... after citing examples such as Alexander the Great and Lawrence of Arabia, he arrives at the tongue-in-cheek conclusion that 'yes, gays should be kept out of the military - because they are too bloody good at warfare!'. His point being that whilst he was pro-gay, he was anti-war.
"Apple sauce, bitch!"
- Ben Affleck in 'Good Will Hunting 2: Hunting Season'... sorry, I meant 'Jay and Silent Bob Strike Back' https://www.youtube.com/watch?v=nnESedN4vSI
>Because lets face it, unless you had a hidden camera following the subject for his entire life it's basically just made-up 3rd-hand bollocks.
Yeah, but then David Lean's 'Lawrence of Arabia' was a very good film, yet still it differed from T.E. Lawrence's book... which itself might have differed from reality at points, and certainly diverged from common syntax.
I think people watching the film will, like you Yugguy, be aware that it is not a documentary.
What disgruntled said. Films are more interesting if the protagonist isn't an unalloyed saint.
Is Arctic Fox an American?
In the UK the word 'cunt' is used to insult mainly men. From what I have seen of US film and TV (Kill Bill part 2, The Wire, Weeds) it is only ever used to insult women.
>I strongly doubt the issue is the script. Maybe they're finding out what sort of chap Jobs was
So... you're saying Jobs was a 'baddie'. Actors enjoy playing baddies, don't they?
If you're suggesting that Jobs was a 'baddie', but the script was portraying him as a 'goodie', then surely the issue, from your perspective, is with the script?
Maybe Jobs was neither a goodie nor a baddie, but just an imperfect human being like the rest of us. Maybe the actors think it might be a redundant role, since there is video footage of Steve Jobs presenting products. Maybe Christian Bale has just received a call about work from his old mate Christopher Nolan, since the latter has just wrapped up his latest film?
>maybe they'll fix the ships computer with a iPhone from outside the ships hull
From the director who doesn't use email, writes his drafts on his father's typewriter, and even used 35mm film for the 'Skype'-like video-chatting in the film?
Whatever you say.
>If they are looking to make a "Professional" device then why would anyone believe this to be an iPAD architecture?
Are you conflating the underlying OS with the User Interface? The most professional UI is the one that is most fit for purpose - so an iPad-like device with a fancy digitiser (see 'Modbook', or 'Wacom Hybrid') might be the better device for a graphic artist than a Macbook. A musician might find an iPad a better device to use as a control surface, and find it perfectly secure and reliable enough.
Currently we tend to associate ARM architecture with touch interfaces, and x86 architecture with keyboards and mice - but that is largely to do with the power constraints of mobile devices. It is not written in stone.
OSX, like NeXT before it, has run on different hardware architectures in the past. 3rd party applications might have to be rewritten for an ARM OSX, but they would have to be modified anyway in order to work well with a touch/stylus interface.
Microsoft too are dabbling with x86/ARM agnosticism.
Weird. My midrange Xperia P was updated twice, first to ICS and then to Jelly Bean. There was some Sony software on it but it wasn't all rubbish, and they didn't mess the stock Android around very much (compared to Samsung).
One can take 'somthings' word for it, or one can check XDA to get an idea of how vendors release updates.
Can't find a Nexus 5? Get an LG G2 for much the same money. Still considered to be a good phone.
>why pay the prices Sony and Samsung charge for the handset (as good as they are compared to the TCO of a contract) when I can get something like the OnePlusOne for half the price?
Shop around mate.
The Z3 Compact's list price is £450, but it can be had for around £350 - sometimes less.
Try looking at this 'deals' forum:
Or look like a mosquito, like the drone found in Iain M Banks' Consider Phlebas.
>(They aren't around today, but the fossil record shows that the dragonfly design works at much larger sizes).
There was once a lot more oxygen available in our atmosphere... it was this that allowed very large insects to breathe. They don't have lungs, and rely on little tubes to take in oxygen, so it's a surface area / volume thing.
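The surface area / volume point is just the square-cube law, which a few lines make obvious. The scale factors here are arbitrary illustration, not real insect measurements:

```python
# Square-cube sketch: an insect's oxygen intake through its tracheal
# tubes scales roughly with area (L^2), while its oxygen demand scales
# with body volume (L^3). So doubling the linear size halves the
# supply relative to demand - hence giant dragonflies needed a
# richer atmosphere.

for scale in (1, 2, 4, 8):
    supply = scale ** 2   # ~ surface area
    demand = scale ** 3   # ~ volume
    print(f"size x{scale}: supply/demand = {supply / demand:.3f}")
# size x1 -> 1.000, x2 -> 0.500, x4 -> 0.250, x8 -> 0.125
```

At eight times the linear size, relative oxygen supply has dropped to an eighth - which is the gap that extra atmospheric oxygen once filled.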
>I can just see 10,000 people at a sporting event suddenly launching these things...
Well, that's still preferable to people launching water bottles full of piss, no?