Jay would say:
“Affleck, you da bomb in Phantoms, yo!”
And to be fair, the films he has directed (Gone Baby Gone, The Town and Argo) are pretty good. As an actor, he works well in David Fincher's Gone Girl.
6074 posts • joined 21 Jul 2010
>What in buggery do we do with a country that naturally produces Daily Mail readers? Nuke it from orbit?
Well, if you are going to take that option, then you have nothing to lose by trying some slightly less drastic ideas first. Um... widespread dispersion of LSD and MDMA? If this experiment fails, then drop the bomb and sterilise the Petri dish.
But seriously, compare the attitude of the Red Tops in the 1980s to today... they no longer pick on homosexuals, trade unionists, commies, blacks or whoever in the way they did then. It's true that anti-immigrant rhetoric is on the rise - in pubs, just as it is mirrored in the papers - but that appears linked to people not feeling well off.
Basically, if people feel happy and hopeful they are nicer to each other. If people feel naffed off and oppressed, they want someone to blame.
Self-driving cars would be a threat to auto manufacturers because not as many cars would be built - after a car has dropped you off, it would then go and pick somebody else up. More people would travel in one car, because lift-sharing would be easier - unless of course you pay extra to avoid the fellow plebs. Fewer cars would be damaged in accidents. Cars would require less maintenance because they would have fewer cold starts, their engines would spend longer at their optimum rpm, and traffic control systems would eliminate stop-starting at traffic lights.
I'm an Android user, as I said. However, experience has taught me that the users of a platform are the best critics of it.
I don't use a Windows phone, but I've liked the little of it I've seen. I don't know how I'd find it if I used it for longer.
I have iOS-using friends who have 'jailbroken' their devices because they wanted more than the stock UI/OS offered.
I've never had enough incentive to 'root' my Android phone.
My bad. Phablets were *popularised* by Samsung. The Dell Streak was being sold at a discount for years after its release.
>How galling it must be to be an Android ODM and cram in interesting new features
Interesting, yes. Useful? Not always.
- Waterproofing (Samsung, Sony). Useful
- Stylus input (Samsung Note). Useful
- Eyeball tracking (Samsung). Interesting. Potentially useful accessibility applications for users with impaired motor function.
- Heart rate sensor (Samsung Galaxy). Umm...
The trouble for the Android OEMs is that if they do introduce an interesting new feature and it becomes popular with consumers, there is little to stop other Android OEMs from doing exactly the same. Product differentiation is short lived. Example: Samsung first making 'phablet' phones. Everyone else soon does the same.
> Stone Age UI
"The Stone Age didn't end because we ran out of stones"
UI schmueye... I'm an Android user, but even though a Lollipop update has been waiting for me to install it for a few weeks, I haven't yet bothered - the current KitKat UI is just fine so I'll let good people on the internet test the update for me first. "If it ain't broke..."
OSX hasn't drastically changed for an even longer period of time.... and given the wailing and gnashing of teeth about Windows 8's UI changes (I'm still on 7), Apple must feel a little vindicated.
I'll leave it to iOS users here to comment on whether iOS has any major annoyances that need fixing.
Design is judged purely on being fit for purpose.
A paring knife can be well designed; you don't judge it on its ability to carve up a chicken.
>- Plastic so not as durable but a lot lighter
According to the article, the Macbook weighs 0.92 kg. According to Lenovo.com, the Yoga 3 weighs 1.2 kg.
>Can you actually buy any other laptop (with a £1000 start price) with just a VGA resolution webcam?
How many buyers of this Macbook don't already have an iPhone? If I had to guess, I'd hazard that most Macbook users will already own an iPhone and use its camera instead.
Anyway, who wants to see me in HD? I've only ever seen Tommy Cooper, Charlie Chaplin and Peter Cook in SD, yet their facial expressions came through.
Now, in an ideal world we would have the best and biggest of everything. Unfortunately, engineering is about priorities and compromises.
>It's an iPad with a keyboard. Credit where it's due: Apple know their market.
A good fraction of it, yes. "Yeah, it's like the iPad you're used to with its 8hr battery and high res screen, but it's got a keyboard" is a pretty clear and simple way to communicate this machine's strengths (and compromises) to potential buyers. Some of them might think to themselves: "Well, the only time I pick up my laptop instead of my iPad is when I want to type something, and this thing isn't much heavier, so.... hmmm maybe".
It is only for a good fraction of Apple's market though - they do still make Macbook Pros and Airs.
The use/charge patterns of tablets have provided data about the way *some* people will use this Macbook. If my phone is charging and I decide I want to use it, I unplug it, do what I want to do, and then plug it back in again. People will not often use this laptop whilst it is plugged in - indeed, with its battery life, it will be rare that they have to - so the clear benefits of the MagSafe connector would rarely come into play anyway.
In the future, more people will use USB-C monitors and port extenders (remember, everyone is moving to this standard). Unlike power bricks, these live on the desk not on the floor, so a tugged cable won't drag the laptop off the desk.
Many people are used to phones and tablets that don't need ports.
[I have an Android phone with microSD and USB OTG - both of which I have only used once. If I got my arse in gear and finally sorted out my home storage system and network, I wouldn't even have plugged my phone into my laptop.]
It is intended to show intent, just as the original Air (no ethernet!!) and the Bondi Blue iMac (no floppy disk drive!!) were in their time - one got another USB socket for Mk 2, the other became more sensible looking. In time, it will be given a faster CPU, and the rest of the world will have adopted USB-C.
Just think of it as an iPad that is good for emails - for some people that is just fine. Others can buy a Macbook Pro, Alienware or whatever best fits their own personal needs.
...what is the provenance of those photos? They look like they have been blown up from thumbnails! :)
A timer for heating.
OK. Do you tell it what time you want it to come on, or do you tell it what temperature you want it to be at 7:15am?
The latter type is smarter, more consistent and more efficient.
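To put some flesh on "smarter": a timer switches on at a fixed clock time, whereas a temperature-targeting controller works backwards from the target. A toy sketch in Python - the heating rate, function name and figures here are my own invention, purely for illustration:

```python
# Hedged sketch: a temperature-targeting controller computes *when* to fire
# the boiler from the target temperature, current temperature and an assumed
# (illustrative) heating rate, instead of using a fixed switch-on time.

def start_minutes_before(target_temp_c: float,
                         current_temp_c: float,
                         heating_rate_c_per_min: float = 0.1) -> float:
    """Minutes before the target time the boiler must switch on."""
    deficit = max(0.0, target_temp_c - current_temp_c)
    return deficit / heating_rate_c_per_min

# To be at 20C by 7:15 when the house is sitting at 16C:
lead = start_minutes_before(20.0, 16.0)
print(lead)  # 40.0 -> fire up at 6:35
```

On a mild morning the deficit is smaller, so the boiler fires later and burns less gas - that's where the efficiency claim comes from.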
"Anything invented before you were born is just the way of the world. Anything invented before you have reached the age of thirty is new and exciting, and you can probably get a career in it. Anything invented after you are thirty is new-fangled rubbish and you should have nothing to do with it."
- Douglas Adams
In the UK, households can *not* have their water supply cut off for non-payment.
Yep, the physical key can still be used as backup - just as with most cars.
Curious observation: US tech blogs occasionally feature stories about high-end physical locks being defeated by Biro lids or paper clips... or how the physical key can be recreated from a photograph of it.
No lock is perfect - I've seen combination locks on doors with what is widely known as Shiny Button Syndrome, the buttons used to enter the code having become shiny through use. SBS makes it clear to an intruder that they only have to try the 24 orderings of the four worn digits instead of the 10,000 possible four-digit codes.
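You can check that arithmetic quickly. Assumption for illustration: a 10-button lock with a four-digit code; four shiny buttons and four digits means each worn digit appears exactly once, so only their orderings remain:

```python
from itertools import permutations

# Without SBS, an attacker faces every four-digit code on ten buttons.
total_codes = 10 ** 4                 # 10,000

# With SBS, the four worn buttons are known (digits chosen for illustration),
# leaving only the orderings of those four digits to try.
shiny_digits = ['2', '5', '7', '9']
candidates = list(permutations(shiny_digits, 4))
print(total_codes, len(candidates))   # 10000 24
```

A three-orders-of-magnitude reduction in search space, just from finger grease.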
>using remote locking systems in their cars - How many of the newer ones are garbage security?
The newer ones attempt to do more, and are poorly implemented. However, the Rolling Codes remote locking system has been near-standard in cars for over a decade.
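For anyone curious why rolling codes defeat a simple record-and-replay attack, here's a toy sketch. Real systems (KeeLoq and friends) differ in the details; the secret, window size, and all names here are made up for illustration, not any real automotive API:

```python
import hashlib
import hmac

# Toy rolling-code scheme: fob and car share a secret and a counter.
# Each button press transmits a MAC of the *next* counter value, so a
# captured code is useless once the car has seen it.

SECRET = b"shared-fob-secret"   # illustrative only
WINDOW = 16                     # car accepts codes slightly ahead,
                                # in case of presses out of range

def code_for(counter: int) -> bytes:
    """The code the fob would transmit for a given counter value."""
    return hmac.new(SECRET, counter.to_bytes(4, "big"), hashlib.sha256).digest()

class Car:
    def __init__(self) -> None:
        self.counter = 0

    def try_unlock(self, code: bytes) -> bool:
        # Accept any code within the look-ahead window, then resync,
        # which invalidates that code and everything before it.
        for c in range(self.counter + 1, self.counter + 1 + WINDOW):
            if hmac.compare_digest(code, code_for(c)):
                self.counter = c
                return True
        return False

car = Car()
press = code_for(1)
print(car.try_unlock(press))   # True  - fresh code works
print(car.try_unlock(press))   # False - a replayed code is rejected
```

The "poorly implemented" newer systems the poster mentions tend to fail elsewhere (relay attacks, weak ciphers), not in the rolling-counter idea itself.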
> Surely the smartlock would require access by nfc token/phone or by a command prompt from a phone - requiring a command to be input by the screen?... so hands free is still not an option.
An NFC tag doesn't require the user to insert it into a lock, just have it in proximity - this is demonstrably an easier action to perform. Think of users with arthritis, for example.
Yeah, for *some* car owners. However, remote locking has been a near-standard feature on vehicles for over a decade, and so I stand by my statement that consumers are used to it, and appreciate its benefits.
A poor implementation doesn't discredit the concept, especially when there is a history of it being implemented successfully.
A key is fine if you have a free hand. Place yourself in the position of someone with hands full of shopping and a small child in tow....
A lot of people are used to using remote locking systems in their cars. The peace of mind that would come from locking your house door and knowing that the gas hobs are off and the iron is off is not to be underestimated.
The concept is sound. The devil, as always, is in the details of the implementation.
A lot of people already possess remote locks on hire-purchase - the locks on their cars. The consumer is accustomed to walking away from a vehicle, pressing a keyfob, and having the car confirm that all the doors are locked and all the windows are closed.
>If only the hands free devices didn't have the same "look at me, I'm a dork" effect.
Well, there is this Sony device, the size of a USB dongle - it is a Bluetooth handset that can be held to the ear like a small phone, or have a wired headset plugged into it:
If the US gov was that fussed about your genetic data, they would just snaffle it from that lab that analysed it in the first place.
Wean them off their dependency on [a smartphone OS that isn't based on selling their data]?
How altruistic of you. You're not in the advertising industry are you?
Exactly. Under this proposed system, the user is in possession of their digitised DNA data and can choose to send it on to any researcher *if* they wish.
People are right not to trust the NSA / insurance companies etc, and people who warn about the dangers and the slippery slopes are correct to do so. However, a lot of public good can come from sharing data. It is shame that people's data has been abused by governments and companies, since it makes people reluctant to contribute their data when it might be appropriate.
Having freedom over your data means the freedom to share it if you choose.
Apple's business model is charging money upfront for hardware, content and software. Whilst no organisation can be completely trusted, at least they have an incentive to not share their users' data. I also prefer the influence they attempt to exert on the US government in this regard, compared to Google or Facebook's ad-based data-hoovering.
I'm sorry to hear of her accident too. I was going to write exactly what 1980s_coder did.
All we really know is that it is hard to guess the rate of technological progress. It might be the case, in a few years' time, that this technology is improved in every way, and can be usefully deployed in people who still have the use of one eye. It could be used to display a zoomed image, or an infra-red image, for example.
As prosthetics, glass eyes have been with us for centuries. A prosthetic eye would be the ideal housing for the auxiliary parts of this system, namely the optics, IR transmitter and battery. Cosmetically, this would be better than any existing glass eye, because by necessity a system would have to be developed to keep it aligned with the user's other eye.
100mm² in the article referred to the proposed device size for *human* eyes, not the devices already tested in the eyes of rats.
So, if we estimate a human eye to be roughly 5-15 x the diameter of a rat eye, and you say the rat test device was 1mm in diameter, 100mm² sounds about right.
Another direction: look at the diagram of a human eye. You know how big your pupil is. Use that to gauge the diameter of the optic nerve, and the way the optic nerve spreads to cover the retina. Now ask yourself how you would make a device to interface with that nerve; interfacing at the root would be too fiddly, interfacing at the branches would waste surface area of the device.
A device just smaller than a contact lens sounds about the right size.
>100mm² seems a tad large?
That's equivalent to a square with sides 1cm (10mm) long - that's smaller than a contact lens. That seems to be in the right ballpark for something that has to interface with optic nerve endings.
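If you want to sanity-check the comparison, the arithmetic is quick (the 14mm lens diameter is a typical soft contact lens figure, used here purely for illustration):

```python
import math

# 100 mm^2 proposed implant vs the area of a typical soft contact lens.
implant_area = 10 * 10                        # a 10 mm x 10 mm square = 100 mm^2
lens_diameter = 14                            # mm, a common soft-lens size
lens_area = math.pi * (lens_diameter / 2) ** 2  # ~154 mm^2

print(implant_area < lens_area)  # True: the implant is the smaller of the two
```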
Aesthetics. I'm serious - if you read the background to the patent application, it's because QR codes don't look very nice. That is the reason.
Now, I agree with you - for many operations I like to have confirmation that it has worked. That is, until the technology matures and becomes so reliable that any confirmation is largely redundant.
>As is my utter utter disdain for all things crApple.
Does your disdain stem from their business practices, their products, or some small selection of their customers?
-Apple can play hard-ball in their business practices, true. So do others in the business when they can.
-Their products are actually pretty good for many use-cases. Rival products may suit others better.
-Their customers are normal people, including idiots and posers but many good folk too.
Personally, I find Apple interesting because of the unique position they hold in the market - they can move quicker for a few reasons. I am also a product designer - which paradoxically means that my industry doesn't use Macs because our software hasn't been available for OSX (even though it was on UNIX in the 90s, it is pretty much Windows-based these days).
The reasons some engineers don't use Macs dates back to the 80s.
>"strawberries aren't berries"
It used to be that plants were named for their properties, not their relation to each other. This is why New World chillis are called peppers, even though they are not related to the Old World spice 'pepper' in any way.
Already covered to death in a different thread. No blood oximeter works well through certain tattoos.
> allergy enriched watchstraps
Not an allergy as such. Not specific to Apple. A sweat issue for some users with a specific strap, who probably haven't worn a watch in years. Easily solved by using a different strap, from Apple or a 3rd party.
Breaking news: Apple Watch doesn't measure the heart rate of some double-amputees.
> As were common shapes such as rounded edges.... It's all old hat now.
Okay Baldrick, one more time:
The rounded corners were an example of a "Design Patent", which is not the same thing as a (proper) "Utility Patent". Indeed, in the UK we use the term "Trade Dress" instead - an example would be the unique shape of the Coca-Cola bottle. It is unfortunate that the USPTO uses the term "Design Patent" because it evidently confuses people. That said, it does make it easy to spot the people who comment without educating themselves first.
The USPTO does need some serious revising, though.
Quite. This is not specific to any particular watch brand, Apple included. The material used and the fit of the strap are factors.
Conventional watch straps are available in materials including titanium, gold, stainless steel, ceramics, leather, Kevlar, cotton, Nylon, rubber, silicone etc.
YMMV. Nothing new here.
It apes the function, not the method. Patents are how something is achieved, not what is achieved.
Example: Most loudspeakers use a diaphragm and a moving coil to make sound. NXT made speakers that make sound, but hold patents on using piezoelectric panels to do so.
It depends upon context and range. I worked in workshops where the unit is always mm, even for dimensions of over 1m. It avoids cock-ups of the "I thought you said..." variety. It is common to specify technical drawings as using mm, and thus a key dimension would be labelled as 1200mm and not 1.2m. It cuts down on mental processing overheads.
I'm sure you can appreciate that similar conventions exist within the context of hospitals, for good reason.
Why would Apple make a phone that used more energy than necessary to power its radios when battery life is a competitive selling point?
It would make more sense for the phone to disable its own WiFi, Bluetooth and 4G when they are not being actively used - like the Stamina Mode on some Sony phones.
Given the size of this proposed case, you might as well get a cheaper 'battery case' and increase your phone's battery life by 50% or more.
Sony will be using Android, LG using what was once Palm OS, and Samsung will use Tizen.
Maybe Firefox will end up on Tesco's home-brand TVs?
Sony's Xross Media Bar is an example of a TV UI that is based on what you mention - four direction buttons, [Enter] and [Back] etc. making it suitable for traditional IR remote controllers, as well as Sony's games console controllers. http://en.wikipedia.org/wiki/XrossMediaBar
However, why base a modern TV UI around the limits of these traditional Human Input Devices when touch-screen devices are so cheap? Or indeed 'free' if one assumes the user already has a modern mobile phone (if they don't, what's a £40 Android device to a £500+ TV?). One can then browse the Electronic Programme Guide without obscuring the currently playing programme, or bring up a virtual qwerty keyboard to search for content.
The nicest way to watch YouTube or iPlayer videos on a TV is to find them on a tablet or phone, and then 'send' them to the TV. If the TV doesn't have this functionality built in, a games console or inexpensive dongle will add it.
Don't get me wrong, a traditional IR remote is good for adjusting the volume or flicking between a few favourite channels, but a touch-screen is better for more involved functions.
Sony will soon be using Android on their TVs, just as LG and Samsung use former mobile phone OSes on their TVs - webOS (from Palm) and Tizen respectively (so I'm not sure who Firefox think they're courting).
... some Personal Media Players with the WMC logo on them? Part of MS's strategy around 2003 was to have portable devices with the WMC interface on them.
In practice, most people used iPods or other devices with their own proprietary user interfaces. Or they installed Rockbox on their device.
There are a lot of solutions out there, varying in capability, ease of use, reliability and cost.
Do you need to change your software for the time being? How many tuners do you need? Is buying new dedicated hardware out of the question? Do you have any special requirements, such as automation? Are you planning on buying a new TV in the near future - since some of them can record to USB media?
avforums.com might be a good place to look and to ask.
It was facing competition from dedicated DVRs, Video on Demand, more dubious sources of content from the internet, inexpensive media-playing dongles, older games consoles re-purposed as the same (indeed, that's how XBMC got its name), and Blu-rays as a more practical source of HD video for many people...
The Media Centre interface worked well with a traditional IR remote controller for local media. However, I find that touch-screen tablets and phones work better as remote controllers for selecting online content on a TV - because a virtual qwerty keyboard makes searching for content easier, and content can be selected and queued *before* it is sent to the big TV screen (the interface isn't obscuring what is currently playing). This functionality is available with many a combination of iOS and Android devices, modern games consoles and dongles like Chromecast.
The integration of human-approximating language patterns into amanfromMars1's communication modules has come on by leaps and bounds, I see.
Or women with three breasts.
HAL didn't malfunction. He followed the orders that were given him. Unfortunately for his human crew, his original mission orders were added to at the last minute by some secret and poorly thought-through orders, and HAL resolved the conflict in accordance with orders he was given. As a computer, he performed perfectly.
If HAL knew that "[human] performance decrements, memory deficits, and loss of awareness and focus during spaceflight may affect mission-critical activities", it would only support his decision to remove unreliable factors (humans Poole and Bowman) from the situation.
Quite why HAL killed the sleeping mission specialists, I don't know. They couldn't have interfered with HAL's successful completion of the mission if they were hibernating.
>Just make sure there's enough spare brains in the fridge so they don't eat each other.
Just send the fridge of brains to Mars orbit, then interface with surface robots. Brains weigh less than bodies, so can be sent more quickly thus reducing exposure to cosmic rays.
Okay, we're a long way from having serviceable brain-machine interfaces, but it is in the 'bloody difficult' category, not the 'violates the laws of physics' zone. Initial research into it is paid for by other people (i.e. military and healthcare), looking to make better prosthetic limbs.
>Throw away the ascientific EmDrive, then we will talk.
How is it not scientific? They observe results that they can't explain, so it is tested by other teams, and then tested again with less room for error. By following this path, eventually either the physicists will have learnt something new, or the engineers will discover what they have overlooked in the experiments (i.e. an explanation for the observations that doesn't give the physicists such a headache).
There is still room for error, but less since the most recent tests in a hard vacuum.
I'm not saying that it works, but we don't yet know that it doesn't work.
>All of which leads me to the only sane conclusion: in order to get to mars safely, we need to use the moon as a spaceship.
There is also the Mars Cycler concept promoted by Buzz Aldrin:
As I understand it, you get a bloody great rock into a cyclic path back and forth between Earth and Mars, and then human craft hop on at one end and hop off at the other. The mass of your rock isn't an issue, so more shielding can be used.
As regards going faster, do you remember those tests of an Electromagnetic Drive the Reg reported on? Last year thrust was observed without the expulsion of any propellant. Last month, it was tested in a vacuum and thrust was still observed.
Please note, despite the linked article jumping enthusiastically to potential spaceflight applications, this isn't yet a prototype engine; engineers may still have overlooked some alternate cause of the observed 'thrust', or physicists have some rethinking to do.