As an iPhone owner
I hope Apple copies that power-intensive apps screen. It would be handy to have, though not if it nagged me to shut down apps. Is that screen part of Android, or is it something Huawei added?
Shoving a card in a slot isn't any slower than waving it, unless you think being slowed down by 0.8 seconds is a problem. What slows you down is not inserting the card in the slot, it is entering a PIN. In the US many banks have limits of around $50 before you need to sign or enter a PIN; below that you just swipe and go.
Quite why anyone would think that waving a card, which can be easily skimmed by someone close to you (like on a bus or waiting in a line) or by a laptop-bag-sized piece of equipment from 15-20 feet away, is more secure than sticking a card in a slot is beyond me, especially when one requires a PIN and the other doesn't. I guess more people need to point this out for your bank to realize that NFC is less secure than a chip even without a PIN, not more secure.
So as always, NFC is a solution looking for a problem. The solution to your problem isn't NFC, it is your bank not requiring a PIN for small transactions.
What's wrong with swiping a card?
NFC is and always has been a solution looking for a problem. Embedding it in a card reduces its security. Adding it to a phone provides nothing that couldn't be done just as well with Bluetooth which has been in every Apple and Android phone since day one.
I will say I am surprised at how many people here are criticizing NFC. I remember these threads a few years ago, and it seemed like there was a lot more support for NFC, and a lot of criticism of Apple for not jumping on the bandwagon. At any rate, I'm skeptical that Apple's failure to implement NFC is responsible for the lack of uptake. It certainly hasn't helped, but I think NFC would have failed even if it had been present on the iPhone 5, because people don't want to pay for stuff by waving their phone around.
It's really not much of a gamble; the US has itself to blame for this by offering a tax holiday a decade ago. Had it never been offered, there wouldn't be so many companies playing this game, because there would be less hope it would happen again.
If a company with overseas earnings uses them to buy assets overseas (like a manufacturing plant/equipment, or a company) then that money need never be repatriated, and never gets taxed at the full US rate. That's why a lot of people think this leads to offshoring of jobs. If a company has US manufacturing plants and they need more (or to replace old ones) they need to spend US-based money to do so. Thus it is a lot cheaper to build that new plant overseas, because that money has been taxed at a much lower rate.
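A back-of-the-envelope sketch of the incentive described above. The rates are illustrative assumptions (the roughly 35% US corporate rate of the era and Ireland's 12.5%), and the foreign-tax-credit mechanics are simplified:

```python
# Illustrative only: $1B earned overseas, 35% US rate, 12.5% foreign rate.
# These numbers are assumptions for the example, not any company's actuals.
overseas_earnings = 1_000_000_000
us_rate = 0.35
foreign_rate = 0.125

# Build the new plant overseas: only the lower foreign tax has been paid.
overseas_budget = overseas_earnings * (1 - foreign_rate)

# Build it in the US: repatriating the cash triggers the full US rate
# (simplified here; in practice a credit applies for foreign tax paid).
us_budget = overseas_earnings * (1 - us_rate)

# The gap is the built-in incentive to put the plant overseas.
extra_to_spend_offshore = overseas_budget - us_budget
```

On these made-up numbers the company has $875M to spend offshore versus $650M at home, a $225M thumb on the scale against building in the US.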
The problem is that even if they get Ireland to change their taxation structure, they have to get all the countries to do so to fix the problem. I'm sure the accountants at Apple, Google, Microsoft, Cisco, and so on have already identified a plan B and possibly plan C and D for any change to the tax laws in any country they operate in.
This would be best addressed by treaty, but it would be hard to get signatories since countries are always trying to get an edge over each other. Ireland probably gets more from the scheme they allow (which requires a few employees be based in Ireland, and probably some sort of tax is paid, even if it is minimal) than they'd get if they didn't offer anything special to enable it.
Just as an example of this, I live in a city with a decent sized suburb. My city had a department store in a mall that was past its best years, but was still viable due to that department store. The suburb offered property tax breaks of $15 million over ten years for them to build a new store and move five miles.
For the example given, it wouldn't matter because that dumpster is only going online for a split second a couple times a month. If you want a device to be connected to the cell at all times, that's a different matter.
Maybe they need a hybrid pricing scheme for IoT devices that charges for minutes connected to the cell as well as data transmitted, to encourage them to be efficient in their use of resources. And ideally they wouldn't all try to spit out their data at 12:00:01 every night.
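That hybrid scheme could look something like the sketch below; the function name and rates are made up for illustration, not anyone's actual tariff:

```python
# Hypothetical hybrid IoT tariff: bill airtime and data separately,
# so an always-connected device pays for its cell-tower time even if
# it sends almost nothing. Rates are invented for the example.
def monthly_iot_bill(minutes_connected, kb_transmitted,
                     per_minute=0.002, per_kb=0.0001):
    """Return a monthly bill covering both connection time and data."""
    return minutes_connected * per_minute + kb_transmitted * per_kb

# The dumpster that pops online for a split second a couple of times a
# month pays pennies; a device camped on the cell 24/7 pays for it.
burst_device = monthly_iot_bill(minutes_connected=1, kb_transmitted=10)
always_on = monthly_iot_bill(minutes_connected=43_200, kb_transmitted=10)
```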
You think Intel will never update Thunderbolt? If it goes optical like they originally planned (then killed for cost reasons) it could easily handle 16 lanes of PCIe 4.0.
Or Apple might totally change the architecture of the next Mac Pro. Considering how rarely it is updated, they might not come out with another one until PCIe 4.0 is released anyway :)
Michael, I agree if computers were perfect I wouldn't need to do that. Wouldn't need backups either, as they'd use AI to know what stuff I delete I'm going to want again, along with finding free cloud backup services while I sleep and updating my files to the cloud the second they change.
I look at it as cheap insurance. If it took 10 minutes to do I wouldn't bother, but it is quick and painless, and provides a simpler way to look at an older version of a document. If I wanted to use the built-in versioning I would have to look up how to do so. It would take a lot of 4 second manual version saves to add up to the minutes I'd take doing that, so I may come out ahead anyway :)
Why not? That way you lose only those changes you made in the last update - it was able to load to make those changes, so it'll load again (after possibly downgrading the software if a more recent update has a bug that prevents loading of your previously OK n-1 saved version)
If you spend hours making the changes, that's a bad thing, but better to lose hours of updates than months or years worth.
If I change a document, I save it as document_06092014 (reverse the 06 and 09 if you're not American ;))
We have the ability to store terabytes of data, so "wasting" it with a bunch of old versions is cheap insurance for problems like this. It takes about 4 seconds extra when saving, so one time avoided going through the TL;DR process in the article more than makes up for it.
Mere instants after the big bang is presumably when dark matter would have formed, too. If creating a hexaquark particle is so rare that we've only just done so now, what are the odds we'd create one along with whatever it would need to bind with to be stable, and have them collide to form that stable particle?
For that matter, if we did create a tiny blob of dark matter, how would we know? We can't detect it, after all. Perhaps we might be able to tell from the missing energy...anyone know?
Perhaps creating new types of matter will require orders of magnitude more energy than we're able to muster so far. What if the smallest unit of dark matter is a billion times more massive than a neutron? Be pretty hard to find with the LHC.
Just because a hexaquark particle in isolation has a short lifetime doesn't mean that would necessarily be so if we were able to get it together with other exotic or non-exotic particles. What happens if we try to bind a hexaquark and tetraquark together in an exotic atom? Maybe they'll stabilize each other, similar to how binding with a proton stabilizes a neutron.
Who knows what we could learn by experimenting with such material. Maybe that's what dark matter is. Or maybe it could be used someday to build materials with far superior physical properties to ones made of normal matter. We won't know if we don't try, and if we just said "woop de doo" anytime we discovered something new because it wasn't immediately useful at that time we'd still be fighting wars using the thigh bones of animals as weapons.
I'm pretty sure it would be against RFCs to choose "random" MACs for scanning that are assigned as permanent MACs to other devices. Luckily there's a simple way around this - dedicate one of the OUIs assigned to Apple for scanning. We have 65,536x as many MAC addresses as IP addresses, so using one for this purpose wouldn't be a big deal, even if everyone else did the same.
Perhaps there already is a "public" OUI which could be used for this type of thing, similar to the 10.x.x.x and 192.168.x.x blocks for private IP addressing.
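As it happens, MAC addressing already has something close to that "public OUI" idea: the IEEE reserves the locally administered bit (0x02 in the first octet) for addresses that are not burned into hardware. A hypothetical sketch of generating a throwaway scanning MAC that cannot collide with any vendor-assigned address:

```python
# Sketch: build a random locally administered unicast MAC. Setting
# bit 0x02 of the first octet marks it "locally administered" (never
# vendor-assigned), and clearing bit 0x01 keeps it unicast.
import random

def random_scanning_mac():
    octets = [random.randrange(256) for _ in range(6)]
    octets[0] = (octets[0] | 0x02) & 0xFE  # locally administered, unicast
    return ":".join(f"{o:02x}" for o in octets)
```

Locally administered space is even bigger than a single OUI: 46 freely usable bits rather than the 24 host bits one OUI would give you.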
Obsoleting older hardware?
You mean like how iOS 7, introduced in September 2013, didn't support the 3GS introduced in June 2009? Except that iOS 6.x has received two updates since, the most recent a few months ago, so while the five-year-old 3GS is no longer gaining new features, it isn't left out in the cold on security issues.
Go find me a SINGLE Android, Windows or Blackberry device dating from 2009 that is still receiving updates from the manufacturer in 2014. Or a single device dating from 2009 that was still receiving updates in 2012, for that matter.
If you live in the US, you can hardly complain about the capital gains rate, which is pretty low. The way I look at it, the more taxes you pay, the less you should complain, because it means you've got more left over after taxes than other people. I've got a friend who complained about his 8-digit tax bill, and was upset when I told him that if I ever had one I'd go out and celebrate on April 15th, because it would mean I'd earned enough to be retired :)
It is a DEVELOPER conference. They had a lot of stuff to announce for developers, with the new iOS 8 features targeted at them, along with the Swift language. Sometimes they've announced products at the developer conference, but that's usually to cover up for when there isn't much going on that would interest developers.
When they want to announce a new product, or an update to the iPhone or whatever, they send the press "invitation to a special event" type emails, and we have to listen to the press endlessly speculate on what will be announced instead of just waiting a week or two for the actual announcement.
Educating people a bit about how computer programming works, so they understand that computers just follow a sequence of simple instructions that can be as lengthy as required, would be a good thing.
Those who are interested can go further and actually learn to code, but there's no more need for every kid to learn how to write yet another sorting program. It would be akin to forcing kids to learn how to maintain their car. While some might think it would be a good thing for everyone to know how to change a tire or change their oil, most people are happy to pay someone else to do that and those who want to know can learn on their own or take auto shop in high school (do they still offer that in US high schools?)
I remember articles about chatbots (and that's what this is) fooling over half the people tested. That doesn't make them intelligent; it makes the people they fooled stupid, gullible or apathetic (do the test subjects get any sort of reward for guessing right?)
It was rigged by claiming it was a boy from another country, which lets people excuse the sort of obvious mistakes a computer will make as being due to him not being a native English speaker, and not being an adult. Let's try it again with some Londoners thinking they're talking to someone who has lived his whole life in London, some Texans thinking they're talking to someone from Texas, etc. It is a lot harder to fool people who expect you to use the kind of wording, slang and expressions they would use, and to know the local places and landmarks, public figures, and so on.
Even if you fooled 100% of people what you'd have would be a program that's good at carrying on conversations. It would be noteworthy, but hardly a measure of intelligence. Until the computer can truly understand what is being talked about, rather than simply coming up with a likely reply to what is being typed at it, the whole thing is silly.
Turing developed his "test" when people didn't really have any idea what machine intelligence would consist of. He assumed that to carry on a conversation the computer would have to be intelligent. He didn't foresee the ability to have a database of gigabytes worth of facts available at the "fingertips" of the computer that will allow it to fake its way through a conversation well enough to fool people. Passing the Turing test is no more proof of intelligence than beating a person at chess by iterating through all the possible moves is. When computers start coming up with original ideas and inventions unprompted and unprogrammed, THEN they'll be ready to kill all the humans and run the world.
The article implies that development won't happen in the UK at Oxford or elsewhere if the law isn't changed like California's is. That's a load of bull. The cars are still in an early stage and won't be for sale for many years. What's wrong with having someone in the driver's seat during development? Presumably they are nowhere near a stage where the cars will go off for test drives with no passengers at all, so there's no reason that passenger can't be in the driver's seat and have a steering wheel and brakes available to him.
Google developing the car without the driver's seat was a publicity stunt, it has no purpose in the world right now when they are so far away from being able to offer this as a product. It isn't like Google Glass where they can release something for what is essentially a public beta. It will take law changes dealing with liability and such. Until that time, there's zero benefit to having a law that allows cars with no one in the driver's seat, or even to build such cars.
Don't care how cool it looks or how nicely it integrates with a smartphone, etc.
I would consider getting something that helps with fitness, since I'd only wear it during workouts. I've looked at the available options, but nothing has struck me as all that awesome. Maybe if it did something useful while sleeping I'd wear it then too.
Anyone working on stupid stuff like notifying you of a text on the phone that's in your pocket is wasting time, designing features for geeks to show each other ("oh cool, look what I can do") while not providing anything real people want.
I like how Intel is saying it trounces the competition when its "preferred benchmarks" are used. Have we forgotten the AnTuTu business already? I hope not!
As for seeing a lot of Intel tablets, that's because Intel is buying market share (now they call it "contra revenue" instead of "market development funds", but it is the same thing)
Care to back that up with some data? Don't forget to account for like-with-like, so you can't count say a buffer overflow in a bittorrent client since Windows doesn't ship with one!
Also don't forget that one of the benefits of Microsoft's monthly patch scheduling is that they can roll up a lot of fixes into a single security advisory, while on Linux they are patched as they're found. Windows might have 7 IE fixes in a single monthly bulletin, while if Linux had 7 vulnerabilities found in a month it would account for a lot more bulletins.
Space exploration is difficult and risky; trying to make it as safe as flying a commercial jet makes it so expensive that we can't afford to do it.
People seem fine with loss of life in war, or even during basic training when there is no war. Why are they so worried that someone might die pushing the boundaries of human endeavor? All the people who came across the ocean to the US in the 1600s and 1700s, or crossed the US to the west in the 1800s, endured more risk than astronauts do today. If they were as risk-averse as the leaders of our space program, the US would still be wilderness populated by Native Americans living in teepees.
That there is seemingly no shortage of people willing to volunteer for a one-way mission to Mars, where you know you would last only a few years at best and a few seconds at worst, shows how out of touch NASA's administration and Congress are over this.
Which Google doesn't want to do, so they're giving them all individual take-it-or-leave-it contracts. Google didn't even approach Merlin, because they wanted to use their power to screw the little guy.
I wonder how many of those who are apologizing for Google on this would say the same thing if it were Apple handing all the little guys take-it-or-leave-it deals to continue to be carried on iTunes?
The only attackable vulnerability (as opposed to a NULL-dereference crash or DoS), if you're not using DTLS, requires a server running 1.0.1 or 1.0.2.
While Apple is using 0.9.8 on its clients, even if it were also doing so on its own servers, there are plenty of OpenSSL servers out there that iOS and OS X devices may connect to running 1.0.1 or 1.0.2, leaving them just as vulnerable as everything else. At least Apple doesn't use OpenSSL in Safari or Mail, so it isn't as easy to hit this problem as it is on other platforms that use OpenSSL everywhere.
Politicians were (and still are) deathly afraid of getting blamed for making a wrong decision, and trying to make us safer is seen as the "safest" political choice, so they can claim they did something.
Look at the Benghazi situation, and how much worry (granted, mostly partisan) there is over a handful of deaths (not to dismiss them, but it hardly compares to 9/11). Imagine what would have happened to Bush if there had been another big attack several years after 9/11, or to Obama if there had been/will be another during his administration.
They keep these programs secret because if there's a big attack, they can release some details and say "look at everything we've been doing, but even then the terrorists got around it, it's not our fault!"
Remember all the rumors about how amazing Amazon's tablet was going to be, and it turned out it was just another me-too Android tablet with the Googly bits removed?
The shills are out in full force since AMZN has dropped by nearly 30% so far this year, so they're hoping to find fresh suckers to buy the stock that is still massively overvalued, considering Amazon's utter inability to turn a profit despite ever increasing revenue.
I think that movie, and Lake Placid, spoiled me as I'm utterly unimpressed by a 16 ft long 900 lb croc. When I read "monster croc" I was thinking there might be some 40 foot beast in prehistoric times. That would be a reasonable size considering how big some of the other megafauna used to be.
If that doesn't work out, another solution would be covering the front windows with removable monitors that display images collected by cameras on the outside. If the cameras get damaged/dirty or a monitor malfunctions, it can be removed and set aside and they're no worse off than they are today.
Or the pilots could wear HUD helmets like in the F-35, and they wouldn't need windows at all.
When it "flies home", does it follow the exact same path it took to get to that point, or go directly? What about obstacles or areas it shouldn't fly over, or a low flying helicopter? Or running out of fuel if it takes the same path?
That sounds like a good idea in theory, but I'm not so sure it is that ideal of a solution in practice.
Drones ought to be required, by design, to shut off the engine and deploy a parachute if they lose radio contact with the controller or suffer any sort of mechanical failure, so they fall gently to the ground causing no damage.
Well, little damage - if it drops on an expressway it might cause a bit of trouble...
Once people are no longer allowed to take the law into their own hands, you can't let people be truly "free" in the way you suggest, because it is too easy to dodge responsibility for one's actions.
Or do you really believe that many drone owners wouldn't just leg it if they lost control and the last thing they saw was it veering towards the big glass window on someone's house or a playground full of kids?
It wouldn't matter if Intel is bankrolling it; the big TV makers are all pushing 4K for several reasons. One, it is actually more expensive to make the larger pixels necessary for "only" 2K once you exceed 60" or so. But far more importantly, they think it will drive TV upgrades as people come to see their old HD sets as obsolete. It won't, of course; 4K is 3D part deux. But in a couple of years 4K will be a "for free" feature in TVs, just as 1080p displays went from a premium feature to something included in all but the very lowest-end TVs.
Of course we paid for that by Panasonic dropping plasma, which was a heavy price to pay...
The idea that it will drive demand because people take 4K pictures is laughable. Professional/avid amateur photographers, sure. People taking 4K selfies or vacation photos with their iPhone or GS? Yeah, right!
I will be forced to finally upgrade my nearly six-year-old Q9400-based PC when I can buy a quality 32" 4K monitor/TV at a reasonable price. But I'll probably just toss in a $70 graphics card, since I can't see any reason why I should care about upgrading the rest of the system. Sometimes 8GB is tight, and that's as high as I can go with the 945G chipset, but that's because Firefox sucks, not because I actually need more than 8GB.
Sorry Intel, but the idea that you'll sell Broadwell based PCs because of 4K support is a huge stretch.
Most won't notice. Those who do can download the Google Search app and use it instead, just like they can download the Google Maps app and use it if they prefer it.
Actually, given the announcement that iOS 8 will add secure APIs for apps to communicate and replace OS-level functions such as the keyboard, it might be possible to have searches on the iPhone work via the Google Search app and maps functionality work via the Google Maps app. So this might be much ado about nothing: those who care can use what they want, those who don't are steered away from Google.
Yeah, but considering they've been stamping out 14nm chips for months now, and 10nm is the next step, she's only talking about "seeing their way clear" to chips they're going to be mass producing not much more than a year from now.
That's hardly going to reassure people that Intel believes Moore's Law can continue...
Backing up the 200TB Library of Congress? Too bad there aren't hard drives that can store a few terabytes on their own available for next to nothing, then any public library that cared to could replicate the Library of Congress. Or that there aren't cloud providers who measure their storage space in millions of terabytes, who could replicate it hundreds of times all over the world.
While using bitcoins as a backup mechanism is moronic, changing the algorithm used so useful work is done instead of meaningless computation would be a great improvement. If people are going to invest millions in custom hardware, at least have that custom hardware producing results that are useful to the scientific world.
As a side benefit, if there isn't a single algorithm but a range, either that custom hardware would have to be more general purpose, and thus useful to scientists solving other problems, or bitcoin mining wouldn't be taken out of the hands of regular people and relegated to specialized ASIC rigs only.
It would be funny if botnet operators hoping to make a buck mining bitcoins stumbled upon the solution to some major scientific problem. Think about all those bots harnessed for good, instead of (only) enriching the criminal element.
There's a "simple" solution to 4G Glass. Replace your drywall with metal lath-and-plaster walls, and your windows with triple-pane glass with ultra-low-e treatment on the inside and outside layers. Unless those windows overlook a tower, no cellular signal will penetrate that building :)
Biting the hand that feeds IT © 1998–2019