Re: 9,223,372,036,854,775,808 sha1 calculations
There are a total of 150 bits different, by the way.
They have published the two documents, I've downloaded them and confirmed their claims.
They are 422435 byte PDFs, differ in 62 of those bytes, have the same sha1 hash, and show different contents (the background colour). I haven't checked if there are any other differences in the display.
I haven't worn a watch since I got my first GSM phone (Ericsson T10s) that showed the time in around 1999 or 2000.
But I went into half a dozen stores at the beginning of November trying to buy an iWatch and no one had the Series 2 for sale, only for display. I'll try again next week when I pass through Dubai airport.
You won't sell many if they're not actually available.
To argue about spaces vs tabs is to miss the point of the article. However…
I really don't care which, or how many spaces, or how wide tabs are. Just put a suitable description string in a comment in the file and Emacs (or one of the many editors that also understands it) will automagically put in the appropriate number of spaces or tabs or some combination thereof whenever you press the return key, or the tab key near the beginning of the line, or the semicolon at the end of the line.
Once that comment is there, I don't have to worry my silly little head about spaces or tabs again.
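For example, a file-local variables line like this at the top of a C file (the exact settings here are just an illustration) is enough for Emacs and compatible editors to pick up the style automatically:

```c
/* -*- mode: c; indent-tabs-mode: nil; c-basic-offset: 4 -*- */
```

Swap `nil` for `t` and the same file becomes a tabs file; either way the editor, not the human, does the counting.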
The Red Arrows have always used a cheap trainer, which seems like a shame for such a populous, rich, and important country!
In my opinion even the RNZAF's A4 Skyhawks were a better display plane, let alone the RAAF's F/A-18s.
The USN Blue Angels now use the F/A-18, before that the A4, and before that the F4 (which the UK used to have a few of).
I think the F/A-18 (especially the F model) is the best display aircraft in the world right now, because:
- real fighter size and power
- not too expensive to purchase or run at $60m, vs $100m for Typhoon, and probably $40m for new Hawks.
- impressive high-alpha manoeuvres and very tight loops and turns
Here's my own shitty iPhone 4 video of an RAAF F/A-18 from a few years ago. Not being zoomed in too much (ahem) gives a good perspective on how small the display box is.
I think the UK should bite the bullet and use the Typhoons. Would be impressive. And the UK can afford it.
The article is ridiculous. The Alpha (even the original 21064) was perfectly good at bit-bashing and implementing any compression format you wanted.
Thanks to Anton Ertl, here are some timings of decompressing a 3 MB deflated file to 10 MB (i.e. gzip, zlib etc) using zcat:
0.351s 280m clocks 800MHz Alpha 21264B (EV68) 2001
0.258s 310m clocks 1200MHz ARM Cortex A9 (OMAP4) 2011
0.224s 240m clocks 1066MHz PPC7447A 2003
0.116s 230m clocks 2000MHz Opteron 246 2003
The fastest uses 25% fewer clock cycles than the slowest, and the slowest is not the Alpha.
Yes, the Alpha is the slowest in absolute time, but it's also the oldest and lowest clock speed. The Alpha was always considerably higher clock speed than its rivals made at the same time, at least until the Pentium 4 (which was high clock speed but not fast per clock).
What probably happened is Microsoft took some generic compression CODE (not format) that used byte reads and writes and simply re-compiled it for the Alpha, which didn't have byte instructions. It's not hard to rewrite the code to read 32 or 64 bits at a time and then use the (very good!) bit-banging instructions to split them into smaller parts. But that would take someone a week to do. You'd think Microsoft would have the skill and manpower to do that work, but apparently not.
The resulting code, by the way, would run faster on pretty much every other CPU too.
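A minimal sketch of the word-oriented approach (hypothetical code, not anything Microsoft shipped): keep a 64-bit accumulator topped up with aligned 32-bit loads, and peel code fields off it with shifts and masks — no byte loads or stores anywhere, which is exactly what the early Alphas wanted.

```c
#include <stdint.h>

/* Hypothetical LSB-first bit reader: refills a 64-bit accumulator with
   aligned 32-bit loads, then extracts fields of up to 24 bits with
   shifts and masks.  No byte-sized memory accesses anywhere. */
typedef struct {
    const uint32_t *src;  /* word-aligned compressed input */
    uint64_t buf;         /* bit accumulator               */
    unsigned count;       /* valid bits currently in buf   */
} bitreader;

static uint32_t get_bits(bitreader *br, unsigned n) {
    while (br->count < n) {                        /* refill a whole word */
        br->buf |= (uint64_t)*br->src++ << br->count;
        br->count += 32;
    }
    uint32_t v = (uint32_t)br->buf & ((1u << n) - 1u);
    br->buf >>= n;
    br->count -= n;
    return v;
}
```

Deflate codes are at most 15 bits, so the n ≤ 24 restriction costs nothing.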
I don't get this "could not design a diesel engine that would meet the stricter US emissions standards" thing.
If it meets the standard in a particular software mode then … just leave it in that software mode all the time. Job done, you've designed an engine that meets the standards, no cheating.
So they CAN do it. Presumably that has some adverse effects, such as hurting drivability or power. But they can do it.
> Is more than one car in 10,700 on the road a Tesla, or less? I don't know but my money is on "far less."
According to wikipedia: "An estimated 71,000 Model S cars have been sold in the United States through April 2016". Plus maybe another 5000 Model X. Call it 75000 total.
75000 * 10700 is just over 800 million.
Are there more than 800 million cars on the road in the USA, with population 320 million? No. It's about 260 million, including trucks and buses etc.
So, by that measure, Teslas are involved in 3x fewer accidents than the average vehicle.
I would bet that Teslas are driven at least an average amount of miles, and probably quite a bit more than average.
"Autopilot" seems like a perfectly reasonable name for it to me. At least if you know what autopilots in normal small planes -- or even $50m ones -- actually do.
The autopilots found in virtually all aircraft will stupidly hold the altitude and direction you tell them, and will happily fly you straight into the side of a mountain or another aircraft.
Remember Germanwings Flight 9525? That was an Airbus A320, with one of the most sophisticated autopilots you'll find in a civilian aircraft. But tell it to fly at 100 ft and that's exactly what it will do.
Traditionally an autopilot's most fundamental task is keeping the plane level and relatively straight so the pilot can spend a few seconds or a minute or two reading the map.
Second is altitude hold. Simple ones just let you maintain the current altitude when you turn it on (same air pressure). More complex let you program a target altitude and will automatically climb or descend to it (but in small planes you have to watch the speed and adjust the throttle yourself).
Third is heading hold. Simple ones just maintain the current heading when you enable it. More complex let you move a "bug" on a kind of compass display to set a new heading. Historically, that's twin engine or small turboprop territory. More expensive still will automatically follow a path directly to or from a "VOR" or "ILS" radio beacon.
Fourth is speed control. Historically this is definitely only bigger planes. Otherwise you set the throttle yourself and you get whatever speed you get.
That's the historical state of the art, up until well under 20 years ago, when GPS started to get approved for aircraft navigation. That lets you program in more complex and less restricted flight paths.
But it still knows nothing about mountains. At the most, some autopilots know about the Minimum Safe Altitude (MSA) in the current area -- that is an altitude 1000 ft above the highest point in a 25 mile radius.
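To make the "dumbness" concrete: at its heart an altitude hold is just a feedback loop on the number the pilot dialled in, with terrain nowhere in the equation. An illustrative proportional controller (not real avionics code, obviously; the gain and limit are made-up numbers):

```c
/* Illustrative only: a bare proportional altitude hold.  It chases the
   commanded altitude and nothing else -- no terrain, no traffic. */
static double pitch_command(double target_alt_ft, double current_alt_ft) {
    const double gain  = 0.01;   /* deg of pitch per ft of error (assumed) */
    const double limit = 10.0;   /* clamp to +/- 10 deg of pitch           */
    double cmd = gain * (target_alt_ft - current_alt_ft);
    if (cmd >  limit) cmd =  limit;
    if (cmd < -limit) cmd = -limit;
    return cmd;
}
```

Tell it 100 ft and it will serenely command a descent to 100 ft, mountain or no mountain.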
As for avoiding other aircraft, the "TCAS" system was introduced around 1990 (but very rare then). This allows planes to pick up the radar transponder replies from other aircraft, determine where they are, and warn the pilot if there is a possible collision. This is now required for aircraft with more than 19 passengers, but in almost all cases it only gives a voice warning to the pilot who must still manually respond. It is also limited to telling the pilot to climb or descend to avoid the other aircraft, never to turn.
In 2013 Airbus announced that integration of TCAS with the autopilot could be retrofitted across their range. The A380 (only) had already had the ability for a few years. Given the reluctance of airlines to spend money, and the size of fleets, this will have made its way into only a tiny percentage of planes by now.
Just want to mention a couple of things I noticed in comments:
-- the current standard Raspberry Pi model (Pi 3) is in fact 64 bit ARMv8 already. There is not yet OS support for 64 bits, but that's kind of related to this article :-) I expect there will be soon enough. Of course there are still older models in service, and the Zero is not even ARMv7 (no Thumb2).
-- the armel vs armhf issue seems to have been decided for a while. All the common Linux distros seem to have already adopted armhf. Tizen is still armel. That's probably the most important one. Interestingly, armel vs armhf doesn't affect the kernel interface. No syscalls pass FP values directly as arguments, so the kernel is absolutely agnostic on this issue, and you can run both armel and armhf userland on the same kernel at the same time. I know this: I'm doing it every day to do armel dev on Pi and Odroid boards running armhf distros of Linux.
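Incidentally, the float ABI a binary was built for is recorded in the ARM ELF header's e_flags field, which is how you can tell which userland a given executable belongs to. A sketch (the flag values match the ARM ELF ABI and glibc's elf.h, but check your own headers):

```c
#include <stdint.h>

/* Flag bits from the ARM ELF ABI (same values as glibc's <elf.h>). */
#define EF_ARM_ABI_FLOAT_SOFT 0x200u  /* armel: FP args in integer regs */
#define EF_ARM_ABI_FLOAT_HARD 0x400u  /* armhf: FP args in VFP regs     */

/* Classify a binary from the e_flags word of its ELF header. */
static const char *arm_float_abi(uint32_t e_flags) {
    if (e_flags & EF_ARM_ABI_FLOAT_HARD) return "armhf";
    if (e_flags & EF_ARM_ABI_FLOAT_SOFT) return "armel";
    return "unknown";
}
```

The kernel never looks at this bit, which is why the two userlands can share a machine.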
The Pi3 is quite nice. I have one on my desk. But I also have an Odroid XU4, possibly the best bang for the buck in ARM boards.
The XU4 is $74 vs $35 for the Pi. But it's not actually twice the price, because you have to add at minimum an SD card to both, which is going to cost you a bit for a decent sized fast one. You also have to add a power supply to the Pi, which is probably going to set you back $20+. The XU4 comes with a 20 W power supply.
If you want a headless server to SSH into then you're done. Otherwise, add keyboard, mouse, monitor to both and the price difference starts to look trivial.
Once that's sorted out, the XU4 also has:
- about 2.5x the CPU power in each of the main 4 cores (not even counting the LITTLE ones)
- 2 GB of fast DDR3 RAM vs 1 GB
- gig ethernet vs 100
- four or five times faster SD card interface (when used with a good card)
- PLUS an eMMC interface that is twice as fast as the SD card
- USB3 vs USB2
What's missing on the XU4?
- WIFI, Bluetooth, and 3.5mm sound out (but HDMI has sound)
The XU4 competes closely per core with a similar-MHz Core or Core 2. Think "White MacBook" of 2008, or a MacBook Air, or something like that. But it's got twice as many (fast) cores.
"It's mainly Emirates that try to squeeze 3-4-3 into a 777, most carriers stick with 3-3-3."
That seems to be contradicted by ...
... and ...
... both of which show 3-4-3 in Air NZ 777-200ER and 777-300 with 17.1 width and 32-33 pitch.
True, the older 777-200ER V1 shows 3-3-3, with 17.8 width. I don't believe any of those remain in service?
By contrast, Emirates A380 has 18.0 seat width in economy in 3-class config. For comparison A320/321 I've flown recently include Jetstar (17.8 or 17.9) and Aeroflot (18.0).
"carriers (well most of them) want to operate smaller aircraft more frequently. gone are the days of one flight per day to popular destinations"
I'm not so sure about that.
Here's a shot of the departures board in Auckland airport last Saturday while I was waiting to board EK413: https://pbs.twimg.com/media/Ci4oS5CUUAAOQU3.jpg
Not one, but three A380 flights departing from the same far-flung small city to Dubai within a 45 minute period, one each via Sydney, Melbourne, and Brisbane.
My flight was, I guess, about 90% - 95% full on both the Auckland to Sydney and Sydney to Dubai legs.
I don't dispute that the A380 is ugly, but there is nothing else that I'd be prepared to spend a total of about 18 hours in. The economy class seats are the biggest and most comfortable in the sky. The competing B777s (and alternative Emirates flights too) have the most cramped seats in the sky -- Boeing designed the plane for 3-3-3 seating but the airlines forced them to squeeze in 3-4-3. As a result, 777 seating is WORSE than what you have in a B737 or A320 for puddle jumper flights.
Note: I haven't yet encountered a B787. I believe the environmental conditions are similar to the A380 -- quieter, slower decompression and recompression, and higher humidity than previous planes. I don't know about the seats.
WHY is very slow deflation -- at a rate matching the general increase in wealth and productivity -- a bad thing?
STUFF gets slowly cheaper. What's wrong with that?
I fear that Tim has got a bit confused about things such as gold and bitcoin.
They are not the only kind of money, they are only the cash. Both still allow the creation of money via entries in ledgers.
Tim is correct that as we get richer we need more cash -- or at least a greater *value* of cash. Both gold and bitcoin can increase in value arbitrarily so that even though there is a fixed amount of each, the value of that amount can increase.
For this to work, the cash has to be infinitely divisible, so that the smallest unit of cash can still buy one cheeseburger, not 1000 cheeseburgers as a minimum.
Gold is in theory infinitely divisible, but it could get a bit impractical when you need an electron microscope to count your cash to buy a cheeseburger.
Bitcoin is in actual fact infinitely and conveniently divisible.
Note: the Satoshi is 1/100,000,000th of a bitcoin, and is currently the smallest unit. That in itself will allow a very very long time of expansion of the value of bitcoins before it becomes a problem. But if it ever does become a problem, all that is needed to fix it is a simple software update of the bitcoin protocol.
... this could be one reason that Apple does so well since St Jobs came along and axed dozens of indistinguishable models of computer and replaced them with the 4-way desktop/laptop vs consumer/pro matrix of models. (plus choice of RAM and disk sizes within each model)
They've continued this with iPods, iPhones and iPads. There are currently five models of iPhone (5s, 6, 6+, 6s, 6s+) and five of iPad.
I don't even know where to start when someone asks me to help them decide on a Windows computer or Android phone. And I'm a computer professional working on the guts of Android (the Java compiler/runtime) for a major Android phone manufacturer!
Yes, in NZ all lanes are legally equal, and in some way separate roads. It is regarded as polite for slow traffic to keep left, but I don't think it's enforceable.
I once got pinged for using a motorway onramp to overtake, there being three vehicles driving persistently side by side in the three lanes for some km. My offence, apparently, was changing lane from a proper motorway lane onto the onramp -- not the overtaking itself.
In fact Samoa (formerly a NZ territory and still very closely linked) changed from driving on the right to driving on the left on 8 September 2009. They also, by the way, changed time zones by 24 hours in December 2011, to be in the same day as NZ & Australia instead of nearly a day behind like Hawaii (and American Samoa).
In NZ there are distinct "passing lanes" in which the fast traffic merges with the slow, and "slow vehicle bays" in which the slow traffic merges with the fast.
The problem with slow vehicle bays is that no one driving a car ever thinks of themselves as "slow" (trucks are better about this), so I very often end up overtaking two or three cars on the left side, using the slow vehicle bay.
Excuse me while I take a Sharpie to every AA NiMH cell I've ever owned...
Do you know what you sound like? I'm reminded *so* much of those MS-DOS die hards when Windows 95 was already out.
Early Android was pretty much crap, but even an iPhone fanatic has to admit that Lollipop and even Kitkat aren't terrible. There's no real reason to switch from iOS to Android (or Mac to Windows) but for those without an investment in apps and accessories there isn't a lot of downside either.
Recent Windows phone too.
But anything pre-iPhone is positively primitive. We've got the cheap computing power now, there's no reason not to use it. The better phones now have more computing power than Mac and Windows laptops had when the iPhone came out.
Eddy, yes and that is exactly why the Dow should be ignored. Price weighting makes no sense whatsoever. Market cap weighting is the only meaningful way to make an index. Ridiculous that a stock having a split should change their contribution to the index, or move the index at all.
This is pretty silly. Most of the bugs found on OS X were in SSL, bash and so forth that are present on Linux as well, just not in the *kernel*.
Is there any evidence that this shonky outfit actually invested people's funds in anything, let alone in bitcoins? It sounds like it's just a pure con for the greedy.
The reporting seems very unclear to me whether people paid in dollars, or bitcoins, or what.
HK$400,000 is about US$50k. If that is promised to pay out 90 bitcoins, that's only about US$20k at the moment, though it was up to $90k at this time last year.
The first is that the only resource from the Earth we're actually using up is energy. And the actual atoms still exist even after we've thrown things away.
Eventually, our garbage dumps will become the highest concentration sources of iron, copper etc and we'll start mining them.
The other reason is that the multiplier between physical resources used and economic value can be as close to infinite as you can imagine.
Suppose a comedian stands up on a stage for an hour, and then sells the recording to 100 million people as an internet download for $10 each. That's a billion dollars of value created from the expenditure of how many resources? A Big Mac, pretty much.
What on earth is this "of course"?
SpaceX already has massively lower costs than ULA even if they throw away every Falcon 9 after use, just like ULA does.
If/when they do manage to reuse engines or stages, that will drop their costs massively again.
But they don't *need* it. Let alone "of course".
Interesting the amount of ad hominem against Mr Page. An argument equally as invalid as "consensus".
No, that's not really the case. Even under the most extreme predictions, it is vastly cheaper to adapt to changes in climate than to try to make deliberate adjustments to the climate of the whole world.
I would place a reasonable sporting bet that it will be cooler in 2030 than it is now ... maybe not as cool as 1990, but maybe 1995.
I should get pretty good odds on that, right?
Have SpaceX actually said they'll attempt a landing with the Deep Space Climate Observatory launch?
A landing attempt means keeping fuel in reserve, and not using it for the primary mission, which subtracts from the available launch performance.
Geosynchronous or interplanetary launches normally require the full Falcon 9 performance capability.
ISS and other low earth orbit launches don't require maximum performance, and that's when they've been trying the flyback experiments.
Yes, it's very sad that there is no quad core option now. However I don't think it was Apple's choice to make :(
If you check the aforementioned I5-4260U, I5-4278U, I5-4308U, I7-4578U CPUs you'll find they all use the BGA-1168 socket and the I7-4578U is the newest and highest spec'd chip for that socket. In order to sell a quad core machine Apple would have to use a different motherboard, which isn't really sensible for what would no doubt be a low volume machine, even as Minis go.
The quad core chips at the moment use socket BGA-1224. (and have a 35W TDP rather than 28W, but that shouldn't be a show stopper).
I'm sympathetic to the argument that you can build a PC cheaper. My current (i7 4790k) and previous (i7 860) OS X work machines have been home build Hackintoshes using the fastest chip available -- and then overclocking it -- at a fraction of Mac Pro (or even iMac) price.
However, the iMac is very small unobtrusive and quiet. These new ones also use *6* Watts when sitting idle (http://support.apple.com/en-us/ht3468), which is what most home servers do most of the time. My home made PCs use around 75W sitting idle. Over the course of a year that's a difference of 600 kWh, or about 100 quid less electricity for the Mini. As the 1.4 GHz one only costs 400 quid in the first place, you're probably ahead vs a home made PC within two years.
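Spelling out that arithmetic (the ~17p/kWh tariff is my assumption; plug in your own):

```c
/* Annual electricity saving of a machine idling at mini_w watts vs one
   idling at pc_w watts, both running 24/7, at a given tariff in GBP/kWh. */
static double annual_saving_gbp(double mini_w, double pc_w,
                                double gbp_per_kwh) {
    double kwh_saved = (pc_w - mini_w) * 24.0 * 365.0 / 1000.0;
    return kwh_saved * gbp_per_kwh;
}
```

69 W around the clock is about 604 kWh a year, so roughly £103 at that tariff.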
To comment on a few messages at once...
Before the bicycle, the average person married someone born within two miles of them. With the bicycle, it jumped suddenly to 10 miles. (just adding numbers to the point already made) To extend it, my parents were born 500 miles apart (thanks to summer work experience in a university course), while my fiancee and I were born 10250 miles apart (thanks to internet).
The money spent by Branson and his rich customers is not burned. Or at least very very little of it is :-) 99% plus goes to pay the salaries of a lot of people, who in turn buy goods and services from yet other people.
The energy cost for a human to get to orbit is about the same as flying from Sydney to Los Angeles on a 1970's 747. Airliner efficiencies have improved since that calculation, but it's certainly still less than flying from Sydney to London, which probably half the population of Australia has done at some point. The main difference is that a 747 can be reused an hour after it lands. The second difference is that a 747 at takeoff is half plane, half fuel and passengers, while chemical rockets need to be 90% (staged) - 95% (single stage) fuel at takeoff. Put those together and there's no physics or economics reason for a ticket to orbit to cost more than a flight in 1st class from Sydney to London.
Yes, what Branson did ten years ago with SpaceShipOne is pretty much what the USAF did in the early 1960s with the X15. The difference is Branson & Paul Allen spent $30 million, while the X15 program cost $300 million in 1960s money, which would be similar to about $2.4 billion today.
Good grief ... I spend NZ$19 a month for more phone service than I need (especially minutes and SMS). In 24 months that's $456 or US$377 at today's rate.
The Fire is only $0.99 on an $80 per month two year contract. A total $1920.99 commitment. Without a contract it's $449. And locked to AT&T.
Tripp Lite patent: http://www.google.com/patents/US7094086
Apple patent: http://1.usa.gov/1sQGcQ1
A quick read (could be mistaken) seems to show a difference: the Tripp Lite one has the tongue slide either way, while the Apple one is about materials that allow it to flex to either side while maintaining contact.
AMD had a great run. The K6 and K6-2 were great (though I never owned one .. I kept a Pentium Pro 200 Linux machine for quite a while).
I built an Athlon 700 in early 2000. And then a few years later in early 2004 an Athlon 3200+. Both matched or beat Intel's chips at the time, and for less money.
And then AMD saved us all from Intel's vision of 32 bit x86 forever (and the awful Pentium4 at that) and Itanic if you wanted 64 bits.
I never did own a true AMD64. The Core 2 Duo was just as fast in a laptop and on the desktop, and by the time that felt slow my best option seemed to be (in Jan 2010) building a quad core i7 and overclocking it.
And I've just this week built an i7-4790K box (reliably doing 4.6 GHz on 4 cores, 4.9 GHz single core) that should see me for the next few years.
AMD doesn't have anything to touch the i7, and certainly not for $400. (they're ok if you want a low end box, but I don't)
It's very very sad. I'd buy AMD by preference if they were at all close.
If you look at their code, it's just some statically allocated multidimensional arrays, some simple nested loops (corresponding to array dimensions), and a small amount of arithmetic and occasional if statement.
The C code doesn't give you the safety net of array bound checking, and the for loop syntax is annoying to the uninitiated, but there's really not a lot of difficulty in using C or C++ in this way. Certainly the development time difference between it and Python would be absolutely minimal.
Interesting that their test Mac is a 2.3 GHz quad core i7 (model unspecified), while the Windows machine is a quad core i7-3770 at 3.4 GHz, and yet the execution times are pretty much identical or even faster.
The conclusion from that is the Mac is a laptop with massive SpeedStep taking it up to around 3.3 or even 3.5 GHz (current 15" Retina MBP).
Python sounds reasonable. But I'd be likely to do it as a cron job, written largely in bash. It might well call a bit of python for some reason or other.
p.s. I am well acquainted with optimising the sh*t out of things in C (or asm if necessary), having, for example, been involved in writing a commercially successful (right up until iPhone and Android came out) compiler to turn Java games into native code on cruddy little feature phones with 1 MHz CPUs and as little as 400 KB of RAM.
I've got an iPad with an A7 in. It benchmarks and feels nearly identical to my 2.5 GHz core2duo laptop. That's plenty for a MacBook Air style laptop even today.
Where, by "over a year", you mean "the last TWO generations of iPhone and iPad"?
It's not about whether this particular model is or is not a good or mature one. It's about long running numerical calculations being pretty much inherently unstable unless heroic measures are taken to organize every step in them for minimum error. Just getting a simple gaussian elimination right is a complex and specialized task.
We KNOW that typical climate scientists are not numerical analysis whizzes. Recall that in FOIA2011 email 1885.txt, Phil Jones admits to not knowing how to calculate a linear trend in Excel (or anything else):
It's very likely that other, more mature, climate models used in publications:
1) have serious numerical analysis instabilities, to the point of being GIGO
2) have never been run on more than one kind of computer
3) would show exactly the same kinds of problems if they were
And I have a few times.
A bit tough if you're only going to have one quid on the first day. Easy if you get five pounds at the start. If you've only got to achieve a £1/day overall spend rate then you can have your choice of rice, pasta, potato, flour each day. Ingredients for a home made loaf of bread are about 30p. Spices to make it interesting are dirt cheap once you acquire them as you use very little.
Avoiding feelings of hunger will be easy.
Getting enough energy intake will be easy.
Even getting enough protein will be easy with eggs and cheese.
Getting a long term healthy balanced diet won't be at all easy, but that doesn't matter if it's only five days.
Flywheels are really great at absorbing or providing huge amounts of power over short timescales: a few seconds to maybe a few minutes. They're also great for a very large number of charge/discharge cycles. I don't think there's any successful example of using them to store energy for days.
A Trojan IND17-6V battery weighs 188 kg and stores 7.21 kWh of energy (if you take 4 days to extract it; about 5.5 kWh if you extract it in one day). It is designed for 1500 cycles at 80% discharge depth or 5000 cycles at 20% discharge depth. It costs about $1250.
So two of these batteries will hold as much energy, weigh 380 kg (840 lb), cost $2500, and last 4 - 12 years depending on how far you discharge them each time on average.
Clearly, both newspapers had simply read early Apple UI guru Bruce Tognazzini's purely "what if?" blog post from a week ago:
It doesn't take any more than that to explain the close timing in the media, and *certainly* it doesn't mean Apple has a product on the verge of release.
It would be out of character for them not to have been playing with the idea in the labs for a few years already though, waiting for the gating technology to hit.
It's entirely possible to make a video unavailable in only one country while others elsewhere continue to watch and comment on it.
"Remove the cycles and look at the underlying trend."
The problem is we don't have enough hard data to know whether the underlying trend is accelerating, or just part of a longer cycle.
We do know that there have been some pretty amazingly big temperature extremes in the last 20 or 30 million years. And that the Earth has not raced off to become another Venus or another Mars in the process … it's corrected itself somehow and come back to what we have now.
20 or 30 million years seems like a very long time compared to the couple of thousand years of recorded history, until you realize that mammals have been around for 200 million years and great apes (i.e. very like us) for 40 million years. They didn't have our technology to help them adapt to extremes, but they survived them.
The point that natural cycles, if they exist (as the Met Office is now agreeing with skeptics on), can both accelerate and stall temperature rises is an excellent one.
The alarmists are now saying "sure temperatures are stable now, but that just means they'll go up in a hurry later on".
That's likely to be correct, but misses the point.
The point is that what we care about is the possibility that warming is being caused by humans AND will be harmful AND that we could do something about it by decreasing our activities. If it turns out that the rapid warming in the 80s and 90s wasn't caused by us then there will be very little that we can do about any future rapid warming anyway, and we'll just have to learn to live with it.
Climate scientists work out all the natural things that they can think of that affect the climate, put them into mathematical models, and then just ASSUME that any temperature rise in excess of that was caused by us, and specifically by our CO2 emissions. This has in the past caused them to think that doubling CO2 from pre-industrial times could cause a temperature increase of 3 degrees or 4 degrees or even, in some scary reports, 6 degrees.
If you instead assume that there is an approximately 60 year natural cycle that is flattening temperatures now and caused them to rise faster in the 80s and 90s (and decrease in the 50s and 60s), and continue to assume that all the rest of the unaccounted-for change was caused by CO2, what you come up with is that a doubling of CO2 levels from pre-industrial times might cause a 1.5 degree temperature increase.
That's a lot less scary than 4 or 6 degrees. And we've already seen 0.8 C of that.
There is also the possibility that there are more yet longer cycles not yet accounted for in the models. Good temperature records only go back 150 years, but there is considerable anecdotal evidence of an approximately 1000 year cycle that we're in the rising part of.
There is no such single thing as "considered statistically significant". It depends what your purpose is.
Among real scientists 2 sigma or 95% is considered to be the point at which you say "oh, that looks interesting, let's research it more to see if it's real" (i.e. try to add a few more sigma on to the probability).
The following article shows how physicists deal with such statistics, and don't consider something to be proven unless it reaches 5 sigma or 99.99994% probability that the observations didn't happen by chance.
That discussion is largely in the context of deciding whether or not the Higgs boson is real, a subject of absolutely no practical importance to anyone not in that academic field.
That something as serious as hurting the world economy to the tune of trillions of dollars and condemning millions of the most vulnerable people to starvation and death via raising the prices of food and energy should be decided by mere 2 sigma evidence is utterly ludicrous.
I roast my own in a popcorn machine too. Green beans cost about 40% of what roasted beans cost in 200g quantity (I can't buy more otherwise it's stale before I use it), but before roasting they keep no problem for many months. It takes about ten minutes to roast a batch and it's super fresh.
Thanks for the twitter link to the research: http://t.co/HhvDJuAk
The paper's three scenarios assume an old IPCC prediction of 1.8°C to 4°C temperature increase by 2100, which was based on the sudden increase from 1980-1998 continuing. No one believes that any more.
The actual temperature increase by 2100, if any, is much more likely to be similar to the last century overall, with its rises and falls (e.g. 1940-1980), and therefore considerably better than his "best" scenario.
Aron also raises an excellent point, that wild coffee has clearly survived for many thousands of years, and quite probably across several ice ages and interglacial temperatures warmer than today.
The Ethiopian "plateau" with altitudes from 1300m - 3000m (an 11 C temperature range at 6.5 C/km atmospheric lapse rate) would seem to provide an ideal environment for the wild plants to be able to naturally migrate upwards and downwards tracking temperature changes.
I quickly found one coffee producer in Ethiopia whose web site says they are selling wild coffee, not cultivated, growing at altitudes of 1750m - 1850m, which is below the middle of the plateau altitude range (i.e. in the colder half of the survivable climatic temperature range).
The Arduino documentation is actually really opaque on just what Wiring is. They try to aim things at "artists" who are scared of programming.
What Wiring turns out to be is a source code compatible subset of both Java and C++, along with a library containing classes and functions for memory management, timing, serial I/O, digital and analogue I/O, and so forth.
You can write exactly the same code and run it in either a Java or a C++ environment. This is done with the aid of a preprocessor: in the case of Java the preprocessor wraps your code (variable and function definitions) with scaffolding for a class containing everything; in the case of C++ the preprocessor adds declarations for your functions at the top of the file. In both cases the Wiring library is automatically imported.
When you are using the Arduino IDE there is in fact nothing preventing you from using full C++ rather than merely the Wiring subset. The IDE (which is written in Java) uses a standard copy of gcc (targeting AVR or ARM as appropriate) for the code generation and what runs on the Arduino is machine code.
You can easily link in your own assembly language routines if you want.
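Roughly, the transformation for the C++ target looks like this (a hand-written illustration of the shape of the output, not the preprocessor's actual text; `led_state` is a stand-in for real I/O):

```c
/* What you type in the IDE (a "sketch") is just free functions: */
static int led_state;                 /* stand-in for a real output pin */

void setup(void);                     /* <- the preprocessor adds these */
void loop(void);                      /*    forward declarations for you */

void setup(void) { led_state = 0; }           /* runs once at boot      */
void loop(void)  { led_state = !led_state; }  /* runs over and over     */

/* The core library supplies a main() shaped roughly like this
   (finite here so it can be exercised; the real one loops forever): */
static void run_sketch(int iterations) {
    setup();
    for (int i = 0; i < iterations; i++)
        loop();
}
```

Strip away the IDE and it's ordinary code compiled by ordinary gcc.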
I'll bet this is happening now because SO MANY people have installed the beta of iOS6 (possibly as many as have ICS on Android!), and many are then complaining about how bad the data is in the new (non Google) Maps application.
The beta is NOT there for random people to use in real life — it's there for developers to test their own apps to make sure they will be compatible with the new OS, and to start to use the new APIs. It's perfectly reasonable for the maps data to be strictly sample data at this stage.
If it's still bad when iOS6 is released to the general public then that's the time to complain, not when you jump through hoops to install an early access version meant only for developers.
Biting the hand that feeds IT © 1998–2017