Re: cash is not the only money
WHY is very slow deflation -- at a rate matching the general increase in wealth and productivity -- a bad thing?
STUFF gets slowly cheaper. What's wrong with that?
I fear that Tim has got a bit confused about things such as gold and bitcoin.
They are not the only kind of money, they are only the cash. Both still allow the creation of money via entries in ledgers.
Tim is correct that as we get richer we need more cash -- or at least a greater *value* of cash. Both gold and bitcoin can increase in value arbitrarily so that even though there is a fixed amount of each, the value of that amount can increase.
For this to work, the cash has to be infinitely divisible, so that the smallest unit of cash can still buy one cheeseburger, not 1000 cheeseburgers as a minimum.
Gold is in theory infinitely divisible, but it could get a bit impractical when you need an electron microscope to count your cash to buy a cheeseburger.
Bitcoin is in actual fact infinitely and conveniently divisible.
Note: the Satoshi is 1/100,000,000th of a bitcoin, and is currently the smallest unit. That in itself will last us a very, very long time of growth in the value of bitcoin before it becomes a problem. But if it ever does become a problem, all that is needed to fix it is a simple software update of the bitcoin protocol.
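To make the divisibility point concrete: the bitcoin protocol actually stores amounts as integer counts of satoshis, not as floating point. A minimal Python sketch (not real wallet code) of that accounting:

```python
# Bitcoin amounts are kept as integer satoshis (1 BTC = 100,000,000 satoshis)
# rather than floating point, which can't represent most decimal fractions
# exactly.
SATOSHI_PER_BTC = 100_000_000

def btc_to_satoshi(btc_str: str) -> int:
    """Parse a decimal BTC string into an exact integer satoshi count."""
    whole, _, frac = btc_str.partition(".")
    frac = (frac + "00000000")[:8]      # pad/truncate to 8 decimal places
    return int(whole or "0") * SATOSHI_PER_BTC + int(frac or "0")

# Even a cheeseburger priced at 0.00004 BTC is a comfortable 4000 satoshis:
print(btc_to_satoshi("0.00004"))   # 4000
print(btc_to_satoshi("1.5"))       # 150000000
```

So "infinitely divisible" in practice means: when a satoshi is worth too much, shift the decimal point in the protocol, which is exactly the software update mentioned above.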
... this could be one reason that Apple does so well since St Jobs came along and axed dozens of indistinguishable models of computer and replaced them with the 4-way desktop/laptop vs consumer/pro matrix of models. (plus choice of RAM and disk sizes within each model)
They've continued this with iPods, iPhones and iPads. There are currently five models of iPhone (5s, 6, 6+, 6s, 6s+) and five of iPad.
I don't even know where to start when someone asks me to help them decide on a Windows computer or Android phone. And I'm a computer professional working on the guts of Android (the Java compiler/runtime) for a major Android phone manufacturer!
Yes, in NZ all lanes are legally equal, and in some way separate roads. It is regarded as polite for slow traffic to keep left, but I don't think it's enforceable.
I once got pinged for using a motorway onramp to overtake, there being three vehicles driving persistently side by side in the three lanes for some km. My offence, apparently, was changing lane from a proper motorway lane onto the onramp -- not the overtaking itself.
In fact Samoa (formerly a NZ territory and still very closely linked) changed from driving on the right to driving on the left on 8 September 2009. They also, by the way, changed time zones by 24 hours in December 2011, to be in the same day as NZ & Australia instead of nearly a day behind like Hawaii (and American Samoa).
In NZ there are distinct "passing lanes" in which the fast traffic merges with the slow, and "slow vehicle bays" in which the slow traffic merges with the fast.
The problem with slow vehicle bays is that no one driving a car ever thinks of themselves as "slow" (trucks are better about this), so I very often end up overtaking two or three cars on the left side, using the slow vehicle bay.
Excuse me while I take a Sharpie to every AA NiMH cell I've ever owned...
Do you know what you sound like? I'm reminded *so* much of those MS-DOS die hards when Windows 95 was already out.
Early Android was pretty much crap, but even an iPhone fanatic has to admit that Lollipop and even Kitkat aren't terrible. There's no real reason to switch from iOS to Android (or Mac to Windows) but for those without an investment in apps and accessories there isn't a lot of downside either.
Recent Windows phone too.
But anything pre-iPhone is positively primitive. We've got the cheap computing power now, there's no reason not to use it. The better phones now have more computing power than Mac and Windows laptops had when the iPhone came out.
Eddy, yes and that is exactly why the Dow should be ignored. Price weighting makes no sense whatsoever. Market cap weighting is the only meaningful way to make an index. Ridiculous that a stock having a split should change their contribution to the index, or move the index at all.
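A toy illustration of the split problem (made-up stocks, not real Dow constituents; the real Dow adjusts a divisor so the index level doesn't jump at the split, but the stock's weight in the index still changes):

```python
# Two hypothetical stocks. A 2:1 split halves the price and doubles the
# share count, leaving the market cap (and the company) unchanged.
stocks = {"A": {"price": 200.0, "shares": 1_000_000},
          "B": {"price": 50.0,  "shares": 8_000_000}}

def price_weighted(s):
    return sum(v["price"] for v in s.values()) / len(s)

def cap_weighted(s):
    return sum(v["price"] * v["shares"] for v in s.values())

before_pw, before_cw = price_weighted(stocks), cap_weighted(stocks)

# Stock A does a 2:1 split: nothing real about the company has changed.
stocks["A"]["price"] /= 2
stocks["A"]["shares"] *= 2

after_pw, after_cw = price_weighted(stocks), cap_weighted(stocks)

print(before_pw, after_pw)    # 125.0 75.0 -- the naive price average moved
print(before_cw == after_cw)  # True       -- the cap-weighted value did not
```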
This is pretty silly. Most of the bugs found on OS X were in SSL, bash and so forth that are present on Linux as well, just not in the *kernel*.
Is there any evidence that this shonky outfit actually invested people's funds in anything, let alone in bitcoins? It sounds like it's just a pure con for the greedy.
The reporting seems very unclear to me whether people paid in dollars, or bitcoins, or what.
HK$400,000 is about US$50k. If that is promised to pay out 90 bitcoins, that's only about US$20k at the moment, though it was up to $90k at this time last year.
The first is that the only resource from the Earth we're actually using up is energy. And the actual atoms still exist even after we've thrown things away.
Eventually, our garbage dumps will become the highest concentration sources of iron, copper etc and we'll start mining them.
The other reason is that the multiplier between physical resources used and economic value can be as close to infinite as you can imagine.
Suppose a comedian stands up on a stage for an hour, and then sells the recording to 100 million people as an internet download for $10 each. That's a billion dollars of value created from the expenditure of how many resources? A Big Mac, pretty much.
What on earth is this "of course"?
SpaceX already has massively lower costs than ULA even if they throw away every Falcon 9 after use, just like ULA does.
If/when they do manage to reuse engines or stages, that will drop their costs massively again.
But they don't *need* it. Let alone "of course".
Interesting the amount of ad hominem against Mr Page. An argument equally as invalid as "consensus".
No, that's not really the case. Even under the most extreme predictions, it is vastly cheaper to adapt to changes in climate than to try to make deliberate adjustments to the climate of the whole world.
I would place a reasonable sporting bet that it will be cooler in 2030 than it is now ... maybe not as cool as 1990, but maybe 1995.
I should get pretty good odds on that, right?
Have SpaceX actually said they'll attempt a landing with the Deep Space Climate Observatory launch?
A landing attempt means keeping fuel in reserve, and not using it for the primary mission, which subtracts from the available launch performance.
Geosynchronous or interplanetary launches normally require the full Falcon 9 performance capability.
ISS and other low earth orbit launches don't require maximum performance, and that's when they've been trying the flyback experiments.
Yes, it's very sad that there is no quad core option now. However I don't think it was Apple's choice to make :(
If you check the aforementioned i5-4260U, i5-4278U, i5-4308U and i7-4578U CPUs you'll find they all use the BGA-1168 package, and the i7-4578U is the newest and highest-specced chip for that package. In order to sell a quad core machine Apple would have to use a different motherboard, which isn't really sensible for what would no doubt be a low volume machine, even as Minis go.
The quad core chips at the moment use the BGA-1224 package (and have a 35W TDP rather than 28W, but that shouldn't be a show stopper).
I'm sympathetic to the argument that you can build a PC cheaper. My current (i7-4790K) and previous (i7 860) OS X work machines have been home-built Hackintoshes using the fastest chip available -- and then overclocking it -- at a fraction of Mac Pro (or even iMac) price.
However, the Mini is very small, unobtrusive and quiet. These new ones also use *6* watts when sitting idle (http://support.apple.com/en-us/ht3468), which is what most home servers do most of the time. My home-built PCs use around 75W sitting idle. Over the course of a year that's a difference of 600 kWh, or about 100 quid less electricity for the Mini. As the 1.4 GHz one only costs 400 quid in the first place, you're probably ahead vs a home-built PC within two years.
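The electricity arithmetic, sketched out (75 W vs 6 W idle, running 24/7, at an assumed ~17p/kWh):

```python
idle_pc_w, idle_mini_w = 75, 6            # idle power draw in watts
hours_per_year = 24 * 365
kwh_saved = (idle_pc_w - idle_mini_w) * hours_per_year / 1000
price_per_kwh = 0.17                      # assumed UK-ish price, GBP/kWh

print(round(kwh_saved))                        # ~604 kWh per year saved
print(round(kwh_saved * price_per_kwh))        # ~103 quid per year saved
```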
To comment on a few messages at once...
Before the bicycle, the average person married someone born within two miles of them. With the bicycle, it jumped suddenly to 10 miles. (just adding numbers to the point already made) To extend it, my parents were born 500 miles apart (thanks to summer work experience in a university course), while my fiancee and I were born 10250 miles apart (thanks to internet).
The money spent by Branson and his rich customers is not burned. Or at least very, very little of it is :-) 99% plus goes to pay the salaries of a lot of people, who in turn buy goods and services from yet other people.
The energy cost for a human to get to orbit is about the same as flying from Sydney to Los Angeles on a 1970s 747. Airliner efficiencies have improved since that calculation, but it's certainly still less than flying from Sydney to London, which probably half the population of Australia has done at some point. The main difference is that a 747 can be reused an hour after it lands. The second difference is that a 747 at takeoff is half plane, half fuel and passengers, while chemical rockets need to be 90% (staged) - 95% (single stage) fuel at takeoff. Put those together and there's no physics or economics reason for a ticket to orbit to cost more than a flight in 1st class from Sydney to London.
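The 90%-95% figure falls straight out of the Tsiolkovsky rocket equation. A sketch with assumed round numbers (~9.4 km/s to orbit including losses, a kerosene/LOX-class specific impulse of 350 s):

```python
import math

g0 = 9.81            # m/s^2, standard gravity
delta_v = 9400.0     # m/s to low Earth orbit including losses (assumed)
isp = 350.0          # s, assumed engine specific impulse

mass_ratio = math.exp(delta_v / (isp * g0))   # initial mass / final mass
propellant_fraction = 1 - 1 / mass_ratio
print(f"{propellant_fraction:.0%}")           # ~94% propellant at liftoff
```

Staging relaxes this a little for each individual stage, which is why the staged figure is nearer 90%.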
Yes, what Branson did ten years ago with SpaceShipOne is pretty much what the USAF did in the early 1960s with the X15. The difference is Branson & Paul Allen spent $30 million, while the X15 program cost $300 million in 1960s money, which would be similar to about $2.4 billion today.
Good grief ... I spend NZ$19 a month for more phone service than I need (especially minutes and SMS). In 24 months that's $456 or US$377 at today's rate.
The Fire is only $0.99 on an $80 per month two year contract. A total $1920.99 commitment. Without a contract it's $449. And locked to AT&T.
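Running the totals (the NZD-per-USD rate of ~1.21 is the assumption here, matching "today's rate"):

```python
nz_monthly, months = 19, 24
nz_total = nz_monthly * months       # NZ$456 over 24 months
nzd_per_usd = 1.21                   # assumed exchange rate
usd_total = nz_total / nzd_per_usd   # ~US$377

fire_total = 0.99 + 80 * months      # $1920.99 on the AT&T contract
print(nz_total, round(usd_total), fire_total)
```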
Tripp Lite patent: http://www.google.com/patents/US7094086
Apple patent: http://1.usa.gov/1sQGcQ1
A quick read (I could be mistaken) seems to show the difference being that the Tripp Lite one has the tongue slide either way, while the Apple one is about materials that allow it to flex to either side while maintaining contact.
AMD had a great run. The K6 and K6-2 were great (though I never owned one .. I kept a Pentium Pro 200 Linux machine for quite a while).
I built an Athlon 700 in early 2000. And then a few years later in early 2004 an Athlon 3200+. Both matched or beat Intel's chips at the time, and for less money.
And then AMD saved us all from Intel's vision of 32 bit x86 forever (and the awful Pentium4 at that) and Itanic if you wanted 64 bits.
I never did own a true AMD64. The Core 2 Duo was just as fast in a laptop and on the desktop, and by the time that felt slow my best option seemed to be (in Jan 2010) building a quad core i7 and overclocking it.
And I've just this week built an i7-4790K box (reliably doing 4.6 GHz on 4 cores, 4.9 GHz single core) that should see me for the next few years.
AMD doesn't have anything to touch the i7, and certainly not for $400. (they're ok if you want a low end box, but I don't)
It's very very sad. I'd buy AMD by preference if they were at all close.
If you look at their code, it's just some statically allocated multidimensional arrays, some simple nested loops (corresponding to array dimensions), and a small amount of arithmetic and occasional if statement.
The C code doesn't give you the safety net of array bound checking, and the for loop syntax is annoying to the uninitiated, but there's really not a lot of difficulty in using C or C++ in this way. Certainly the development time difference between it and Python would be absolutely minimal.
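For illustration, a kernel of the kind described -- fixed-size arrays, nested loops, a little arithmetic -- here in Python; the C version differs only in the declarations and for-loop syntax:

```python
# A toy stencil: average each interior cell with its 4 neighbours.
# This maps essentially line-for-line onto C with static 2D arrays.
N = 4
a = [[float(i * N + j) for j in range(N)] for i in range(N)]
b = [[0.0] * N for _ in range(N)]

for i in range(1, N - 1):
    for j in range(1, N - 1):
        b[i][j] = (a[i][j] + a[i-1][j] + a[i+1][j]
                   + a[i][j-1] + a[i][j+1]) / 5.0

print(b[1][1])  # 5.0: the average of a[1][1] and its 4 neighbours
```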
Interesting that their test Mac is a 2.3 GHz quad core i7 (model unspecified), while the Windows machine is a quad core i7-3770 at 3.4 GHz, and yet the Mac's execution times are pretty much identical or even faster.
The conclusion from that is the Mac is a laptop with massive SpeedStep taking it up to around 3.3 or even 3.5 GHz (current 15" Retina MBP).
Python sounds reasonable. But I'd be likely to do it as a cron job, written largely in bash. It might well call a bit of python for some reason or other.
p.s. I am well acquainted with optimising the sh*t out of things in C (or asm if necessary), having, for example, been involved in writing a commercially successful (right up until iPhone and Android came out) compiler to turn Java games into native code on cruddy little feature phones with 1 MHz CPUs and as little as 400 KB of RAM.
I've got an iPad with an A7 in it. It benchmarks and feels nearly identical to my 2.5 GHz Core 2 Duo laptop. That's plenty for a MacBook Air style laptop even today.
Where, by "over a year", you mean "the last TWO generations of iPhone and iPad"?
It's not about whether this particular model is or is not a good or mature one. It's about long running numerical calculations being pretty much inherently unstable unless heroic measures are taken to organize every step in them for minimum error. Just getting a simple Gaussian elimination right is a complex and specialized task.
We KNOW that typical climate scientists are not numerical analysis whizzes. Recall that in FOIA2011 email 1885.txt, Phil Jones admits to not knowing how to calculate a linear trend in Excel (or anything else):
It's very likely that other, more mature, climate models used in publications:
1) have serious numerical analysis instabilities, to the point of being GIGO
2) have never been run on more than one kind of computer
3) would show exactly the same kinds of problems if they were
And I have a few times.
A bit tough if you're only going to have one quid on the first day. Easy if you get five pounds at the start. If you've only got to achieve a £1/day overall spend rate then you can have your choice of rice, pasta, potato, flour each day. Ingredients for a home made loaf of bread are about 30p. Spices to make it interesting are dirt cheap once you acquire them as you use very little.
Avoiding feelings of hunger will be easy.
Getting enough energy intake will be easy.
Even getting enough protein will be easy with eggs and cheese.
Getting a long term healthy balanced diet won't be at all easy, but doesn't matter if it's only five days.
Flywheels are really great at absorbing or providing huge amounts of power over short timescales: a few seconds to maybe a few minutes. They're also great for a very large number of charge/discharge cycles. I don't think there's any successful example of using them to store energy for days.
A Trojan IND17-6V battery weighs 188 kg, stores 7.21 kWh of energy (if you take 4 days to extract it; about 5.5 kWh if you extract it in one day). It is designed for 1500 cycles at 80% discharge depth or 5000 cycles at 20% discharge depth. It costs about $1250.
So two of these batteries will hold as much energy, weigh 380 kg (840 lb), cost $2500, and last 4 - 12 years depending on how far you discharge them each time on average.
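Using those figures, a rough cost per kWh actually delivered over the battery's life (a sketch, assuming you get the full rated cycle count):

```python
capacity_kwh = 7.21          # per battery at the 4-day discharge rate
price_usd = 1250.0           # per battery

def cost_per_kwh_delivered(cycles: int, depth: float) -> float:
    """Purchase price spread over every kWh delivered across all cycles."""
    return price_usd / (cycles * capacity_kwh * depth)

print(round(cost_per_kwh_delivered(1500, 0.8), 2))  # 0.14 -- deep cycling
print(round(cost_per_kwh_delivered(5000, 0.2), 2))  # 0.17 -- shallow cycling
```

Either way it's on the order of 15 US cents per stored kWh before you've paid for any of the electricity itself.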
Clearly, both newspapers had simply read early Apple UI guru Bruce Tognazzini's purely "what if?" blog post from a week ago:
It doesn't take any more than that to explain the close timing in the media, and *certainly* it doesn't mean Apple has a product on the verge of release.
It would be out of character for them not to have been playing with the idea in the labs for a few years already though, waiting for the gating technology to hit.
It's entirely possible to make a video unavailable in only one country while others elsewhere continue to watch and comment on it.
"Remove the cycles and look at the underlying trend."
The problem is we don't have enough hard data to know whether the underlying trend is accelerating, or just part of a longer cycle.
We do know that there have been some pretty amazingly big temperature extremes in the last 20 or 30 million years. And that the Earth has not raced off to become another Venus or another Mars in the process … it's corrected itself somehow and come back to what we have now.
20 or 30 million years seems like a very long time compared to the couple of thousand years of recorded history, until you realize that mammals have been around for 200 million years and great apes (i.e. very like us) for 40 million years. They didn't have our technology to help them adapt to extremes, but they survived them.
The point that natural cycles, if they exist (as the Met Office is now agreeing with skeptics on), can both accelerate and stall temperature rises is an excellent one.
The alarmists are now saying "sure temperatures are stable now, but that just means they'll go up in a hurry later on".
That's likely to be correct, but misses the point.
The point is that what we care about is the possibility that warming is being caused by humans AND will be harmful AND that we could do something about it by decreasing our activities. If it turns out that the rapid warming in the 80s and 90s wasn't caused by us then there will be very little that we can do about any future rapid warming anyway, and we'll just have to learn to live with it.
Climate scientists work out all the natural things that they can think of that affect the climate, put them into mathematical models, and then just ASSUME that any temperature rise in excess of that was caused by us, and specifically by our CO2 emissions. This has in the past caused them to think that doubling CO2 from pre-industrial times could cause a temperature increase of 3 degrees or 4 degrees or even in some scary reports 6 degrees.
If you instead assume that there is an approximately 60 year natural cycle that is flattening temperatures now and caused them to rise faster in the 80s and 90s (and decrease in the 50s and 60s), and continue to assume that all the rest of the unaccounted-for change was caused by CO2, what you come up with is that a doubling of CO2 levels from pre-industrial times might cause a 1.5 degree temperature increase.
That's a lot less scary than 4 or 6 degrees. And we've already seen 0.8 C of that.
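For what it's worth, the proportionality here is simple arithmetic: the implied sensitivity scales directly with the fraction of the observed warming you attribute to CO2. A back-of-envelope sketch (illustrative concentrations, plain arithmetic, not a climate model):

```python
import math

# Implied climate sensitivity (deg C per CO2 doubling) scales with how
# much of the observed warming is attributed to CO2.
c0, c = 280.0, 400.0              # assumed pre-industrial and recent CO2, ppm
doublings = math.log2(c / c0)     # ~0.51 doublings so far

observed = 0.8                    # deg C of observed warming (assumed)
sensitivities = {f: observed * f / doublings for f in (1.0, 0.5)}
for fraction, s in sensitivities.items():
    print(f"attribute {fraction:.0%} to CO2 -> ~{s:.1f} deg C per doubling")
```

Halve the attribution (say, because a natural cycle took half the credit) and the implied sensitivity halves too.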
There is also the possibility that there are more yet longer cycles not yet accounted for in the models. Good temperature records only go back 150 years, but there is considerable anecdotal evidence of an approximately 1000 year cycle that we're in the rising part of.
There is no such single thing as "considered statistically significant". It depends what your purpose is.
Among real scientists 2 sigma or 95% is considered to be the point at which you say "oh, that looks interesting, let's research it more to see if it's real" (i.e. try to add a few more sigma on to the probability).
The following article shows how physicists deal with such statistics, and don't consider something to be proven unless it reaches 5 sigma or 99.99994% probability that the observations didn't happen by chance.
That discussion is largely in the context of deciding whether or not the Higgs boson is real, a subject of absolutely no practical importance to anyone not in that academic field.
That something as serious as hurting the world economy to the tune of trillions of dollars and condemning millions of the most vulnerable people to starvation and death via raising the prices of food and energy should be decided by mere 2 sigma evidence is utterly ludicrous.
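The sigma figures convert to probabilities via the normal distribution's error function; a quick Python check (two-sided):

```python
import math

def two_sided_confidence(sigma: float) -> float:
    """Probability a normally distributed observation lies within +/- sigma."""
    return math.erf(sigma / math.sqrt(2))

print(f"2 sigma: {two_sided_confidence(2):.4%}")   # ~95.45%
print(f"5 sigma: {two_sided_confidence(5):.7%}")   # ~99.99994%
```

Five sigma two-sided works out to roughly a 1-in-1.7-million chance that the observations happened by fluke.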
I roast my own in a popcorn machine too. Green beans cost about 40% of what roasted beans cost in 200g quantity (I can't buy more otherwise it's stale before I use it), but before roasting they keep no problem for many months. It takes about ten minutes to roast a batch and it's super fresh.
Thanks for the twitter link to the research: http://t.co/HhvDJuAk
The paper's three scenarios assume an old IPCC prediction of 1.8°C to 4°C temperature increase by 2100, which was based on the sudden increase from 1980-1998 continuing. No one believes that any more.
The actual temperature increase by 2100, if any, is much more likely to be similar to the last century overall, with its rises and falls (e.g. 1940-1980), and therefore considerably better than his "best" scenario.
Aron also raises an excellent point, that wild coffee has clearly survived for many thousands of years, and quite probably across several ice ages and interglacial temperatures warmer than today.
The Ethiopian "plateau" with altitudes from 1300m - 3000m (an 11 C temperature range at 6.5 C/km atmospheric lapse rate) would seem to provide an ideal environment for the wild plants to be able to naturally migrate upwards and downwards tracking temperature changes.
I quickly found one coffee producer in Ethiopia whose web site says they are selling wild coffee, not cultivated, growing at altitudes of 1750m - 1850m, which is below the middle of the plateau altitude range (i.e. in the colder half of the survivable climatic temperature range).
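The 11 C figure above is just the lapse rate times the altitude span:

```python
lapse_per_km = 6.5           # deg C per km, standard atmospheric lapse rate
low_m, high_m = 1300, 3000   # plateau altitude range, metres
span_c = (high_m - low_m) / 1000 * lapse_per_km
print(round(span_c))         # ~11 deg C of temperature range across the plateau
```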
The Arduino documentation is actually really opaque on just what Wiring is. They try to aim things at "artists" who are scared of programming.
What Wiring turns out to be is a source code compatible subset of both Java and C++, along with a library containing classes and functions for memory management, timing, serial I/O, digital and analogue I/O, and so forth.
You can write exactly the same code and run it in either a Java or a C++ environment. This is done with the aid of a preprocessor: in the case of Java the preprocessor wraps your code (variable and function definitions) with scaffolding for a class containing everything; in the case of C++ the preprocessor adds declarations for your functions at the top of the file. In both cases the Wiring library is automatically imported.
When you are using the Arduino IDE there is in fact nothing preventing you from using full C++ rather than merely the Wiring subset. The IDE (which is written in Java) uses a standard copy of gcc (targeting AVR or ARM as appropriate) for the code generation and what runs on the Arduino is machine code.
You can easily link in your own assembly language routines if you want.
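As a rough illustration of what the C++-side preprocessor does -- prepend the library header plus a prototype per function so definition order doesn't matter -- here's a toy Python version (the real Arduino builder parses properly; this regex is only good enough for simple sketches):

```python
import re

# Toy model of the Wiring/Arduino C++ preprocessing step: collect every
# function definition and emit a prototype for it at the top of the file.
FUNC_RE = re.compile(r'^\s*(\w+)\s+(\w+)\s*\(([^)]*)\)\s*\{', re.MULTILINE)

def preprocess(sketch: str) -> str:
    protos = [f"{ret} {name}({args});"
              for ret, name, args in FUNC_RE.findall(sketch)]
    return "#include <Arduino.h>\n" + "\n".join(protos) + "\n\n" + sketch

sketch = """\
void loop() { blink(); }
void blink() { }
void setup() { }
"""
print(preprocess(sketch))   # blink() is prototyped before loop() uses it
```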
I'll bet this is happening now because SO MANY people have installed the beta of iOS6 (possibly as many as have ICS on Android!), and many are then complaining about how bad the data is in the new (non Google) Maps application.
The beta is NOT there for random people to use in real life — it's there for developers to test their own apps to make sure they will be compatible with the new OS, and to start to use the new APIs. It's perfectly reasonable for the maps data to be strictly sample data at this stage.
If it's still bad when iOS6 is released to the general public then that's the time to complain, not when you jump through hoops to install an early access version meant only for developers.
I was asleep and ordered my Pi about two hours after orders went live on February 29.
On April 27 I got an email from Farnell (nz.element14.com, actually), saying "Having successfully passed its CE compliance testing, we can now confirm that your Raspberry Pi will be dispatched week commencing 28-May-12."
I haven't heard anything more since, so hopefully that's still true.
While the price is similar to Arduinos intended for general prototyping (e.g. the Uno), they're not really the same thing at all.
On the one side, Arduinos go down to much lower prices, with a bare chip with Arduino boot loader available for about 5 EUR. And Arduinos are suitable for hard real-time programming as they don't run an operating system at all — you're programming the bare metal.
On the other side, the Pi has at least 100 times more CPU power and 100 thousand times as much RAM. And while the Pi has a few pins of digital I/O, it has no analogue I/O at all.
The Pi is much more comparable to something like an old 700 MHz - 1 GHz Pentium III box that you can pick up used for about the same price (I have half a dozen of them in use as routers and so forth so I know). The Pi is much physically smaller and uses less power. But it's lacking any ability to add multiple ethernet cards or the like.
I'm an instructor at a gliding club in New Zealand. We do about 50 credit card transactions a month for trial flights worth on average around £100 each. In the last five years we've gotten internet (and electricity!) in our clubhouse, but at this transaction level online authorization machine rental plus transaction fees would cost us far more than the fees for our old zip-zap machine. We get a considerable reduction in fees by phoning in each transaction to get an authorisation number, so it is pretty much as safe for the card issuer as an electronic transaction would be.
The situation may well be different for organisations with a lower average transaction value.
Can't argue with the result, but that seems like a crazy flight path to use to get maximum distance! That thing very nearly didn't pull out of the dive.
Dick Smith NZ just in the second half of 2011 stopped stocking components. I'm actually surprised they lasted that long. I remember that even in the 90's Radio Shack in the USA didn't have components any more. That Dick Smith still did surely means it was a personal, non-economic decision by someone in upper management.
I'm going to miss it, but it's a bit rich for Mr Dick Smith to complain about management decisions 30 years after he sold the business.
Is the writer of the article really not familiar with Steve's history of previous rants? I used to read them religiously, back in his Amazon days.
Stevey's Drunken Blog Rants™
This should be an absolutely painless transition for almost everyone.
Virtually all Arduino programming is done in C++.
They call it "Wiring" but that's just:
- a standard library that provides interface functions and symbolic constants for the I/O pins, timers, USB etc
- a preprocessor that adds a #include for the library header and prototypes for all your functions so you don't have to care about the order
- a driver main() that calls setup() and then repeatedly calls loop()
- the avr-gcc c++ compiler
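The driver in the third bullet is just this shape (sketched here in Python with an iteration cap so it terminates; the real main() loops forever):

```python
def run(setup, loop, iterations=3):
    """Arduino driver shape: setup() once, then loop() over and over.
    'iterations' stands in for the real firmware's infinite while-loop."""
    setup()
    for _ in range(iterations):
        loop()

calls = []
run(lambda: calls.append("setup"), lambda: calls.append("loop"))
print(calls)  # ['setup', 'loop', 'loop', 'loop']
```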
You already have to select which board you are compiling for from a menu. Different boards use different AVR chips with varying clock speeds, varying amounts of flash, RAM and EEPROM, and different constants for the I/O devices. The ARM chip just has more and faster of everything.
There will be an "Arduino Due" item in the "Board" menu which as well as the normal selection of the appropriate library header file will select the arm-gcc compiler instead of the avr-gcc compiler.
No ordinary end-users are likely to have written anything in AVR assembler. 99.9% of everything will Just Work.
@Ojustaboo, you're 90% right. The only thing I take issue with is that while Apple is making big profits at that price (and could lower it if they needed to), the other manufacturers are barely breaking even, if that.
The reason is that Apple is buying – hell, paying in advance, or even giving suppliers the cash to build the factory with – components and assembly services in such volume that they get much better prices than anyone else.
Here in New Zealand I've had such a smart meter for 2 1/2 years. At present it's only reporting one reading a day, at 4 PM, via Vodafone. I don't know whether it's 2G or 3G but I do know it's got a microsim like an iPhone 4.
It's really quite useful to be able to go to the web site and see what usage has been:
The main reason, for me, for getting the smart meter was so that I could get access to cheaper night rates for electricity. If you have separate metering in the day and at night (11 pm - 7 am in this case) then the day rate is the same as for people with a single meter, and the night rate is considerably cheaper (about 25% - 35%, depending on time of year).
I've been able to make some changes that have meant this winter I've used around 45% night and 55% day, while in the autumn (and it should be now again in spring) it was 60% night and 40% day.
That's a considerable saving.
Man who sells middleware for Android predicts Android will win.
While 50% of Mac buyers have never bought a Mac before, it must surely be more like 80% - 90% of iPhone and iPad buyers who have never bought a Mac.
Quite a lot of them will have owned an iPod but as cumulative iOS sales surpass cumulative (non touch) iPod sales that is changing too.
One on't cross beams gone owt askew on treadle?
Something that hasn't been mentioned is that both the 68000 family (from the 68010 onward) and the PowerPC family supported full hardware-assisted virtualization without the kind of partial-emulation hacks needed in VMware and without the active cooperation of the guest operating system as required by the likes of Xen.