Re: <Evil Grin>
been through that. Actually, I quit coffee cold turkey (for one day only!) on top of the jet lag, and the relief on the second day is good enough to kill the jet lag much faster than it would normally pass.
2538 posts • joined 6 Sep 2007
These have been appearing since the days of MS-DOS, each one selling the same, by now quite stale, snake oil. Well, I guess there must be some use for it, otherwise the idea would have died out already. Or is it the combination of uneducated investors and slick marketing, up to the same old trick?
Another good point. A well-presented ML model will typically return a prediction which is interpreted as a probability value. If the value is exactly 0 or 1, then either the model is broken or rounding error got in the way. Typically, for a high-quality model where the match is near perfect, the value might be up to 0.97, perhaps 0.98. It is up to humans to actually read this value and think "hmm, this could be something else with 3% probability".
But then, humans just love survivorship bias: if the ML was right more than 10 times in a row, we stop paying attention and replace thinking with generalisation. It is a good thing that some researchers are actually pushing the probability value closer to 1, but we also need to keep paying attention, because it will never be exactly 1 and yet it will frequently be interpreted as such.
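A minimal sketch of why an exact 0 or 1 signals trouble rather than certainty, assuming a plain logistic output (the numbers are illustrative, not from any particular model):

```python
import math

def sigmoid(logit):
    """Map a raw model score (logit) to a probability in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-logit))

# For any finite score the probability is strictly between 0 and 1:
p = sigmoid(3.5)
print(round(p, 2))  # 0.97 -- confident, but never exactly 1

# Double-precision rounding, however, saturates for large scores,
# producing an exact 1.0 that the mathematics never claims:
print(sigmoid(40.0) == 1.0)  # True -- rounding error, not certainty
```

So a literal 1.0 in the output tells you about the floating-point representation, not about the world.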
I think the premise above is false. Anyone who has worked on this for a little while will understand pretty well why machine learning models are so brittle. They are nothing but heuristics trained to recognise a particular correlation. The meaning of "heuristic" is pretty clear. The correlation is also a hint, although a more subtle one: the fact that something "looks like" a bottle only means that there is a strong correlation between a particular set of pixels and a "bottle" label, nothing else (in particular, it does not mean that the pixels actually show a bottle).
How was that correlation arrived at? By some heuristics. How do those heuristics actually work? Whoa, back off, we just threw lots of data at it and some of it stuck. How did we arrive at a situation where ML is treated as an oracle? A friend pointed me at this recently: http://backreaction.blogspot.com/2019/03/merchants-of-hype.html
Actually, we are looking at the collision of two asteroids, and the bits which coalesce are the larger asteroid pulling itself back together (literally, by gravitational force) after the collision. Which means that if we tried to break apart a large asteroid on a collision course with Earth, the same thing would likely happen to that asteroid, given sufficient time.
@Ian Michael Gumby
Either you do not understand how the law works in large parts of the world (including the US), or I somehow missed that this decision would not set a precedent. If it is the latter, please kindly point me to the source.
If it does set a precedent, the decision basically means that whoever first grabs copyright on a function name and signature will be able to prevent everyone else from using the same name and signature in their own code. Which is very bad news for developers, because there are only so many sensible names and signatures (as well as code idioms and design patterns) that we can use to solve a common recurring problem. It also makes it very difficult for developers to change jobs or contribute to open source projects, because anything we have done in the past becomes a huge legal liability.
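To illustrate with a hypothetical utility (the `clamp` function here is my invented example, not from any real case): two developers working independently will almost inevitably converge on the same obvious name and signature, even when the implementations differ.

```python
# "Codebase A": one obvious way to write it.
def clamp_a(value, low, high):
    return max(low, min(value, high))

# "Codebase B": the same name and signature would appear in its own
# module, written independently, with a different implementation.
def clamp_b(value, low, high):
    if value < low:
        return low
    if value > high:
        return high
    return value

# Identical observable behaviour; neither copied the other.
assert clamp_a(5, 0, 3) == clamp_b(5, 0, 3) == 3
```

If the first of these could lock up the name and signature under copyright, the second, written in good faith, would infringe.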
I receive those every day. Actually, a few every day, if I bother to look into the spam folder; and if I don't, they help my email host improve its spam filters.
Not posting as an anon, because there is nothing to be ashamed of: the "facts stated" are obviously entirely made up. Sometimes I am tempted to try to hack these back (tracing bitcoin wallet activity is plausible, and there might also be interesting traces in the email headers) but I can't be bothered. Someone will, eventually.
Cool story. Reminds me of writing Motif code under Windows 3.1 because that is what Bentley Systems' MicroStation used. I learned a thing or two back then, and never got to use that particular skill again, because it was decided we should use the new-fangled MFC from Microsoft for everything else. Which nearly turned me away from programming, but that's another story.
I think this is not something Bezos can just "let go". Not without firing all the lawyers in the IP department and risking the wrath of the SEC. The shareholders need Amazon to be a protected trademark, and a necessary component of that is registering all the relevant DNS names. The new TLDs are just a money grab by ICANN, because they knew exactly that this would happen, for the reasons set out above.
The attraction is the lower running cost, if you know how to manage it. Here is an example: I run a few websites (as a hobby, so there is only a "cost" side and exactly zero revenue, not even ads). These websites are not on my home internet connection, since that would subject them to the vagaries of Openreach infrastructure (even AA ISP cannot guarantee 100% connection uptime, especially if there is no redundancy). So instead I pay for a virtual machine in some datacentre, where these websites are hosted, and that is one attraction of the cloud.

The other is that I would not actually have to pay for a whole virtual machine if I spent a little time re-architecting these websites: I could run them in containers, which would be cheaper. Yes, it would be nice if I could ship my own machine to some datacentre and have someone keep it up and connected, but that would cost more than renting a small(ish) VM or running a container.

Then there is the matter of off-site backups. We all need them, right? But where do we keep them? The cloud is an ideal solution: just rent some disk space somewhere and sync your daily backups to it. Granted, that is "storage" rather than "computing", yet the underlying incentive is the same: access to another datacentre, for stuff which is too expensive to maintain properly on your own.
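As a back-of-the-envelope comparison (every price below is invented for illustration; real prices vary wildly by provider):

```python
# Hypothetical monthly prices; the shape of the comparison is the
# point, not the numbers.
vm_monthly        = 10.0   # small(ish) rented VM in a datacentre
container_monthly =  4.0   # same sites re-architected onto containers
colo_monthly      = 60.0   # shipping your own machine: space, power, hands
backup_gb_monthly = 0.02   # rented disk space for off-site backups, per GB

def yearly(monthly):
    """Annual cost from a flat monthly price."""
    return 12 * monthly

# Containers < VM < colocation, which is the whole attraction:
assert yearly(container_monthly) < yearly(vm_monthly) < yearly(colo_monthly)

# 50 GB of off-site backup storage is pocket change next to any of them:
print(yearly(50 * backup_gb_monthly))  # 12.0 per year
```

The ordering, not the exact figures, is what drives people into someone else's datacentre.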
Icon appropriate for paying for this cloud stuff every month, because it *is* cheaper than going without off-site backups and running websites on a home internet connection rather than in some datacentre.
It has two benefits:
1) no silly indentation rules
2) static type safety
Now, Go for data science ... I am afraid it currently does not have nearly as many useful libraries as Python does, and you are unlikely to be able to use it in a Jupyter notebook. So, perhaps stick with Python (but learn Go anyway).
A card where you can't decide not to pay off the full balance at the end of the month is what we call a "charge card" over here; one example is American Express. The card issuer's profit obviously comes not from interest but from the annual fee paid by the client.
Where the updates are initiated by IT because they are needed to patch some risk, or to move off some component that has reached maintenance EOL: if you can't get agreement, then go to the top team yourself, point out the risk, and make clear that you can't accept responsibility for any consequences of postponement.
... and when you do so, do make sure to include a printout of the Equifax hack postmortem. Not a link: a hard copy, so they have no excuse for not knowing the dangers of delayed patching.
"Many applications don't use multiple threads very heavily." Yes, because you need either 1) an embarrassingly parallelisable algorithm (fitting within existing imperative programming paradigms) or 2) a new programming paradigm which limits data sharing between threads. Without either of these, the horizontal scalability of your application is severely limited by Amdahl's law. New languages like Go or Elixir, or frameworks like Akka, go some way towards 2), but few programmers can be bothered.
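Amdahl's law makes that ceiling concrete. A minimal sketch (the 95% parallel fraction is an arbitrary example, not a measurement):

```python
def amdahl_speedup(parallel_fraction, n_cores):
    """Upper bound on speedup when only part of the work parallelises."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_cores)

# Even with 95% of the work parallelisable, 64 cores buy about 15x:
print(round(amdahl_speedup(0.95, 64), 1))  # 15.4

# And no number of cores can beat 1 / serial_fraction = 20x:
print(round(amdahl_speedup(0.95, 10**6), 1))  # 20.0
```

Which is why shaving the serial fraction (by limiting shared state) matters more than throwing cores at the problem.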
Are you familiar with the term "tech-illiterate"? That is what most directors at established banks are, and they are the only people with the authority to make architectural decisions. That might not apply to upstart banks like Monzo or Starling, but I have yet to learn more about how they work.
One of my favourite charities "Water Aid" also takes a keen interest in sanitation. Having learned a little about how the world works outside of my immediate surroundings, I can understand why.
Biting the hand that feeds IT © 1998–2019