Re: Comcast sucks schweaty monkey bung.
If you're in the Bay Area, you should check out Sonic. Yes, it's DSL, but it ended up being faster and cheaper than Comcast for me.
OK, I get it: capitalism is a screwed up, miserable system, and by any measure the cash Gates accrued off the labor of others and abuse of market position is absurd and inequitable. But while Gates certainly benefited from the system, he didn't create it, and his story isn't much different from that of other billionaires. I think it's worth considering what he's doing given where he stands, and comparing that to others in the same position.
Lots of wealthy people donate to charities of various sorts, often as a tax dodge, or as a means to further some political ambition, or for good publicity (usually something that resonates well with a first-world audience like breast cancer awareness).
What I think sets Gates apart is that he's spent significant effort focusing on a few areas that will have a major impact on the quality of life and life expectancy of the vast majority of the world's population. Sanitation and malaria research are not glamorous, and as we see here, efforts to improve sanitation in remote locations in particular are more likely to draw sniggers and childish remarks about drinking shit than adulation. I can't say that in his position I'd be doing anything nearly as worthwhile or with nearly the same persistence. For that at least, I think Gates deserves some credit.
Stop, just stop. This is factually incorrect. Human visual dynamic range is about 20 stops accounting for rapid adaptation but not long adaptation (this is what you can see looking at a single scene). Long adaptation, including the constriction and dilation of the pupils, extends us to about 30 stops.
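For anyone wanting to sanity-check those numbers: a stop is a doubling of luminance, so converting a stop count to a contrast ratio is simple arithmetic (a quick sketch of mine, not from any reference):

```python
# Each photographic "stop" is a doubling of luminance, so n stops
# corresponds to a 2**n : 1 contrast ratio.
def stops_to_contrast(stops: int) -> int:
    return 2 ** stops

print(stops_to_contrast(20))  # 1048576 -- roughly a million to one
print(stops_to_contrast(30))  # 1073741824 -- roughly a billion to one
```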
I'm sure there's some advanced business thingy-ma-bobber here that I'm just not following, but when I hear brand value, I think wanting to buy a product just because of the brand.
It would seem to me that Coke, McDonald's and Marlboro have most of their 100BN+ market caps tied up in their brand. The product itself has little else to distinguish it. Google by contrast relies much more on the competitive merits of its technology than on brand. And, I can't remember the last time I heard someone want to buy something because Microsoft made the product (outside of ecosystem lock-in which is different).
What's the deal?
Full-page ad in today's Wall Street Journal: some German SSL vendor taking the opportunity to big-up themselves at the expense of the OpenSSL team and open source in general:
I'm no raving open source lunatic, but I must say that OpenSSL has made major contributions to the security of the internet by making it easy and cheap to provide encrypted web services. One major vulnerability in nearly 20 years doesn't change that.
Open source projects including the Linux kernel, GCC, OpenSSH, R, Apache, Perl, and SQLite are pillars without which our current mad technological rush wouldn't be possible. The developers, both professional and volunteer, deserve credit for making such important and useful software, even the spotty seventeen-year-old contributors this guy seems so obsessed with.
Do I really believe the "more eyes make better software" line? No, but having been developing "professional, commercial products" for a while now, I certainly don't believe that open source is fundamentally disadvantaged on quality relative to code that's frequently developed under extreme time and financial pressure.
Hats off to the open source community for the good work they do and its wide benefits. Using Heartbleed to tar everyone involved, or even just the OpenSSL project itself, in order to push commercial software is low and cowardly.
In college, one of my friends focused on embedded and fault tolerant systems. In one of the graduate level classes he took, a student wrote up a paper walking through the math behind Lamport's Byzantine Generals paper. Apparently, the original is notoriously challenging, and the student's digest was sufficiently helpful to warrant publication on its own merits.
Have also heard that the support in LaTeX for the integral symbol was its own PhD for some student. Not sure whether that's apocryphal, but it's certainly a great story.
Surprised it took this long for him to get the Turing Award-- well deserved.
At $40k per patent, they're not exactly making a ton of profit over the costs of filing, compensation for their employees, legal fees, etc. Now Twitter has a horde of 1000 patents they can defend themselves with. And volume matters when people start threatening litigation-- it's easy to have a team of lawyers pound out defenses against one or two patents, much harder to conclusively fight 1000. Regardless of the patent system being completely screwed up, it's with us, and it seems to me like Twitter probably came out well ahead on the deal.
A lot of the pain of running an internet service isn't the coding per se (bunch of scripts, some SQL, some PHP, whatever floats your boat); it's the configuration, installation, maintenance, backups, etc. that are needed to keep the backend up and running. A while back I worked at the world's tiniest telco (a video VoIP service) as one of three backend developers. We had about 12 IT people to sustain operations, maybe 5 of whom were fairly capable network engineers. If we'd been an earlier-stage startup, having PaaS, DBaaS, etc. would have been great options.
I don't think you understand capitalism-- to be a capitalist, first and foremost, you need to have capital. A union doesn't have capital; it's a body for organized labor. And yes, unions raise wages, just as bosses try to keep them down. You don't need to refer to anyone as recent as Marx; Adam Smith had something to say about this, in fact, to the effect that it was no coincidence that owners everywhere have always tried to break up unions while getting their own agreements in place to keep down labor prices.
If I'm understanding correctly, you've got fractional currency pricing and potentially units, and the tally isn't matching exactly what you calculate by hand.
To handle these issues on a large scale, some finance and accounting packages use (and may be required to use) decimal floating point libraries. I don't know enough about your situation to say if this could be a factor or not.
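To illustrate the kind of mismatch involved (a minimal sketch with made-up prices, not your actual data), Python's standard `decimal` module shows how binary floats drift where base-10 arithmetic doesn't:

```python
from decimal import Decimal

prices = ["0.10", "0.10", "0.10"]

# Binary floats can't represent 0.10 exactly, so repeated sums drift.
float_total = sum(float(p) for p in prices)
print(float_total)            # 0.30000000000000004

# Decimal keeps exact base-10 arithmetic, as accounting packages require.
decimal_total = sum(Decimal(p) for p in prices)
print(decimal_total)          # 0.30
```

If your hand tally is computed in decimal (pencil and paper always is) and the software tallies in binary floats, you'd see exactly this kind of off-by-a-fraction-of-a-cent discrepancy.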
By the end of the first page I was shocked that this might finally be an Orlowski article I fully agreed with. Of course, the usual libertarian nonsense got trotted out at the end as the solution and normal service was resumed.
Besides the fact that copyright protection (at least for major corporations) is in a stronger position than it has ever been, there's nothing that prevents Google from requiring me to "grant them a non-revocable interest in my data privacy". Same crap as now. As Pete 2 and Ian Michael Gumby above point out, individual rights are only as strong as the ability of the individual to fight for them.
Seems like the pattern that has worked in the past to rein in bad practices from large industries, be they food manufacturing, alcohol, medicine, finance, or housing, has been to create specialized regulatory bodies with regular audits. It isn't perfect, it's expensive, and justice can be slow. On the other hand, I can drink a carton of milk without testing it to see if it's been "supplemented" with melamine. I can open an account with any FDIC-listed bank and not need to worry about a run in the next financial crisis. I can buy a bottle of hooch and not only know that it isn't going to make me blind and crazy, but I can trust the ABV numbers to keep my consumption sane.
A data-privacy-ensuring agency would certainly be a challenge to create, as the technologies, policy issues, and business processes are all quite complicated, but I don't think there's a simpler solution. Having everyone go through the equivalent of a witness protection scheme every few years, as suggested previously, would be harder still, prohibitively expensive, and would likely be more disruptive and unpleasant to the people being relocated than having Google and Facebook pimp their data.
I was under the impression that the majority of clients of the original silk road were middle aged folks who felt they'd reached a point in their lives where they shouldn't have to deal with shady characters in dark alleys and were willing to pay a premium for a low-effort high quality service.
Grownups these days. . .
Stopped using Firefox when they went all awesome-bar on me.
Seems like all browsers suck now: they want to send your URL entry to Google as a search while you type, or think they're smart and automagically bung in a www or .com when you're travelling to an internal host. I just want a browser that looks like Firefox circa 2005, but has decent privacy/security controls and doesn't leak memory like a sieve. Is that so much to ask?
Not sure if it's common in the EU, but here in the states several of the headache meds (I think Excedrin in particular) have a goodly dose of caffeine for exactly that reason. Of course there's the negative side-effect of getting people who wouldn't otherwise consume caffeine addicted, but that just helps business, right?
What the heck does the twitterverse have to do with making a great router? I'm on the dividing line between Gen X and Gen Y, and I think most of my older peers use Facebook and Twitter more than I do (approximately not at all). Seeing a drab-gray company trying to get hip with the kids gives me the creeps. It's OK to be drab and gray and make high-end enterprise gear that you sell for a juicy margin.
Global interconnects take multiple clock cycles due to propagation delay. My vague recollection is that the effective propagation speed is ~0.3c (limited by the RC characteristics of the wires). If the conversion to and from photons is fast enough, this kind of thing could unblock one of the major limits on processor scaling, allowing bigger/badder cores with better single-thread performance.
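A back-of-envelope check (my numbers for die size and clock speed, not from the article) of why crossing a die eats on the order of a cycle even at that optimistic figure:

```python
# Rough arithmetic: time for a signal to cross a large die at ~0.3c,
# the RC-limited speed recalled above. Die width and clock are assumptions.
C = 3e8                       # speed of light, m/s
wire_speed = 0.3 * C          # ~0.3c effective propagation speed
die_width = 0.025             # assume a 25 mm die edge
clock_hz = 3e9                # assume a 3 GHz clock

crossing_ns = die_width / wire_speed * 1e9
cycle_ns = 1e9 / clock_hz
print(f"crossing: {crossing_ns:.2f} ns, cycle: {cycle_ns:.2f} ns")
# Even at this optimistic speed the crossing consumes most of a cycle;
# real repeatered wires are slower still, hence multi-cycle interconnects.
```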
Anyway, always cool to see this kind of fundamental work going on.
Surely it is an enormous, prehistoric shark caught by a chilly glacial current during its struggles with a terrifyingly tentacled cephalopod back in the Jurassic. Global warming has weakened the ice just enough that their battle can resume. That's right: a mega-shark vs. a giant octopus. Scientists would investigate further, but they're afraid there might be snakes on the plane.
Jake, I've got mixed feelings about the badges myself, but the thumbs do serve a valuable purpose in my opinion, and having both directions is an important part of it. In real life, if I say something, people may not respond with a considered reply, but simply smile or laugh, or instead frown or storm out of the room. The thumbs provide a similarly lightweight means of response. I'm sure I've occasionally made an inflammatory post-- the forums would be pretty crummy to use if everyone who felt one way or the other about it had to post "me too" (see some of the old forums with Eadon as an example of how this can go wrong).
It strikes me that your personal grudge against the thumbs might be that you receive consistent negative feedback through the medium. Possibly you don't get this in real life-- judging by your posting history, you're a person with a good deal of authority, and people may be fearful to tell you when you're being an ass. Maybe instead of railing against a feature of a website that has no real impact on you, it would be worth considering why you draw consistently negative reactions, when I'd be willing to bet that the vast majority of thumbs on the site are up rather than down.
I don't think you're a troll, and I do think you frequently make good contributions to the forums, but the constant one-upmanship-- the boasting about how amazing your life in Sonoma is, how you make everything yourself, and your contempt for people who don't live as well or can't do as many things as you-- is tiresome and childish.
If there wasn't a typo in the article, these guys are doing 50% against the spread-- that sounds like they could just be getting lucky, since the casinos do a damn fine job of setting the spread to make those bets pan out nearly 50/50. I'm not getting the sense these are Nate Silvers here-- what am I missing?
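For what it's worth, the "just luck" hypothesis is easy to quantify with a binomial tail (the 55-of-100 record below is my hypothetical example, not a figure from the article):

```python
# How likely is a record of >= `wins` out of `picks` if every bet
# against the spread is really a fair coin flip?
from math import comb

def p_at_least(wins: int, picks: int, p: float = 0.5) -> float:
    """Probability of at least `wins` successes in `picks` p-biased bets."""
    return sum(comb(picks, k) * p**k * (1 - p)**(picks - k)
               for k in range(wins, picks + 1))

# Hypothetical record: 55 winners from 100 picks against the spread.
print(f"{p_at_least(55, 100):.3f}")  # ~0.18 -- easily explained by luck
```

Anything around 50% over a sample of a few hundred picks is well within what a coin would produce, which is the point: the spread is set precisely so that this is the expected outcome.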
one silly, one serious:
"Bathed in his currents of liquid helium, self-contained, immobile, vastly well informed by every mechanical sense: Shalmaneser. Every now and again there passes through his circuits a pulse which carries the cybernetic equivalent of the phrase, 'Christ, what an imagination I’ve got.'" (John Brunner, Stand on Zanzibar)
Seriously though, this seems pretty cool. I could see this being akin to databases enabling hordes of engineers and *gasp* business people who weren't experts in efficient storage and retrieval, redundancy, coherence modeling, multithreading, etc. to take advantage of concerted effort by a small few who are. Machine learning has way more cool applications than the people who understand it deeply enough to implement could possibly support piecemeal, and they are unlikely to understand all the subdomains in great enough detail to be the right people to guide that work. I'll be very interested to see how this develops.
True, but in Google, I think Microsoft have an adversary with equally bountiful pools of cash and engineers. Committing to burning money and playing creeping death mainly works when you're bigger than anyone in the market you're trying to enter. I just can't see MS enticing much of anyone to their platform.
The el-reg diagram is brilliant. Thanks Richard!
And the bathroom reservation thing is actually not a bad dang idea. Would have liked something like that today-- had to try four floors across two buildings to find one that wasn't being cleaned or occupied. I'm thinking we could integrate the stall door locks with a simple sensor, add a web reservation form and a simple badge scanner, and have reservations time out after a few minutes. Then management could track people spending too much time on the throne. . . hmmm, lots of great applications here :-)
I want some of whatever you're smoking. Yes, this is a cool technology, but it's not a lot different from what's done in the mobile space (where both NVIDIA and Intel play). Even when AMD clearly had the better parts-- first to 1 GHz, first to 64-bit, crushing Intel on all the benchmarks-- they didn't manage to translate that into market leadership. Now, a lot of that is because Intel are cunning bastards with an enormous amount of business inertia making their position hard to attack, but that hasn't changed any in the last 10 years.
Much as we all like to make fun of senior management, this covers only those executives that had malware-- it's an a posteriori, not an a priori, distribution. We have no idea from this information what fraction of executives surf porn from their work computers, or whether it's higher or lower than for non-executives. Concluding that executives are "complete berks" isn't supported by the evidence.
Seems like the right thread to complain about an ad for some online game called "heroes" that keeps showing up on my work computer in IE, both in the gutter on the main page and inline in some of the articles. It shows three scantily clad female characters-- definitely not appropriate for me to have up in the office. It's hard enough finding women applicants for programming jobs; I don't want to be contributing to the problem.
This sounds like it's related to the utterly mind-bogglingly "brilliant" dark silicon work. Good on the professor for getting himself a ton of publicity (and presumably funding), but it seems pretty clear he has never worked in, say, imaging, where tons of effort goes into killing bad pixels because people are actually very good at spotting them and find them offensive.
My guess is that they're assuming you take your existing test machines, characterize hard and transient failures per die, look at under-voltage characteristics either in simulation or in final silicon, and use that information to optimize power and die yield. Now, how putting a dot is sufficiently informative to distinguish between "I can tolerate 3 LSB of error in this calculation" and "the result of this calculation can be completely bogus (e.g. NaN, INF, -INF, 42, . . .) 97% of the time", I have no clue. If you knew that you had an adder that was, say, stuck at zero in the LSBs, you could guarantee the first; characterizing delay and running at an appropriately aggressive lower voltage could be done to meet the second constraint.
At any rate, this all seems like a useless crock. Another day passes and I remain grateful that I dropped out of academia.
Good points. Ceramic (as used in the sharpener you suggest) is much less likely to screw up a blade than carbide. Personally not a fan of that style of sharpener, but maybe that's because I've seen too many people mess up nice cutlery.
This is what I normally use-- takes and holds a great edge, unquestionably dishwasher friendly, and not at all expensive.
I have used Wusthof and Sabatier knives in other people's kitchens, but they've always been disappointingly dull in comparison to my $20 knife. Which goes to the general point of this discussion that you can have the most expensive and best equipment, but if you don't treat it right it won't perform.
You're getting some potentially really bad advice above.
If your knives are stainless steel, dishwasher use may be OK-- in fact, many restaurant-grade knives are designed for it. However, some of the best blades are high-carbon steel (which takes a much sharper edge). You don't want any water sitting on carbon steel (it rusts), and it tends to be more sensitive to chemicals.
Also, I'd recommend being careful about sharpening. You don't actually want to do this often; you want to steel your blade at every use and sharpen only as needed. Sharpening is a really good way to permanently destroy a blade if you don't know what you're doing, and there are lots of cheap and not-so-cheap sets that will help you with that. In particular, most of the sharpeners with two carbide bits (sometimes disks, sometimes rods) that you pull your blade through, and the electric sharpeners, are recipes for destroying your edge. I've successfully used and would recommend the Lansky (YouTube demo here), but in general you shouldn't need to sharpen a lot, and if you're about to buy a high-end knife, it should come from the factory with a pretty darn good edge. Read the manual for any sharpener you get carefully before you use it. Or visit your local butcher and find out who they get to sharpen their knives. The key, though, is a nice, long steel (longer is easier to use and less work) with consistent, low-pressure strokes.
If you can make your problem look like a graph, it's pretty much assured that there's either a known polynomial-time solution or it's NP-hard. The good thing about the NP-hard kind is that there's no tractable "right" answer, so you can take your pick from a variety of reasonably good heuristics.
Fun example: a while ago I was considering an AI for a turn-based game where monster hordes would try to maneuver through the dungeon to cut off the free space available to the player, with the hope of eventually surrounding him in an indefensible area. So how do the monsters move to optimally cut down the fraction of the dungeon that can be reached?
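The "fraction of the dungeon that can be reached" metric is just a BFS flood fill. A toy sketch (the grid, function names, and cell symbols are all mine) of the quantity the monsters would try to minimize:

```python
# Count floor cells reachable from the player's position with BFS.
# '.' is open floor, '#' is a wall (or a monster blocking a square).
from collections import deque

def reachable(grid, start):
    """Return the number of '.' cells reachable from start."""
    rows, cols = len(grid), len(grid[0])
    seen, queue = {start}, deque([start])
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == '.' and (nr, nc) not in seen):
                seen.add((nr, nc))
                queue.append((nr, nc))
    return len(seen)

dungeon = ["....",
           ".##.",
           "...."]
print(reachable(dungeon, (0, 0)))  # 10 -- every floor cell is connected
```

A simple greedy monster AI could then try each legal move, re-run `reachable` from the player's position with the monster's new square treated as a wall, and pick the move that shrinks the count most; that's one of those "reasonably good heuristics" rather than an optimal cut.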
In professional work, I've generally tried to find approaches that avoid getting into NP-hard nasties. If you work with real-time systems, simplifying the problem statement so that something in linear or log time works instead is a good move.
Overall, I agree with the second poster-- even those engineers who do work with algorithms spend the majority of their time on unglamorous tasks: plumbing, buffer management, refactoring, validation/verification, etc. A well-written kernel is generally small and tractable; a production system typically is not. I think this may have something to do with why algorithmics gets a lot more attention in our schools than software engineering. Frankly, I'm not convinced you can teach software engineering in an environment where everyone gets the exact same problem and is assessed objectively on individual performance. Building a big system is ultimately a group endeavor and possibly beyond the scope of a semester course.
I think the problem MS seem to have in the mobile space is the same issue my mildly creepy uncle has. MS seem unable to understand the market's perception and expectation of them and continue trying to be "down with the kids" (the ads for Kin are a great example). I'm sorry, but no one under 30 wants an MS device unless it's Xbox branded. Now, if they targeted business users that might be a different story. The world sees MS as boring, but that can also be written "a known quantity" and "trusted". MS should stop worrying about the kids, get over their midlife crisis, and embrace their status as an established member of the business community.
This just happened to me!
Except that the comment was less obviously a "to do"
There were two end cases to handle in an algorithm; he handled the second, but had the detection logic for the first as well. The comment said something like "increase gain here", followed by several lines apparently modifying some gain parameters (but not actually increasing anything).
Similar story here in the US a few years back:
Oh, and look, it was the same architect. And the guy admits to knowing that there would be problems in both cases. The arrogance and stupidity boggle the mind. "Who cares if you scorch people in Vegas?"
I don't buy the "a million times smarter" argument at all.
Almost every time a claim about a good measure of intelligence has been made, computers have eventually done a better job (with a few notable exceptions).
Computers are now better than the top humans at chess, jeopardy, chip layout, optimization and path planning, mechanical assembly, specialized vision applications (spot the tanks), library science (index the web), weather prediction, stock market prediction, and probably more I'm not thinking of.
Where they aren't yet on even footing are things like natural language parsing, artistic endeavors, and general vision applications.
Also worth noting that the human brain is about 14,000 times the volume of a single die. So a fairer comparison would be an average human against the Oak Ridge Titan. Transistors are pretty frickin' good already; it's the power, cooling, and interconnect at large scale that need work.