121 posts • joined 21 Sep 2006
Continued fact denial
Hey Orlowski! Why aren't you covering the emerging evidence that the cores of units 1, 2, and 3 melted down and burned through the pressure vessels? You are very selective about the facts you choose to report. Too selective for a journalist. Exactly selective enough for a fact denier though.
"But the stricken reactors coped well"
Recent examination revealed that the cores of units 1, 2, and 3 melted down. Unit 2 definitely melted through the steel pressure vessel, and the investigators believe the same occurred on units 1 and 3, though direct inspection is not yet possible due to high levels of radiation, water, and damage from the hydrogen explosions.
In what sense can the stricken reactors be considered to have coped well? I guess they didn't actually blow their lids off and spew burning graphite and fuel into the environment like at Chernobyl. That's a pretty low bar for comparison.
"The UK's gas-cooled reactors ... give longer timescales for remedial action. "
Although melting has ceased, normal cooling has not been restored to any of the stricken units at Fukushima #1 to date. The cooling that has been done amounts to sticking a hose into the building and pouring water over the radioactive slag at the bottom of the pressure vessels. Modern reactor designs would have to be robust indeed if only ambient air cooling were available. I hope this is the case, but I very much doubt it. In a "modern" reactor, if it became necessary to stuff a firehose into the reactor to achieve cooling, I suspect the touted safety features would look a good deal scarier.
Can't we return to reality? Fukushima was a disaster of engineering. The reactors were not designed to withstand conditions known to be possible at their site. An obsolete, dangerous design was permitted to continue operating long after the lessons of Three Mile Island should have resulted in its closure. Three uncontrolled reactors destroyed themselves in a nuclear meltdown. Millions of liters of highly contaminated water are leaking into the environment, and no assessment of health risks from this contamination is yet possible. The fact that measured, direct health effects are so far modest does not lessen the severity of what went wrong, or excuse poor design and regulatory decisions.
Whether or not you think nuclear power is cool, the only supportable path is to acknowledge that Fukushima was a disaster, so we can learn from it.
which is worse?
Which is worse, media jackals capitalizing on our fears, or fact deniers self-importantly spouting opinions as truth?
Safety systems failed because the reactors were under-designed and obsolete. Recovery procedures failed. Reactor pressure vessels, which may or may not have survived the initial quake, subsequently failed decisively from hydrogen explosions. Workers suffered prompt injuries from radiation burns during the cleanup. The ultimate toll among the workers remains uncounted. Radiation leaked into the soil and seawater. That's just so far.
An earthquake or tsunami is news on the day that it happens. But if you survive the shaking and the wave, it's over. The nuclear incident at Fukushima #1 is news because it is still happening. The extent of the radiation leakage is worse each day than it was the day before. The extent of the mechanical damage is still increasing each day, due both to discovery of previous damage and to new damage caused by fires and explosions. The number of people affected is still growing. The situation remains out of control, though a China syndrome event appears to have been averted by last-ditch efforts. We don't yet know how bad it will get, though we know it almost got really, really bad several times.
Denying that the Fukushima #1 disaster is news because it is not currently the biggest disaster is just silly. The significance of news can only be observed (let alone judged) in the rear-view mirror of history. The attempts by Messrs Orlowski and Page to label the disaster a non-event when it isn't even over are doomed to embarrass them. Not that they'll notice or care from their comfy isolation half a planet and a whole magical worldview away.
What would a serious nuclear incident look like to Mr Page?
The nuclear industry itself identifies any loss-of-coolant accident as a serious incident even if no radiation is released. But not Mr Page.
I expect the many industrial customers unable to ship parts into or obtain parts from Japan's sensitive just-in-time supply chain due to electricity disruption may cite a nuclear disaster to their shareholders. But not Mr Page.
TEPCO shareholders must already view the multi-billion-dollar total writeoff and cleanup costs for four destroyed reactors as a financial catastrophe. But not Mr Page.
I rather expect the still-displaced residents of Pripyat and the birth-defect victims of Kiev view Chernobyl as a pretty frikkin' serious disaster. But not Mr Page.
So I gotta ask, "Mr Page, what level of civilian death and economic consequence would you describe as representative of a significant nuclear disaster?" What toll separates 'engineering triumph' from gross failure in your own mind? Can a nuclear reactor fail at all?
different point of view
You could write the same article from another point of view:
1. The Fukushima facility was underdesigned, built to withstand only moderate earthquakes and without consideration of tsunamis. In the most seismically active place in the world, in fact the place that gave us the word tsunami, they built a nuke plant on the coast. What were they thinking?
2. Backup systems failed due to the unanticipated tsunami.
3. Three reactors suffered loss-of-coolant accidents, the worst category of (initial) accident.
4. Two reactors probably suffered at least partial core melting (the facts aren't in yet), which scraps the reactor.
5. Two reactors had to be flooded with seawater due to failure of the deionized water supply, which basically means these reactors are scrapped due to corrosion, instead of providing 10 more years of service.
6. The cost of dismantling three damaged reactors full of highly radioactive fuel may be substantially more than the cost of removing normal spent fuel rods and decommissioning a reactor at end of life.
This is not engineering success. It is design failure. It is engineering failure. It is financial failure. The fact that so far the radiation exposure has been minimal is good, but it hardly excuses the other failures. Nuclear plants stand out for having these expensive failure modes that are not seen with any other kind of power generation.
The world was just beginning to think they could trust nuclear power again. New reactor designs may even be safer. Now, I don't know. If no high-level radioactives are released, and if the cost of scrapping three damaged reactors is not too extreme, maybe someone will still make that risky bet. Hope they're more careful than the Americans, the Russians, or the Japanese, 'cause all those countries have seen enough failures.
"Watson is made up of 90 4-socket IBM Power 750 systems with 360 8-core POWER7 processors running at 3.55GHz with 16 GB of memory per server. The systems are connected together via 10 GbE networking"
No, not at all a supercomputer. How could even a stupid carbon-based lifeform mistake a 2880-core distributed processing system for a supercomputer? I've got three on my desk right here. By next year, there will be one in your wristwatch. Super indeed. (snort).
it's the end of the world, I say
I seem to remember Captain Picard mentioning that the economy of Earth was almost destroyed when the matter replicator was invented, until people learned not to covet "things".
Many manufacturers find replacement parts to be a very profitable business, and will want to copyright any number of wear items so they can keep the price up.
Someday society is going to have to deal with the problem that maximum profit is not in the public interest, though some degree of profit arguably is.
ruined by bad advice
Right. "Memory fences" would have been a correct answer. atomic<T> would have been too. volatile is an incorrect answer.
The first 90% was so wonderful, that it was sad to see it end that way. I suppose VS will claim that Lord Peter gave the wrong advice intentionally because the lady was a windows zealot, but the intent wasn't clear to me.
pay us like doctors
If developers are to be board-certified like doctors, and are to be liable for malpractice like doctors, then they will have to be educated like doctors (8 years of college, 4 years of supervised apprenticeship), and they will have to be paid like doctors. No outsourcing to India. No ragamuffin non-college-grads.
Raise your hand if you're naive enough to think this is ever going to happen.
When you do this thankless task, you will find that some words are in common use in parts of the US, but unusual in other parts. Plus some people have larger vocabularies than others.
The Queen's English appears to be far more profane than what you would see in most US publications. But perhaps that's just Register charm school training.
Most words you can find by googling "define: foo"
The word that gave me the most head-scratching was "quango".
clown-car drive by
North Korea's cyber-attack (assuming it is NK) is just like their nuclear weapons and their rockets. They kinda almost work, only to fizzle, fall into the sea, or mildly annoy some federal webmasters on their July 4th vacation. North Korea are like scary clowns stuffed into a car, threatening a drive-by shooting. You don't know whether to laugh or be afraid.
I think I shall continue to laugh until NK manage something more like an actual missile, or actual nuclear blast, or cyber-attack that actually takes something important off-line. If NK ever get to that level of scary, then I vote we unleash all the Republicans at once. The resulting threat-storm should surely cause even scary clowns to cry. If not, well, it's not 1953 anymore. Even the Russians and Chinese find NK a pain. The next war will be far more one-sided.
Iran is way scarier, if only because they're closer. Same problem with being clowns though.
It's embarrassing. My parents' generation had enemies (Russians, Chinese) that were actually scary. How am I going to hold up my head when old dad whines, "In my day..."
earth to Intel...come in Intel
Atom is fast enough. It is faster than machines that we thought were fast in 2001. It'll do. Intel doesn't want you to hear this secret. Intel wants to figure out how to use up the horsepower of its thousand dollar chips, so you'll have to pay $2k for a PC like you willingly did in 2001. Intel are begging software developers to think up something to do with all those cores, only the workloads are still mostly serial, so it's no-go.
Netbooks are the look-over-here-it's-not-really-the-same-only-cheaper product. Customers want a laptop. Only they want it to weigh two pounds and run 8 hours on a battery, so they can actually carry it around. Sony wants you to pay $2k for a laptop that is thin and light, and the VAIOs are awesome if money is no object. But is the VAIO actually FIVE TIMES as useful as a $400 netbook? Is a Macbook Air FOUR TIMES the computer?
People are now arguing over whether they want to pay $400 or $600 for a decently portable computer, and how much compromise they will accept to get it. But here's a secret. We already know people will buy computers with 10 inch screens. They did it in the '90s. We already know people will buy computers without optical drives built in. Like the iPhone. Like older laptops. These are not impossible tradeoffs. What people are saying they do not want is 2 hour battery life and 8 pounds of bulk to lug around, just so the already imperceptible time it takes to refresh their browser screen can get even shorter.
This sucks for Intel and for Microsoft, but they better get used to it. Average prices are going nowhere but down, even as features improve. I'm having trouble feeling sorry...
Anyone notice that Indian Outsourcers say it's *bad* for the U.S. to protect domestic jobs, but *good* for Indian Outsourcers to take them? Since trade wars serve nobody's interest, India should not have fired this first shot.
Anyone notice that it's middle-class Americans who want to keep (their) jobs in America, and the rich who mostly employ those cheap foreign nannies, groundskeepers, etc? Short-term thinking.
Keeping jobs in your country is in your country's national interest, even if it causes prices to rise (or not decline). The payroll dollars (or pounds or rupees for that matter) get spent locally, creating jobs and adding to tax coffers locally.
I guess if India wants to buy castoff Soviet-era defense hardware to spite us (that's U.S.) they should feel free to do so. But as the capitalist exploiters say, you get what you pay for. The North Korean goods, in particular, have still not been shown to work at all. Probably the Indian defense establishment is smart enough to figure this out. Can India build their own military industrial complex? Well, they've figured out how to do the bribery and corruption part, so I guess they can have a world class mess.
Pike and Orlowski get it wrong
Pike's opinion piece (no citations, etc) says that other studies have shown electric cars to be about 3x as efficient as petrol cars, but then says that power plants only deliver about 36% of the energy in their fuel to the plug, so there is approximately zero efficiency gain.
Ahem, what about the cost of mining, refining and transporting the petrol? What about power plants that do not burn fossil fuels? Pike's weak-minded analysis exploits a fluke in the numbers to stoke an opinion. There is no methodology here.
We need more efficient electric cars, AND we need more efficient power distribution, AND we need more renewable power sources. This would be true even if we could continue to externalize the cost of carbon emissions.
"Shih rejected a suggestion that the Eee brand was becoming diluted as the firm slapped its the three letters on everything..."
Wait. Is "Eee" 3 letters, or only one... or two?
I'll just go now.
usual agile nonsense
It is typical for the agilistas to rail against anyone who doesn't drink their kool-aid, and so typical for them to create a straw-man to attack.
Software architects are the guardians of the design-it-up-front process. Agilists demonize this process as the Dreaded Waterfall Process. Thus architects must be demonized as Not Listening To Customers, so that their successes can be dismissed and their failures focused upon. Only it turns out that designing things up front works too. It's especially appropriate when you are going to have only one release (think space shot), or when your only means of communicating with coders is through a legal contract (think offshoring, think government and defense industry).
Prescriptive agile methods like XP claim it's impossible to do up-front design, but that is only true if you don't know how, and don't want to learn. Users' needs don't change that much. It's only our understanding of their needs that changes. It evolves; it matures from no understanding at all to full understanding. It can mature in cycles of writing and discarding code, or by thorough analysis. This is the design-it-up-front secret that is heresy to the agilistas.
Someday we will come to the post-agile world. Process evangelists will someday lighten up and acknowledge that up-front analysis can streamline coding. They will discover that user needs can be sussed out with techniques already written down and taught, but not in common use. They will come to understand that the XP practices are not the only or necessarily even the best set of practices, even though they are vastly better than no practices. They will encourage dev teams to ask WHY they should use each tool, not just whether they should. And they will learn to collect and act on feedback from their sprints, which is the forgotten half of the reason why iterative design works.
Unfortunately, I'll be retired by then.
poll workers ROCK!
My mother-in-law used to be a poll worker in Seattle. She happened to be a Republican, in a Democrat-leaning area. This made her very valuable, because certain activities had to be supervised by workers affiliated with two different parties (to prevent hanky-panky don'tcha know). She was very proud of how hard the poll workers worked to provide a fair and honest election.
I love the grand inefficiency of elections. We can determine the outcome with a statistical sample of a few thousand eligible voters, but that's not how we do things. We build an immense machine to run flat out for a single day (yeah, I know about early voting, it's a metaphor), then tear it down. The inefficiency is as necessary as liberty, and as beautiful.
it's management, stupid
So, engineers who don't like their job and think they're smarter than their manager think they should be paid by the line, or any other semi-objective but manipulable measure of what they think they do best. Right?
It is management's job to gather together a team that works harmoniously; to work with the team to establish procedures and standards, and, if there is a developer who can't be comfortable working with the team, to make the hard decision and encourage that developer to explore the wider space of opportunities at other companies.
It is also management's responsibility to assess the performance and capability of each member of the team and compensate them in proportion to their contribution. It would be great if there was a single measure of performance, but that's a pipe dream. Design engineers are not milling machines. They do not stand alone. They do not produce at a steady rate three shifts a day. (If they did, software would always be delivered on time and on budget). Devs are imperfect machines; each one unique, with skills and deficits. Live with it.
And it's the individual contributor's job to move on if they aren't happy, rather than trying to impose their style on the whole team, or demand that there be no shared style. Managers are imperfect machines too. They focus obsessively on the social aspects of developers' interactions with each other and with management, and ignore devs' terrific skills at template metaprogramming or protocol analysis or whatever. Which is what they should do.
Integrated graphics isn't about what you want. It's about what Intel wants.
The makers still dream about the days when they got $1,000 for a chip. Well, nowadays it's $200/chip. CPUs are getting smaller and cheaper. The only way to get the price they want is stuffing more stuff on the chip. They'll integrate RAM, graphics, sound, anything.
Upgradeability sucks, but they want you to upgrade to a new PC every year or two, like with your cell phone, but more bucks. In fact, they want your PC to become a single sealed unit, just like a cell phone. It's cheaper to make, it will be lighter and thinner, and it will be oh so disposable.
For the makers, the alternative is unthinkable; chips getting cheaper every year until you pay more for the steel in the case than for the magic in the chips. PCs MUST be expensive and difficult to upgrade.
don't know how to go to mars
We don't know squat about going to Mars. We don't know how to keep astronauts from going all floppy and boneless on the 1 year trip. We don't know how to feed them and provide air for them in a feasible way. We don't know how to land a heavy craft in the Martian atmosphere. We don't know how to get enough volatile rocket fuel either delivered to Mars or made on site to get them back out of the gravity well, and we don't know how to keep them from going nuts on the trip home. Yeah, we could manage all this stuff by brute force, given an absolutely unlimited budget, but that's not what we have.
It's not like a 9 day mission to land on the moon, which we can do with spam in a can. The astronauts have to do actual repairs, actual flying, mining, farming, etc., and all by themselves. And they absolutely can't make any serious mistakes or we get stinky spam in our can.
The moonbase may help us figure out some of this stuff. Maybe. Launching the Mars rocket will no doubt happen from low Earth orbit, but that's not what the moon is for.
if only my tasks were parallel
If I could just type all the letters of this message at one time, I could exploit the parallelism in my multicore chip. The sad fact that Intel doesn't want to hear is that a great majority of tasks are serial. Sure you can embed brilliant bits of parallelism into searches and such, and a few tasks can run in parallel, like page updating and networking, but honestly, it's a serial world.
That means Intel won't be able to charge more and more each year for chips unless it finds a way to make uniprocessor performance continue to increase. Sucks for them, great for us.
best title effort ever
"bulgy-bonced battle-boffinry bureau ", best journalistic title effort to date. I may never recover from my laughing fit.
So, wonder why we're not hearing anything about the *next* O/S release from Microsoft (yes, this is irony). We are all supposed to forget that what was going to make Longhorn special was big improvements like the filesystem-is-a-database. Did that have to be abandoned altogether (like because it was too slow), or is it still cooking away unnoticed in some lab waiting to cruelly unhorse Linux in the big O/S joust?
why manned missions are important
The best science we can do with a camera is to say, "See, it disappeared, so it must be ice." If we had a guy standing there, he could just reach down, pluck up just the white bit, and drop it in the analytical oven. Case closed in about 15 minutes.
Only we need to know the ice is there first, because we want to use it to make rocket fuel for the return trip. No ice, no return. Oops. There won't be any fake science on this one.
It can be simultaneously true that the world is warming, and that the world is in a 20 year cooling trend. There are several cyclic temperature fluctuations that have to be factored into analysis of the long-term trends. Scientists know this. The global warming deniers hope you don't.
Raw measurements often have to be corrected for measurement biases, like local conditions (is the thermometer in a city, which creates a heat island around itself), or intermediate-term trends (see above). Fiddling the raw data is a part of both good science and bad. And the really tricky part is that what you fiddle depends on what you expect. If a scientist believes warming is occurring, they're going to look for biases that make the real temperature higher than the measured temperature. If the underlying assumption is valid, then the fiddle gives a better, more predictive result. If the underlying assumption is wrong, or the scientist has a political agenda, then the fiddle mucks things up. Both pro- and anti- warming scientists do this fiddling. Then they call each other names for it. The point of peer review is to vet the fiddling.
Of course scientists want to get funding. They say "OMG it's warming!" to get funding from some sources, and they say "It's all nonsense" to get funding from other sources. The point is to keep the lab running and the staff paid. Science is a business too, and all scientists are about equally culpable for wanting to get paid.
Queen's English is a bastardization
Else you'd all be talking like Shakespeare (don't you wish). Beowulf was written in English too, kinda, but it is virtually incomprehensible now. Guess what, language evolves. It is a tool fit for purpose that changes as its use changes. Only an idiot would claim that their dialect has an inherently superior position.
Street dialects and IM-speak are the future of English: regularization of verbs, new vocabulary, respellings. Abbreviations are more temporary; the best ones stick. It's a Good Thing.
Now, the decay in speech clarity, *that's* something worth whining about...
Who's got my coat?
My understanding was the Royal Navy wanted machine-perfect trig tables for navigation, as there are several clusters of wrecks partially attributable to faulty tables. Babbage had a hand in not getting the thing built, doing a bait-and-switch on his even-more-ambitious analytical engine. It was a classic defense procurement fustercluck.
Is anyone else disappointed by the raw 21st century austerity of the replica? In the 19th century, the gears would have been cast, with elegant spokes and a pebble finish, machined only on the working surfaces. It would have *looked* Victorian. In replicating the function, I think the fabricators missed out on replicating the feeling of the piece. The replica looks big and fiddly, but it isn't *pretty*, as it probably would have been.
The whole thing's a metaphor. A Microsoft exec wants the thing enough to pay a million dollars for it, and wants it in his living room, and it works, but it's got no soul. Sigh.
author doesn't understand accessors (set/get methods)
The purpose of accessor methods is to abstract the data in your class. Having a protected bullets counter is only scarcely different from making it private and having getbullets() and setbullets() methods.
The *reason* you abstract the bullets counter into accessor methods is so you don't have to have a bullets counter at all (!)
What if you are building a class of automatic weapons, and you want to implement all firing methods in terms of a burst. Your tommy gun fires a ten-round burst. Your pistol fires a 1-round burst. You want to talk about your bullets in terms of bursts, instead of bullets.
Having a protected bullets counter fixes your implementation in terms of bullets, not bursts. You want to write your data access in terms of bursts and the operations you wish to use on bursts, like burstfire() which consumes a burst and reload() which puts a new magazine in, containing however many bursts it contains.
Now, you might decide that the operations you want to support are firing bullets, not firing bursts. In that case, the operations on bullets are expend() and reload(). Now you aren't counting bullets. You're performing allowed operations on your chosen abstraction. It's not the choice of abstraction that matters, it's the use of abstraction.
green card lawyers
I still have my Green Card Lawyers -- Spamming The Globe T-shirt from the first organized effort to stop spam by denying spammers a UUCP connection. This was back in the days when system admins thought they could cut off spammers' air supply. 1993 I think. But there was always someone willing to forward spam for money. Sigh.
10 years ago you had cell-phone pirates driving vans full of $100k RF scanners around city neighborhoods to get cell phone codes to clone. Size and cost are not obstacles for these guys. Only the cost/benefit matters.
The more value we put in encrypted data streams, the more value in cracking them. It's not like a custom 8-layer board is all that expensive, or requires you to have your own manufacturing plant. Even big FPGAs are less than $100. And your laptop makes a dandy controller.
Welcome to the year when professional engineers go bad for profit.
no help at all == the agile way
So, you are agile if you don't compromise, and build a perfect system, but not so perfect that you do unneeded work. And no advice on how to achieve the nirvana of perfect design.
I now publicly cry "bullshit" on the agilists who say "if you are agile, you do it perfectly, and if you don't do it perfectly, you obviously weren't agile." People like this should not be allowed to blog, let alone design important software.
Real software developers should stand up on their hind legs and laugh at agilists, unless they cough up some actual advice on how to do a good job. And I don't mean saying, "Be agile." I mean what, when, where, why and how kind of advice.
All I've read of agile design by people with actual advice to give would cause agile teams to compromise heavily so as to begin earning value quickly. If an awkward API caused issues, it would be cleaned up incrementally, not by an extended design process.
What's up with this contradictory advice?
only the loser commits war crimes
The concept of War Crimes is not valid.
"War crimes" prosecutions are a final outrage, perpetrated by the winner on the helpless loser. In any kind of asymmetrical conflict, the outcome is certain, with the winner using superior weapons, troops, and logistics to brutally destroy the loser. The presumed loser in such an asymmetrical conflict makes a rational choice to use weapons and tactics that the winner labels unfair, because if he fights the war on the presumed winner's terms, the outcome is certain.
No third party could distinguish the asymmetrical kill ratio obtained from overwhelming firepower from the asymmetrical kill ratio inflicted by (for instance) a weapon of mass destruction. In the end, either route results in body bags.
If the concept of war crimes can ever make sense, it only applies between opponents of approximately equal strength, where use of proscribed weapons is available to both sides, and serves only to increase the body count without providing a strategic advantage. It is a leftover notion from world wars. In the modern world of superpowers, it has no meaning.
it's just gotta be hackable
I would think that there must be a way to spoof the machine. Gun in a bag of jello taped to your tummy or in your mastectomy bra, flat knife under neoprene wrap, that sort of thing. The guards will figure it out in a week. It'll be on the internet in a month, and al-Qaeda will be using it in a year. L3 will have their pork by then, and we'll all be left waiting even longer in line because the metal detector is faster than the mm wave machine.
It's all a way to save energy, by making flying so uncomfortable and spooky that nobody will do it any longer.
you don't live in a laptop
I'm a big fan of privacy, but... If it's ok to search papers in a briefcase, it ought to be ok to search a laptop, and for the same reason. Sure, a laptop is getting to equal a fatter and fatter briefcase, but that's not the point. The point is, you are travelling. You aren't at home. You knew you were subject to search. You chose what things to take with you, and what things to leave in a secure place when you left.
If you chose to take a baggie of dope, that wasn't very clever of you. If you chose to take 10,000 child porn images, that wasn't very smart, and it doesn't matter if you printed them out or put 'em on your laptop. If you took trade secrets or sensitive documents, shame on you for exposing them to high risk of theft or loss, because you were *travelling* with them. They weren't secure. Duh.
What kind of excuse is, "uh, um, I forgot to take the dope out of my briefcase, so you uh, um, shouldn't be able to bust me when you noticed my hash pipe on the x-ray. Like, bummer man!" Isn't the argument pretty much the same with the kiddie porn images?
Now, I wonder what the outcome would look like if every file was encrypted, and you said, "Well, duh! It's a laptop. I'm travelling. It might get stolen or lost. Of course everything's encrypted." Reasonable search would not extend to the password, which was *in your mind*. I suppose the NSA could pwn you if it wanted you bad enough, but suspicion of possible stuff is probably not enough unless you're already on a list.
Of course, you'd be the only computer user anywhere and throughout all time ever to not have been stupid with the contents of their laptop, so maybe this constitutes probable cause. Or export of sensitive encryption technology. Or something for which there is a jail-time penalty if they want you.
Paris, because she'd try this argument.
Let's see, T-rays, carbon nanotubes, quantum computers. Check. Can I build a quantum computer using either of these? Hmmm... enough to get funding? Right then.
What about high-Tc superconductors? Nope, no longer hot. Put that one back in the bin with cold fusion. (It will make me weep if we eventually discover a high-Tc superconductor using carbon nanotubes, that wasn't discovered because the two trends weren't hot for funding at the same time).
Well, there's also the issue that Baghdad time is about 12 hours off Las Vegas time, so the roboflyers are all working in the middle of the night. That'll disrupt your domestic relationships big time unless your wife is a vampire.
It's the perfect recipe for stress. Probably should add that the airbase is probably a very dry toilet of a place, and that your pay probably sucks since you aren't in combat. How many things should you pile up on top of these poor schmucks before expecting them to suffer stress?
Black helicopters, because you don't have a tanks & guns icon.
I'm told navy pilots hate the Phalanx antiaircraft guns on Aegis-class cruisers, because the Phalanx (1) sprays a solid stream of depleted-uranium milk-bottle-sized projectiles that can ruin your whole day, and (2) tends to creepily track the nearest air target, which is usually friendly helicopters doing takeoffs and landings.
Interestingly, as an IT angle, the Phalanx gun is also known to occasionally shoot down civilian Iranian Airbus jets. Seems its primitive software only classifies air targets as "friend" or "foe", with no intermediate category for "I don't know what this target is." I wonder if that got fixed?
So when you request that change to the software, do you call it a bug fix or a new feature request?
Does anybody remember when Longhorn was going to be released in 2005, and it was going to have that whizzy filesystem-is-a-database, and all that other nifty stuff? Well, welcome to Vista SP2. You'll be expected to pay for it, which is great news (for Microsoft) for a service pack. But it will have good compatibility with the previous O/S version. In 2013 when this code finally escapes from Redmond, it might actually be the O/S that Microsoft bragged it would be. Yeah, 8 years late, but maybe worth the wait. Well, I'm waiting, anyway.
The penguin, because if you need a new O/S before 2013, he's always there.
What will make this 20-year-old advice easier to follow today than it was way back then?
I believe Edsger Dijkstra said something that amounted to "if you could program a computer in simple English sentences, you would discover that programmers cannot write simple English sentences." It's absolutely true that good cohesion and separation of concerns are two aspects of a clear program. Knowing that fact does not, apparently, induce programmers to write in that fashion.
Has anyone actually *read* Asimov?
Asimov wrote the famous three laws as the background for a collection of short stories demonstrating just how and why the laws don't, and can't, work.
Besides, who's gonna buy a robot that won't do your killing for you? Not the boys in olive drab. Not the rich. Hackers might, but they'd just fiddle it. Actually, I guess with hackers it doesn't matter who buys it if they can fiddle it.
antagonizing a criminal
So, 'Marc' recommends hassling phishers. I can only guess that Marc still retains the belief in immortality characteristic of the very young.
Taunting a person who you already know is (1) a criminal and (2) a hacker is just an invitation for that lamer to pay obsessive attention to giving you the very worst day possible from halfway around the planet. It's as smart as picking a fight in a biker bar. You're gonna get something kicked.
Lots of people get older without having to learn in this particularly painful way, but some people always volunteer to serve as a warning to the rest of us.
Of course, it's always possible that 'Marc' is a recruiter, rather than a volunteer...
There is a hardware crisis, not a software one. The *hardware* makers can't make a core go faster.
I won't be simulating weather or challenging chess grandmasters on my PC any time soon. But I will keep typing characters into my word processor, linearly, one character after the other. Most computer programs are sequential, because most of them implement processes that happen over time. Duh. I can use a little bit of concurrency if I want to type and compile at the same time, but oops! my three-year-old 2.5GHz P4 is way fast enough, even time slicing.
The sad fact is, if we can't find a way to sop up all those cores, then consumers will demand that prices fall as parts get smaller and cheaper to make. And we can't have that, oh no! Moore's Law was a free lunch, not just for lazy coders, but for lazy chip makers too.
And it's over. Even though you can still put more aggregate power in a chip each year, you can't put more sequential execution speed in anymore. Time to look for a new industry to invest in, or else time to get way smarter about using the resources you have to do more (sequential) work.
Maybe there's a voice recognition or virtual reality app out there somewhere, that everyone will want, that will sop up 128 cores and save Intel. Maybe it's AI. But it won't come from parallelizing inherently sequential applications.
I hear biotechnology is the hot new investment area. Maybe they can use 128 cores for something. Something specialized that can be written by experts.
comments are as useless as...
Comments are not needed. They are as useless as spaces, punctuation, multi-letter variable names, type declarations, classes, and every other artifact of symbolic programming. Everybody knows that the real way to program is by simply providing each program's Gödel number. Anything more is redundant, right?
Comments tell you why. The code can't tell you that. Comments tell you what the statement means in the problem domain, even when the code does not. Comments give meaning to programs written in languages that do not perfectly capture the meaning of what you are doing.
Comments remind you why you chose to do something clever rather than something simple. Comments remind you about things you had to learn in order to understand the code.
Good programmers write comments. Programmers who don't write comments are either doing work too boring for good programmers, or they are too fresh to have learned the lessons taught by refactoring their own code.
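Here's a minimal made-up example of a "why" comment (the function and the profiling story are hypothetical, purely for illustration):

```cpp
#include <cstddef>

// A "what" comment just restates the code. A "why" comment records
// the reasoning you'd otherwise have to rediscover next year.
std::size_t hash_bucket(std::size_t h, std::size_t n_buckets) {
    // Why the clever version: n_buckets is always a power of two,
    // so masking equals h % n_buckets but skips the integer divide
    // (hypothetically, profiling showed the divide dominated lookups).
    return h & (n_buckets - 1);
}
```

The code says "and with n-1"; only the comment says why that's safe, and when it stops being safe.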
It makes me so tired to hear these rationalizations over and over through a long career.
There are companies looking for top people, and willing to pay for them. For these companies, there just aren't enough applicants. I feel sorry for these guys.
Then there are companies looking for code monkeys to write login pages and data edit screens. They hate to pay the $60k a new US grad gets. They'd like to pay, say, $30k. I would venture to say that there is, indeed, a shortage of US tech workers who want to give up half their wage to do boring work in a sweatshop.
Microsoft is interesting in the debate because it has both problems. Microsoft has the additional issue that they like to hire new grads who don't have a life yet, and work them 60 hr/week (paying for only 40 of course) so that workers never have time to acquire a life and its attendant distractions.
A Modest Proposal
What if H-1b visas were portable, so that once the visa was granted, the foreign worker could work for any employer for a period of time? The grateful worker would tend to stay with the company that sponsored the visa, but only if it paid a market-clearing wage. Scarce top talent could enter the US in unlimited numbers, without unduly damaging the labor market. And nobody gets to pretend they are the first kind of company when they are really the second kind. Bwah-ha-ha!
fingerprints at the scene of a crime
Wait wait. Here's the bad news. If someone is beaten up at the same bus stop where I happened to get on this morning, my file gets flagged. Now every time I go through a border, into a secure area, etc., I get interrogated or, worse yet, denied access with no explanation and no clear path to get it fixed. All because I was once standing within the arbitrary perimeter of a crime scene at some prior time.
We've gone from mistrusting people convicted of crimes, to mistrusting people suspected of crimes. This is the fundamental, unforgivable affront to our civil liberties. Every citizen *might* have committed a crime.
web 0.1 mastodon
Wow, that was perhaps the lowest hit ever for the Register. How do you look at yourself in the mirror? Keep up the good work.
If an H-1b visa gave the holder an unlimited right to work for any employer for a period of time, no restrictions would be necessary. Denied the slave labor pool, industry's appetite for foreign workers would decrease.
everything old is new again
If recycling will save the planet, then the recycled platitudes and fallacious arguments in this one thread should go a long way.
It's clever of "Tom" to say C++ must be easy to learn because C and sed(??!) were. When C++ was coming into widespread use, it was a common argument that, "I learned Pascal and C in a few weeks, how hard could C++ be?" But the reality was different. If you knew Pascal, then learning C was just learning how to spell the keywords; the coding paradigms were the same. But with C++, you *also* had to learn OOP. And exceptions. And generic programming with templates. And STL. Learning C++ was like learning C half a dozen times. Everybody thought they'd just quickly master C++, but in reality it took even good programmers 2 years.
Then there's "auser" and the Smalltalk wheeze. C++ isn't *really* object oriented because it isn't like Smalltalk. C++ can't do dynamic programming (by which I assume he really means polymorphism, which C++ does fine, just not like Smalltalk). It's easy to learn your first programming language. It's easy to learn your 10th programming language. But learning your second language is hard, because it isn't just like your first language. So that second language seems all wrong. Once you've learned a dozen languages, you know that.
Only these days we don't learn a dozen languages. Innovation has stagnated. C++ sucks in many ways, but it works reasonably well. And a C++ compiler is a humongous thing. Wirth turned out a compiler for Pascal in about 5000 lines of Pascal code. I don't know how big GCC is, but I'm guessing a million lines. Not an easily duplicated effort for a single researcher. That leaves Microsoft in charge of innovation, with predictable results.
C++ is too hard to learn
C++ is old enough to vote. What's up with that?
I love C++. I code in C++ every day. It's 20x faster than Java, easier to debug than C. It's very expressive and powerful.
But C++ is too hard to learn. Type conversion operators don't work like you'd expect. The name lookup rules are complex. Templates are powerful, but damn near impossible to debug. The body of knowledge on how to use exceptions effectively is still in its infancy. It takes two full-time years to learn the basics of C++. Anyone who thinks he's "mastered" C++ in less time is delusional.
Did I.Q.s drop in academia since I went to school? Is there truly no way to improve on C++? Give me a break.
Any claimant to revolutionize transportation has to show their invention is an *improvement*. OK, citycars fold up so parking is easier. But you don't own one. They're too small for me and my four kids. I betcha they only go 30mph. And they fold up, which is less good in a crash than it is in a parking lot. So we get one feature and give up 20. Huge improvement, right? But enough to get that grant proposal through...
What would be an improvement is lighter-weight, recyclable materials for the structures of conventional cars, a lighter, more efficient powerplant, and attentive, courteous drive-by-robot controllers.
Or just ban the things. Bulldoze all American cities (which would be a good idea in any case) and rebuild for rail transit. Perhaps we'll get a shot at this when sea level rises over most of our coastal cities. Hmmm, global warming as an opportunity. It's the American way.
The climate is going to warm, and sea level is going to rise, whether we reach out into space or withdraw into caves. Blame your parents. It's 'em wot done it. Can we move on now?
It is in our nature to degrade our environment. If that was all we did, the earth would be a sterile wasteland covered with our bones. Fortunately, it is also in our nature to try new things, and to do old things better. That is to say, Webster, it's good that somebody gives a shit. Now it's time to withdraw to your cave. You shouldn't be breeding.