Engineers, coders – it's down to you to prevent AI being weaponised

Debate has raged for months over various internet giants' forays into providing next-generation technology for war. For example, in March, dissenters at Google went to the press about the web goliath's contract with the US military's Project Maven, which aims to fit drones with object-detecting AI among other things. This US …

Voland's right hand
Silver badge

Bollocks

and is for non-offensive uses only.

One word: Bollocks.

The Man Who Fell To Earth
Silver badge
FAIL

Re: Bollocks

Google AI folks & the like can stroke their egos & delude themselves all they want, but the only way they can avoid contributing to AI military weapons development is to get out of the AI business entirely.

Anything else, like signing petitions, is just intellectual masturbation and everyone knows it.

Pascal Monett
Silver badge

AI principles, yeah

I followed that link, and found exactly what I expected: a nice, touchy-feely, heartfelt list of things goody-two-shoes Google promises to do and not do with AI. Nice to see they have found the light.

But I'm sure they did it with the best of intentions.

annodomini2
Bronze badge

Re: AI principles, yeah

Until they set up another business unit under Alphabet and transfer the knowledge and tech; "but it's not Google!"

Symon
Silver badge
Holmes

"Thou shalt not make a machine in the likeness of a human mind."

The answer is a Butlerian Jihad?

Although it didn't entirely get rid of civilian deaths in the book, as far as I recall?

Pascal Monett
Silver badge

That it may not have, but it curtailed the hell out of computers, to the point where they had to invent mentats - human computers (because human, it was okay).

Because advanced civilizations will always need computers, whatever the form.

Alan J. Wylie
Silver badge

Dual use is hard.

Many years ago, I worked on computer aided mapping: semi-automated line following. Measuring the boundaries of all the woodland in the UK to calculate the total area, better 1:1250 maps with accurately plotted buried utilities to stop backhoes cutting fibre optic cables, what could possibly be wrong with that? Then came the Falklands war. Digitise the contours and produce a wire-frame perspective of Mount Tumbledown as viewed from Port Stanley, please.
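For what it's worth, the contour-to-wireframe job described above boils down to a pinhole projection of a height grid. A minimal sketch - the grid values, viewpoint, and focal length here are all invented for illustration, not from the actual system:

```python
import numpy as np

def perspective_wireframe(heights, cell=50.0, eye=(0.0, -2000.0, 300.0), f=500.0):
    """Project a grid of terrain heights into 2D screen coordinates.

    heights: 2D array of elevations (metres), one sample per grid cell.
    cell: grid spacing in metres; eye: viewpoint (x, y, z); f: focal length.
    Returns an array of (u, v) screen points, one per grid node.
    """
    rows, cols = heights.shape
    ys, xs = np.mgrid[0:rows, 0:cols].astype(float) * cell
    # Coordinates relative to the viewpoint (camera looks along +y)
    dx = xs - eye[0]
    dy = ys - eye[1]          # along-view depth, positive for this eye position
    dz = heights - eye[2]
    # Simple pinhole projection: divide lateral/vertical offsets by depth
    u = f * dx / dy
    v = f * dz / dy
    return np.stack([u, v], axis=-1)

# Toy 4x4 "hill": a peak in the middle, flat ground around it
grid = np.array([[0, 0, 0, 0],
                 [0, 50, 50, 0],
                 [0, 50, 80, 0],
                 [0, 0, 0, 0]], dtype=float)
pts = perspective_wireframe(grid)
```

Drawing lines between adjacent projected grid nodes then gives the wire-frame view.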

A few years later, I worked on CNC blade tip grinders to make jet engines more fuel efficient. Making 747s greener is great. But what if the US Navy want some for their fighters? Or the Army for an AGT1500 turbine in an M1 Abrams tank?

John Sager

Re: Dual use is hard.

+1. This stuff is going to get used for military purposes, just like all the other technical advances going back into prehistory. The better we know what AI can and can't do, the better we are at understanding what potential adversaries have available. And it'll also get used by 'our side' - for better or worse, it's the politicians who have to make the judgement calls, though no doubt there'll be a lot of screaming and virtue signalling going on in the process.

FWIW I think AI is pretty shit at a lot of this stuff currently but it will get better. The Met's fiasco of a face recognition trial demonstrates that, but the Chinese seem to be getting much better at it.

James 51
Silver badge
Big Brother

Re: Dual use is hard.

I doubt the Chinese are much better at it; they're just willing to pour more resources into it, and no number of false positives is too many as long as there are no false negatives.
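That trade-off is just the decision threshold on the matcher's score: push it low enough that you never miss a target, and you drown in flagged innocents. A toy sketch with made-up score distributions - the means, spreads, and population sizes are all assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical matcher scores: higher = more likely a match
genuine = rng.normal(0.7, 0.1, 1_000)      # the people actually on the watchlist
impostor = rng.normal(0.4, 0.1, 100_000)   # everyone else walking past the camera

def rates(threshold):
    """False-negative and false-positive rates at a given score threshold."""
    fn = np.mean(genuine < threshold)    # targets missed
    fp = np.mean(impostor >= threshold)  # innocents flagged
    return fn, fp

fn_strict, fp_strict = rates(0.65)  # cautious threshold: misses some targets
fn_lax, fp_lax = rates(0.35)        # "never miss anyone" threshold
```

Dropping the threshold drives the false-negative rate towards zero while the false-positive rate balloons, which is exactly the regime the comment describes.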

Korev
Silver badge

Re: Dual use is hard.

My physics teacher worked on technology used in head-up optics; when footage came back from the first Gulf War, it brought home the realities of what he was working on, and he changed careers.

jmch
Silver badge

Re: Dual use is hard.

"And it'll also get used by 'our side' "

The use of the phrase 'our side' is one of the big problems here. Dividing into us and them is what makes it easy for a half-trained army operative to press the red button. The AI said it's 'them', and that's that. Being fed over-hyped claims about the AI's capabilities doesn't help, of course.

Yes, any technology developed for whatever reason can and will be used as a weapon. But ultimately there is a guy pulling the trigger, a guy telling the guy to pull the trigger, and a guy* further up making damn well sure that triggers get pulled on command, OR ELSE. And these guys specialize in creating an us vs them. Classic divide and rule. Just don't believe the arseholes peddling this shit.

*It's almost invariably a guy

Anonymous Coward
Anonymous Coward

Re: Dual use is hard.

Reminds me of when, as an idealistic student many years ago, I put "prefer not to work on military projects" in the "any other comments" section of a job application form .... and got tackled on it in a first interview: "So you don't want to work on military projects - like missiles etc... What about working on a database system that gets used to ensure those missiles get deployed to the right places with all the right parts - would you want to work on that?" Don't think I got much further with that application!

nijam
Silver badge

Re: Dual use is hard.

Yes, better knives meant better swords, better engineering meant better guns, ...

Always has, always will. It's human nature.

Alan J. Wylie
Silver badge

Re: Dual use is hard.

better engineering meant better guns

Sir Joseph Whitworth's rifle

James 51
Silver badge
Black Helicopters

It's only a matter of time before that kind of pattern analysis gets applied to notionally non-military ends such as advertising and searching for terrorists/subversives/members of non-ruling political parties/people not in the right clique of the ruling party etc. etc., with equally incompetent results. Why have 1984 or Brave New World when you can have both with a side of Brazil as garnish?

Dan 55
Silver badge
Mushroom

See also...

- "Engineers, coders – it's down to you to prevent IoT exploits"

- "Engineers, coders – it's down to you to make better UIs"

- "Engineers, coders – it's down to you to increase software quality"

Oh dear.

deive

Re: See also...

Sounds about right - managers get all the extra money for the "responsibility", then shirk it at every opportunity, leading to https://www.theregister.co.uk/2015/10/10/vw_boss_engineers_blame/

Multivac

...... and physicists

It's down to you to stop atoms being weaponized, and chemists, you do the same for chemicals!

AdamWill
Silver badge

Re: ...... and physicists

"It's down to you to stop atoms being weaponized and chemists you do the same for chemicals!"

Well...uh...quite. Physicists and chemists have been struggling with this for decades/centuries (respectively). Haven't you *read* about how Oppenheimer and the rest of the Manhattan Project folks struggled with the implications and consequences of their work?

Charles 9
Silver badge

Re: ...... and physicists

There's another issue, too. What happens when it comes time to feed the wife and kids? If all roads lead to Hell, do you starve your family?

WibbleMe

I would like to introduce you to AI personalities, my demo will show you Marvin, the Paranoid Android AI, too depressed and timid to do fuck all. For more information contact me at

Douglas.Adams @ Hitchhiker - Guide - to - the - Galaxy - .co.uk

John Brown (no body)
Silver badge

6000 civilian deaths

I couldn't help but notice that the article does make the point that the AI is highlighting what may be interesting images or video and that the analyst then examines the images/video to see if they really are of interest. The article also points out that prior to any AI pre-selection, the human analyst would have looked at all of the images/video before choosing items of interest. At no point does the article claim that AI is choosing the items of interest itself and then acting on them.

It's an interesting hook to hang the AI ethics debate on, but I don't actually see anything specific in the article which points at the AI as the cause of civilian deaths. Is the AI making strong recommendations? Is a lack of context in what the AI is choosing leading analysts to see what might not be there? Is there some human failing by the analysts, such that "computer says yes" is biasing their decisions? Or is AI nothing to do with this, and it's pressure from the higher-ups, both military and political, to "get results" leading to going after targets with lower levels of confidence?

Mark 85
Silver badge

Re: 6000 civilian deaths

First a disclaimer/background. I'm a former US Marine who put time in Vietnam. Read into that what you may but this is from my perspective.

Maybe instead of blaming the AI, blame the "fog of war"? Crap happens. People die because of being in the wrong place at the wrong time. It goes back to the beginning of human civilization and warfare. Yes, war is hard on civilians. Always has been, always will be. The tactics used by Al-Qaeda are no different from those of the honored Resistance and others: hide in civilian populations and damage the other side as best you can.

We look back at wars since wars started, and civilians have always suffered the most. In ancient times, whole cities were slaughtered and destroyed. Move forward to WWII: carpet bombing of cities with incendiary bombs, etc. The atom bomb drops. Come forward some more and we have My Lai. Currently, we have terrorists, etc. still targeting civilians.

Does this justify civilian deaths by either side? No it does not. One would think that AI (once it's truly AI and not the current BS) has the potential, by giving choices and letting a human select the targets, to make things more finely grained. Having said that.. we still have the a-bombs targeting cities.

As has been said, "War is hell". What's needed is either that we as humans mature and find a way not to need war, or at the minimum, not take out civilians. I don't see the first being accomplished in our lifetimes, but maybe the second... maybe.

fajensen
Silver badge
Flame

Re: 6000 civilian deaths

People die because of being in the wrong place at the wrong time,

Seems to me that: People die because of *us* being in *all* the wrong places at the wrong time!

Why does a 1st world nation spend its youth, talent, resources, and never mind those billions of USD, obsessing over the ways of a bunch of 3rd-worlders in far-flung places that are basically no threat at all to anyone?

Especially since that other "we" - our "allies" and those "intelligence"-community factions bent on regime change - armed them and pointed them in the general direction of someone "we" don't like, and then, every.single.time, is once again shocked and surprised (since nobody ever gets fired or even shot for these things) when "our terrorist freedom fighters" turn out to be just "terrorists fighting freedom" - like, egg.actly as it said on the tin!

Institutionalised stupidity is what this whole "war on terror" has become. And with Mr. Flip-Flop Double-Down-on-Failure now running the show, it can only get worse. It's no big wonder that democracy doesn't exactly have the ring of quality it used to!

AdamWill
Silver badge

Re: 6000 civilian deaths

There is, of course, a significant difference in the current situation vs. the Vietnam War or the Second World War: they were actually wars. The "war" on terrorism is not. The US is not at war with Iraq or Afghanistan or anywhere else. Which only makes this all the worse.

Michael Wojcik
Silver badge

Re: 6000 civilian deaths

Maybe instead of blaming the AI, blame the "fog of war"?

Cori addressed this to some extent in the article, and it's been studied fairly extensively by psychologists and others. The issue with AI is that the more decision-making is delegated to technology, the easier human decision-makers find it to initiate or permit violence. That's been demonstrated in everything from organized large-scale combat to one-on-one interactions.

When drones are using ML to pick "plausible" targets, human operators will quickly adapt to simply confirming every suggestion made by the algorithm. ("Stupid bird! Why did I leave you in charge!")

Moral, tactical, and strategic decisions are expensive - they require significant conscious higher-order brain activity, which means they only happen when we make ourselves consider them, and they're tiring. They also carry post-commitment costs: second-guessing, guilt, etc. Those are very strong incentives to avoid them. If they can be delegated to a machine, that's exactly what most people will do.

True, wars are full of decisions - accidental and deliberate - which harm civilians (and unnecessarily harm combatants, for that matter). That doesn't mean there aren't very serious ethical and pragmatic issues with automating that decision process.

Wellyboot
Silver badge

Weaponised?

Our soft skin, slow running speed, small teeth and almost useless claws didn't make us the dominant species. All that is needed is a little thought, and anything has lethal potential.

Dripping water can be weaponised. Hands up who's not seen or used a bic biro blowpipe?

jmch
Silver badge

Re: Weaponised?

"Our soft skin, slow running speed, small teeth and almost useless claws didn't make us the dominant species. All that is needed is a little thought, and anything has lethal potential."

It's not just the brain power that made us the dominant species. It's the manual dexterity that allows us to turn those thoughts into physical reality, and most of all, it was thoughtless, mindless animal aggression.

Aladdin Sane
Silver badge

Re: Weaponised?

Also, the fact that humans are nasty, evil, vicious bastards. On a good day.

Michael Wojcik
Silver badge

Re: Weaponised?

most of all, it was thoughtless mindless animal aggression

Oh, I think many people have put plenty of thought into their aggression.

Anonymous Coward
Anonymous Coward

As if

Before the 'selective' targeting there were fewer civilian deaths... Killing a house full of the wrong people is a crime, but perhaps not as great a crime as obliterating an entire village. As soon as you have asymmetric warfare, where the combatants are mixed in with and indistinguishable from the civilian population instead of in nice military camps and uniforms, this is inevitable. The interesting question is who is responsible, and there are no easy answers. In the Korean War, my father's contemporaries didn't like shooting up ox carts, but when one in five is full of ammunition and explodes, and mixed in with the peasants going to market are peasants carrying supplies for the troops - where does the responsibility lie?

Ledswinger
Silver badge

Re: As if

Before the 'selective' targeting there were fewer civilian deaths...

Could you be more specific, because as I read that, I'm thinking you are very, very wrong?

fajensen
Silver badge
Terminator

Re: As if

Before the 'selective' targeting there were fewer civilian deaths.

Bollocks! The "terrorist / civilian" ratio in the drone wars is about the same as the "nazi / civilian" ratio that granddad managed to achieve - using far fewer resources and less gobbledygook to justify it - dropping dumb bombs over Dresden and Hamburg.

If we wanted fewer civilian deaths we could hang a few of "our side" for the same war crimes that we hanged most of the losers for. I'd bet that would put some much-needed discipline and professionalism into the war-"game".

The tragedy is that there is no personal or career risk in droning some 3rd-worlders, so why not?

Michael Wojcik
Silver badge

Re: As if

Could you be more specific, because as I read that, I'm thinking you are very, very wrong?

I think we were supposed to read the subject as an introductory adverbial phrase, which inverts the sense: "[It's not] as if before the 'selective' targeting...". But I could be wrong about that; the thesis is not entirely clear.

It's not what I'd call a well-constructed post. Some folks really need to do some revising before they click Submit.

LucreLout
Silver badge

Sorry Cori, I respectfully disagree...

The public needs a healthy dose of realism about how America has used and will use these technologies, and how the war on terror looks on the ground where it is waged.

They also need to understand why drones are deployed so often: IEDs. If you want to blow up soldiers as they pass, rather than fight a ground war with them, then don't be so surprised when their mates blow you up with a drone. Our country's first responsibility is to the men & women we send to war - whether you believe they should be there or not is completely irrelevant to that point - and we owe them the very best protection that may be provided them whilst deployed.

they never said why Salem and Waleed were caught in the crosshairs.

Well, most likely because of a mistake. Unfortunately, in war, mistakes happen - like in any other walk of life, just with bigger bangs and worse consequences. If the enemy combatants would stop hiding amongst the civilian population, or if the civilians would simply move away from the men with guns, then collateral damage could be greatly reduced. Expecting one side not to fire back is unrealistic and unhelpful.

A human fired the missiles, but did so, in part, on the software's recommendation.

And they did so in part due to standing orders, rules of engagement, and the situation in the given area. I don't follow all of ReSharper's batshit-crazy recommendations (or all I'd have are untestable static classes), and blaming the software for the human acting on its mistake is missing the point. It's why we don't allow automated firing by the AI.

in societies where most men are armed, and insurgents are interwoven and married into civilian populations, network analysis will always make mistakes.

Those societies and men have specifically chosen to impose a higher rate of casualties on their neighbours and families by living amongst them as enemy combatants. You spend your day shooting at soldiers and blowing them up with IEDs, then seek to complain when a drone takes out your house while you're having dinner? Frankly, that isn't a reasonable complaint to make - you made your bed, now die in it.

Some of Google's people seemed less concerned about moral balance than they were to avoid public discussion of the contract at all.

Moral balance doesn't mean anything. You think your morals are the correct set. I think mine are. They won't always align, so whose morals get primacy? Thus, your morals mean nothing to me, in the same way as mine mean nothing to you. You can't expect the rest of the world to work per your own moral framework. It's astounding how many seemingly intelligent people cannot grasp that simple fact.

Weaponized AI is probably one of the most sensitized topics of AI – if not THE most.

It is, and rightly so. I'm not sure anyone is yet advocating rolling out Terminator-style hunter-killers that purge a location of all humans, but that day will come eventually, unless terrorism is knocked on the head as a means of conflict. If you wish to be martyred, stand and fight like a conventional army. If you're frightened of dying, well, stop picking fights with other nations, and stop blowing up their civilians. If you don't care about, or deliberately target, their civvies, yours will one day become fair game, or at the very least collateral damage.

Let's take a moment to review what that phrase really means today. It means your civilians were viewed as expendable to the achievement of the mission. If that mission is to stop your menfolk blowing up our families, then it's wholly understandable why it is considered preferable for our drones to blow up your menfolk. Unfortunately for you, that may be after they pop home for lunch, and while aiding and abetting them, you might get killed too.

Under President Trump, the targeting rules have been made even looser, with predictable results: over 6,000 civilian deaths last year in Iraq and Syria alone.

As upsetting as that may be, how many lives were saved by the deaths of the primary targets, the enemy combatants? Gross numbers aren't nearly as useful as net figures. How many of our soldiers' lives are worth sacrificing to avoid what may be more or fewer civilian deaths if we use planes and tanks instead?

Do we even know if drones kill more civvies than bombers, fighter jets, helicopters, or tanks? Are some of the objections really just emotive, because there's no risk to the life of the drone pilot?

We all have a role to play in the debate about where AI should be used. But the most important audience is AI developers and engineers.

We do. And the number of soldiers I've met with serious injuries and dead friends due to IEDs leads me to believe that it is preferable to deploy drones to eliminate the terrorist threat rather than having our guys out there with their asses in the breeze. See, ethical and moral standpoints vary from person to person, so while you may feel they're a great decision filter, the filter comes up short when we account for interpersonal differences.

This is true mainly for the populations of wealthy nations. While you and I bicker on Twitter, buy crap on impulse, or do any of the things that figure in these TED-talk dystopias, Orwell is out there: for the poor, the remote, the non-white.

Race may be correlated with drone strikes, but it's absolutely not causal. The cause of drone strikes is terrorists planting IEDs, not prayer books or brown skin.

That's why some say engineering and computer science should be regulated like the old professions: medicine and law.

And I'd completely agree with you that they should be. However, don't for a second think that would prevent the development of autonomous drones or weapons.

Could unethical uses of AI land developers in hot water? Sure.

Illegal use, sure. Unethical? Not a chance. Your ethics have no bearing upon anyone's actions but your own, just as my ethical framework guides only mine. You've no more right to expect that I'll act according to your ethics than I have to expect that you'll act according to mine. It's the main problem with ethics.

That's what could solve the AI ethics debate – for those with the gift to code to think about what they are building.

If what "I" build helps save the lives of our soldiers who would otherwise be blown up by a terrorist IED in some godforsaken part of the world, then I could sleep real easy at night. There is, after all, nothing that compels these clowns to hide behind their wives when the drones come calling - in choosing to do so, they choose to make their families as expendable to us as they are to them.

I don't build drone software and never have, but I certainly have no moral objection to it. Quite the opposite.

If they chose to wield their power for good, who knows what they could do?

Define good.

This is the point where simplistic and emotive rhetoric breaks down. Is it good that drones save the lives of our troops? Yes, absolutely it is and they absolutely do achieve that. Is it good that drones end the lives of terrorists before they can kill more of us? Yes, absolutely it is, and again they do achieve that. Is it good terrorists hide behind their families in an attempt to avoid the consequences of their actions? No, it isn't, but who made that choice? So whose fault is it really?

I'll get more downvotes for this than a bacon sarnie in a mosque/synagogue, but the point is there is always more than one viewpoint, and a reason why emotion must be kept out of such debates.

jmch
Silver badge

Re: Sorry Cori, I respectfully disagree...

"Our countrys first responsibility is to the men & women we send to war - whether you believe they should be there or not is completely irrelevant to that point"

This sentence makes no sense at all to me. If a country's primary responsibility (or one of them at least) in military matters is to take good care of the soldiers who put their lives and limbs on the line to protect our security, then surely the very first duty to those soldiers is not to send them into situations that have zero net benefit (and probably even negative net benefit) to the security of the country.

Secondly, you make the categorically wrong assumption that drones are supporting soldiers on the ground, but in fact in cases such as Yemen they are replacing soldiers on the ground. WTF is the US doing in the Arabian Peninsula anyway? Oh yes, helping to prop up the corrupt dynasties that sell us their oil by running after a 'terrorist' group that has almost zero international scope and is really a local insurgency.

Anonymous Coward
Anonymous Coward

Re: Sorry Cori, I respectfully disagree...

Yes - the point of all this targeting and intelligence is not to blow up more of the wrong people with bigger weapons, but to blow up fewer of the wrong people using smaller weapons. So if the result of my campaign to block AI weapons development is that more innocent people are killed, and more civilian property is destroyed, then what is my moral position?

jmch
Silver badge

Re: Sorry Cori, I respectfully disagree...

"If you wish to be martyred, stand and fight like a conventional army. If you're frightended of dying, well, stop picking fights with other nations, and stop blowing up their civillians"

Well, that's useful advice to the downtrodden citizens who are victims of a dictatorship. Maybe the French Resistance should have faced the German army, tanks and all, on the field of battle, just so it wouldn't be so inconvenient for Himmler to root them out?

Just as you say, everyone has their own moral code, so maybe your moral code is fine with blowing up people who may or may not be 'terrorists' (by whose definition, anyway?). In my book, targeting people to kill them just because they have a different moral code than my own is not OK. If they do try to come to my country and blow shit up, by all means do whatever is necessary to stop them.

Anonymous Coward
Anonymous Coward

Re: If you wish to be martyred, stand and fight like a conventional army.

i.e. gather all your mates in a large open space, away from any inhabited areas (a desert would be great, if available), and get ready with your AKs to face your opponents' conventional army...

....

Oh, pardon me, you said "IF YOU WISH TO BE MARTYRED", look, here it comes.... BANG.

CONGRATULATIONS, you

a) stood and fought like a conventional army (tick)

b) got martyred (tick)

MISSION ACCOMPLISHED.

rg287

Re: Sorry Cori, I respectfully disagree...

You spend your day shooting at soldiers and blowing them up with IEDs, then seek to complain when a drone takes out your house while you're having dinner?

Here's where the crux of your argument fails.

Back in Iraq, Coalition forces frequently came under mortar attack - i.e. indirect fire.

Frequently, they could spot someone (colloquially known as "a dicker") on a mobile phone or radio who was quite blatantly the spotter calling in the fire, but they were not authorised to engage because they were very deliberately NOT carrying a weapon and posed no direct/apparent threat to the troops.

They didn't need AI to do this, but fundamentally that's a signature behaviour. You're under fire, you can see one person with Line-of-Sight who is on a phone, it's probably them calling it in.

This was escalated and the ROE was changed so that someone who was part of a mortar team could be engaged irrespective of whether they personally were carrying a weapon.

Was that fraught with risk? Yes. Did you risk shooting some poor unarmed sod who was on the phone to their mum? Yes.

But the worst that happens is you shoot someone accidentally. That's quite bad, but it's better than following them home, dropping a 500lb bomb through their roof and killing their wife, children and extended family in the process.

When a drone lobs a bomb into a house (with - inevitably - a fuzzy number of occupants, with unknown identities), you risk enormous collateral damage.

When you industrialise collateral damage and make it accepted practice, you commit a war crime (except this isn't a conventional declared war, it's "counter-insurgency", so the USA likes to wash its hands in the grey area, same as with the non-POW "enemy combatants" in Gitmo).

Blowing up someone's family has nothing to do with your duty to your servicemen, or any Military Covenant. It's sloppy practice and - as others have pointed out - there's no duty to servicemen here: officially there are no ground forces in those locations. We're just flying in and bombing - proving the adage that a war cannot be won remotely. You need boots on the ground. And if there were boots on the ground (limited SF), then you'd fly teams in on helicopters (bypassing IEDs) for targeted snatch/kill missions.

LucreLout
Silver badge

Re: Sorry Cori, I respectfully disagree...

This sentence makes no sense at all to me.

That's because you're trying to conflate two separate and unrelated issues.

If a country's primary responsibility (or one of them at least) in military matters is to take good care of the soldiers who put their lives and limbs on the line to protect our security, then surely the very first duty to those soldiers is not to send them into situations that have zero net benefit (and probably even negative net benefit) to the security of the country.

You've assumed, without any evidence, that the current deployments have zero benefit, which is at best a debatable point, at worst pure ignorance. It simply doesn't matter whether you agree with the troop deployment; the duty, once deployed, is to protect them until they can be brought home safely. There's no room for variance or lefty whataboutery there. Sorry.

Secondly, you make the categorically wrong assumption that drones are supporting soldiers on the ground, but in fact in cases such as Yemen they are replacing soldiers on the ground.

Actually, you have misunderstood - that they replace troops on the ground is precisely my point. Too many ground troops were getting blown up by IEDs, which led to drone development, investment, and deployment.

WTF is the US doing in the Arabian Peninsula anyway?

Utterly irrelevant to the debate at hand.

Oh yes, helping to prop up the corrupt dynasties that sell us their oil by running after a 'terrorist' group that has almost zero international scope and is really is a local insurgency.

You've fallen off the fact wagon and into the swamp of your own idealism and political views here. Let's stick to the facts that may be established and keep the debate on focus.

LucreLout
Silver badge

Re: Sorry Cori, I respectfully disagree...

So if the result of my campaign to block AI weapons development is that more innocent people are killed, and more civilian property is destroyed, then what is my moral position?

Has that been established? I've not seen anything remotely resembling a fact that suggests it has.

Before we can establish that "AI" is a problem here, or even part of a problem, or even that a problem exists, we have to first establish what the collateral-to-target ratio is for "dumb" weapons. Then we can work out, per target eliminated, whether the collateral damage is better or worse. Then we can debate whether or not it is socially acceptable.

None of that work has been done yet.
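The comparison being asked for is simple arithmetic once the (currently missing) data exists. A sketch - every number here is entirely made up for illustration; no such dataset is cited in this thread:

```python
def collateral_ratio(civilian_deaths, targets_eliminated):
    """Civilian deaths per target eliminated; lower is 'better' by this metric."""
    if targets_eliminated == 0:
        return float("inf")  # all collateral, no targets hit
    return civilian_deaths / targets_eliminated

# Hypothetical campaign figures, purely illustrative
dumb_bombs = collateral_ratio(civilian_deaths=800, targets_eliminated=100)
drone_strikes = collateral_ratio(civilian_deaths=300, targets_eliminated=100)
```

Only once both ratios are measured on real data could anyone say whether "AI"-assisted targeting makes the collateral damage better or worse per target.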

LucreLout
Silver badge

Re: Sorry Cori, I respectfully disagree...

Well, thats' useful advice to the downtrodden citizens who are victims of a dictatorship. Maybe the French Resistance should have faced the German army, tanks and all, on the field of battle, just so it wouldn't be so inconvenient for Himmler to root them out?

In hiding amongst the populace, the Resistance accepted there would be significantly higher French civilian casualties and collateral damage. You may or may not view that as legitimate after the fact, but that is the choice they made. Did it end the war quicker? Maybe, but it definitively ensured innocent civilians were caught up in the sweeps and firefights.

Just as you say, everyone has their own moral code, so maybe your moral code is fine with blowing up people who may or may not be 'terrorists' (by whose definition, anyway?).

My moral code is perfectly fine with executing terrorists. I lose not one second's sleep over it.

The only definition that matters is that of the guy with his finger on the drone's trigger. Beyond that, there are international conventions and definitions for these things. Drone strikes where civilians have been injured or killed have all occurred on battlefields where terrorists are hiding amongst the civilian population. If they allow that, or fail to leave the area, then they are accepting the risk of becoming collateral damage. Sorry, but we can't simply not shoot back - it isn't an option.

In my book, targeting people to kill them just because they have a different moral code than my own is not OK.

That's not what we're doing. We're targeting them because they are terrorists who are actively engaged in the murder of innocent civilians - those not shielding combatants of any variety for any reason.

If they do try to come to my country and blow shit up, by all means do whatever is necessary to stop them.

There are those that would argue that is precisely what we have been doing since 2001.

LucreLout
Silver badge

Re: Sorry Cori, I respectfully disagree...

But the worst that happens is you shoot someone accidentally. That's quite bad, but it's better than following them home, dropping a 500lb bomb through their roof and killing their wife, children and extended family in the process.

Thankfully, that isn't what happened or how drone strikes are scheduled. Certainly it'd mark them as a person of interest and they'd be followed (likely by a drone camera) - the intelligence opportunity would outweigh the benefit of taking out a single spotter after the fact.

When a drone lobs a bomb into a house (with - inevitably - a fuzzy number of occupants, with unknown identities), you risk enormous collateral damage.

Fortunately, they only bomb houses where terrorists are known to be hiding. Yes, it is hard to account for collateral damage, but then, if daddy is a terrorist and insists on coming home at night, well, then daddy is a moron who has chosen to put his family at risk. They have chosen to let him.

You can't simply allow terrorists to escape because they seek to use their own families for cover. If they choose to endanger them, then that is their choice. The drone pilots do their level best to reduce collateral damage, because when dropping a 500lb bomb, there will always be some (same as the poor sod calling his mum who gets a bullet through the face).

then you'd fly in teams in helicopters (bypassing IEDs) for targeted snatch/kill missions.

Worked real well in Somalia, no? Blackhawk Down is a very prettied-up, turn-disaster-into-victory illustration of why that doesn't work so well as a plan. On paper, it's fine, but in the real world, it's way better to send a drone.

Anonymous Coward
Anonymous Coward

Re: Has that been established?

Ask the citizens of Guernica in 1937, London, Plymouth or Sheffield in 1940, Germany and Japan in 1945, of North Vietnam or of anywhere along the Ho Chi Minh trail in the 60s or 70s....

jmch
Silver badge

Re: Sorry Cori, I respectfully disagree...

"That's because you're trying to conflate two separate and unrelated issues."

You made the claim that whether one believes the troops should be there or not should not be a consideration with regards to their safety. I'm merely pointing out that the troops being in a war zone is 'de facto' the primary source of danger to them, and therefore it is by far the first duty of army leaders to only deploy soldiers if absolutely necessary. I think that is incontestable.

"You've assumed, without any evidence, that the current deployments have zero benefit"

You seem to be assuming, also with zero evidence, that these deployments do have some benefit. Of course I don't have any proof, because this is one of those things where a counterfactual isn't really possible. But history has countless examples showing that terrorists are more easily beaten by political dialogue than by military force. And those are facts that have nothing to do with my supposed "idealism and political views".

Cavanuk

Re: Sorry Cori, I respectfully disagree...

"but in fact in cases such as Yemen they are replacing soldiers on the ground."

Thereby preventing them from being killed and maimed.

Perhaps instead of targeted drone strikes that unfortunately also kill some civilians, we should just take out the entire village, town etc? No? Then a ground-force invasion must be your solution?

Some of the comments here are ridiculous. No one in the West wanted the ISIS caliphate to be set up and start terrorizing populations. Should we have let them just commit genocide against the Yazidis? Not our business? Charming attitude. Drone killings are a much better option than random bombing or ground invasion.

As was said above, if you don't want your civilian family killed, don't shelter with them.

There is nothing wrong with the basic concept of weaponized AI, if it helps to more accurately target the other side (yes, "the other side". It's a valid concept). The alternatives lead to higher casualties and threaten our own military.

LucreLout
Silver badge

Re: Sorry Cori, I respectfully disagree...

You made the claim that whether one believes the troops should be there or not should not be a consideration with regards to their safety. I'm merely pointing out that the troops being in a war zone is 'de facto' the primary source of danger to them, and therefore it is by far the first duty of army leaders to only deploy soldiers if absolutely necessary. I think that is incontestable.

Incontestable as you feel it may be, it isn't relevant.

You seem to be assuming, also with zero evidence, that these deployments do have some benefit.

Wrong. There's plenty of evidence that they do have benefit. How many attacks has OBL launched this year? None, because he's dead. How many of his own people did SH gas this year with chemical weapons? None, because again, he's dead. And so it goes.

But history has countless examples showing that terrorists are more easily beaten by political dialogue than by military force.

No it doesn't. It suggests that military force to eliminate the threat works best, then dialogue with the few remaining survivors - see ISIS, AQ etc for evidence. Even the IRA were militarily beaten - split top to bottom by intelligence assets and with limited remaining funding, they had no choice but to end their "war".

jmch
Silver badge

Re: Sorry Cori, I respectfully disagree...

"There's plenty of evidence that they do have benefit. How many attacks has OBL launched this year? None, because he's dead. How many of his own people did SH gas this year with chemical weapons? None, because again, he's dead. And so it goes."

As I said, this type of argument is moot because there is no counterfactual. However, it's not too controversial to say that without the power vacuum following SH's removal, ISIS would never have emerged.

In the case of OBL it was a targeted incursion against a known terrorist mastermind. I fully support his termination with extreme prejudice and his body being dumped at sea. But that's not the same as lobbing a missile at a house because the occupant fitted a profile based on AI that most probably is not as smart as is claimed.

*Incidentally* A genuine question that occurred to me: if a target is flagged up as a probable terrorist, and there is good enough surveillance on the target to know when he's at home (because you wouldn't want to waste a missile on his house if he's not in!), then why not attack when he's 'at work'?

P. Lee
Silver badge

Re: If you wish to be martyred, stand and fight like a conventional army.

I have to agree with you here.

The US military hegemony is so large that expecting a 19th-century meeting on a battlefield is ridiculous. The only way to win is to make the cost of war unacceptable to your opponents. In the case of the US, that means ensuring that the true (or exaggerated) civilian cost is publicised to the US voters.

The US strategy is to minimise the loss of US life with overwhelming firepower. Their opponents' strategy is to make every US strike an expensive one. I'm sure Sun Tzu would have something to say about using your enemy's strength against them.

Perhaps, rather than debating military strategy, we should be examining the civilian political decisions which lead to fighting.
