An American law student has published an analysis of international law regarding war crimes that might be committed using future brain-interface-controlled weapon systems. Stephen White, studying at Cornell Law School, had his paper Brave New World: Neurowarfare and the Limits of International Humanitarian Law published (pdf) in …
I think White's suppression plans wouldn't work and, since legal efforts in suppression are likely to undermine efforts to regulate the systems we _do_ get, White's paper itself has made future war-crimes more likely.
Thanks a lot, White!
The first thing we do is...
...kill all the lawyers.
As in, "I put on this hat that turns my unconscious thoughts into lethal force and then walk into an area full of civilians."
A Modest Proposal
>"For instance, quicksilver brain-directed weapons could conceivably be handy for close quarter gunfighting one day. But if the system is going from subconscious assessment to shooting without further ado, it must be aiming the gun itself as well as firing it - this thing is mainly an automated weapons turret on a robot, now. Why not put the operator and his wired-up brain off the battlefield via remote link, then? At which point the need to be really quick so as to keep him safe has vanished, so you may as well just give him a normal, consciously operated firing switch."
Well, why not go one further, why not take the next logical step? Why bother maintaining an entire expensive human body, with all those needs for feeding and exercise and sleep and healthcare, when all you need is a brain in a jar? Clearly, the future of conscription is to round up criminals, poor people, blacks, etc. etc., and harvest their brains for the army! GOD BLESS AMERICA!
Gross criminal negligence?
Basically the problem is that the soldier would be perceiving a threat and the weapon would fire before the soldier was sure it was a threat.
Accidents happen, and soldiers have mistakenly shot noncombatants. But if you fire at anything that moves which might be an enemy, you're setting yourself up for friendly-fire and crossfire deaths. It's criminally negligent to fire merely on recognising that something unknown is present; therefore, it'd be criminally negligent to equip yourself with a weapon that automatically fires for you on the same criteria. Using the kind of tech described would be like reflexively firing without restraint.
Restraint and poise in battle are difficult, but necessary. Soldiers in the major armies of the world are trained to know when to fire and when not to. Using the proposed firing system would be like voluntarily discarding that training.
This isn't as ground breaking as it sounds
There are already automated systems for identifying and engaging enemies - the Phalanx system comes to mind quite quickly. In that case, just as in brain-plug stylee systems, a switch is flipped (physically or mentally) and the system automatically shoots at the threat. There are still lines of responsibility, and the usual "nearest human" precedents apply.
He who deploys a booby-trap is culpable
I believe that in the US, if you deploy an automatically-triggered gun or bomb, you carry the blame just as if you pulled the trigger or pushed the switch. In this new case the soldier is the 'trigger,' so the higher-up who decided to install the weapon and cock it bears responsibility.
Presumably existing robo-kill gadgets don't evade war crimes either, do they? Seems like the same sort of concept.
A thought becomes an action...
... when it leaves your head and does something to the physical world around you.
Currently this only happens through your muscles, unproven psychokinesis aside, but this system is simply providing another way for thoughts to become actions.
A typical poorly reasoned law student argument.
Method of control doesn't really provide a defence - you'd be hard pushed to prove that the design of the controls made you do something. And if you could prove it, 'brain control' would be no different to a poorly designed control panel.
I suspect that the method of interface doesn't actually matter - brain->finger->weapon isn't significantly different to brain-> weapon.
In both cases you could end up reacting too quickly, and cause a bad outcome.
And insufficient buffer between thought & action has always been a problem in all areas of human activity.
In any case I suspect anything designed with this interface concept would be built to require an explicit trigger (i.e. a deliberate thought rather than subconscious) which negates the defence.
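To make that distinction concrete, the explicit-trigger gate described above might look something like the following sketch. Everything here - the names, the signals, the threshold - is invented for illustration; nothing in the article specifies an actual interface:

```python
# Hypothetical sketch of an "explicit trigger" gate: a subconscious
# detection alone never fires; a separate, deliberate confirmation
# signal is also required. All names and thresholds are invented.

from dataclasses import dataclass

@dataclass
class BrainSignal:
    threat_score: float        # subconscious threat assessment, 0..1
    deliberate_confirm: bool   # explicit, conscious "fire" intent

def may_fire(signal: BrainSignal, threshold: float = 0.9) -> bool:
    """Fire only when a strong threat reading coincides with an
    explicit conscious confirmation - never on detection alone."""
    return signal.deliberate_confirm and signal.threat_score >= threshold

# A subconscious spike without deliberate intent is ignored:
assert not may_fire(BrainSignal(threat_score=0.95, deliberate_confirm=False))
# Both conditions together permit firing:
assert may_fire(BrainSignal(threat_score=0.95, deliberate_confirm=True))
```

With a gate like that, the "my subconscious did it" defence evaporates: the conscious confirmation is legally equivalent to pulling a trigger.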
As for holding engineers liable - if you build a tool, you may well find yourself liable if it does something unintended but if it works as intended ultimately liability lies with the user.
Any object can be used for bad purposes - hammer, knife, car, biro(!) - and you don't hold the person who designed it liable when this happens.
Even the majority of 'military' objects can be used for defensive as well as offensive roles - and it gets even more complicated when you consider 'lesser of two evils' type arguments. It's a bit more difficult to argue when it comes to things like nuclear armed ICBMs - though even these can be said to have a 'good' purpose if used purely for deterrent.
So it's the end user who holds the can, not the engineer. Unless of course the engineer made a cockup!
The conscious act is wearing that weapon
If someone were to strap on a gadget that sometimes randomly fired a gun at people passing by, in the full knowledge that it did so, and walked down the High Street wearing it, then if someone got killed it would be judged as murder.
If a soldier knows his kit may randomly kill innocent people and yet deliberately puts it on and goes out wearing it, then he is guilty if anyone is killed.
Thoughts never harmed anyone previously; now, according to this, they will be able to? In which case, someone fitted with this must be responsible for their own thoughts and actions.
"That would be fine if the soldier's brain had correctly spotted a legitimate target, but obviously less so if instead a noncombatant got smoked. That would be a war crime."
Last I checked, it wasn't a war crime when a civilian gets caught in the crossfire. It's bad, yes - particularly so for the civilian involved - but an instinctive self-preservation act by a soldier in a war zone can hardly be called a war crime.
In fact, as I see it, war crimes are by definition the opposite of the issue here: They're premeditated actions taken against a civilian (or military, if they're bad enough) population with either no military value or military value only in their effect on others.
Unless these brain-plug weapons are making long-term decisions for the soldiers, I fail to see how their use would have anything to do with war crimes, unless it was determined that they were used, on purpose, to trick men on the ground into firing on civilians. But then it would be the brass and not the soldiers responsible - and any war crimes court would hopefully see it that way.
Well, Mr. Page...
...I commend your stoutheartedness and overall resilience in reading, analyzing and commenting upon such an exercise in sophism, retardedness and pointlessness. In fairness, the author might be a 2nd semester student trying to say something of interest.
"International humanitarian law, therefore... should create incentives to produce maximally predictable and accurate weapons."
Sure, let's see humanitarian law incentivize deathboffins (put on notice of liability for indiscriminate slaughter beforehand) towards the creation of better weapons that discriminate in an age of asymmetric warfare. Whalesong please!
I'm waiting for the first ambulance-chasing lawyer coming out of the Green Zone Front Door trying to check on whether that JDAM was actually well-placed.
only the loser commits war crimes
The concept of War Crimes is not valid.
"War crimes" prosecutions are a final outrage, perpetrated by the winner on the helpless loser. In any kind of asymmetrical conflict, the outcome is certain, with the winner using superior weapons, troops, and logistics to brutally destroy the loser. The presumed loser in such an asymmetrical conflict makes a rational choice to use weapons and tactics that the winner labels unfair, because if he fights the war on the presumed winner's terms, the outcome is certain.
No third party could distinguish the asymmetrical kill ratio obtained from overwhelming firepower from the asymmetrical kill ratio inflicted by (for instance) a weapon of mass destruction. In the end, either route results in body bags.
If the concept of war crimes can ever make sense, it only applies between opponents of approximately equal strength, where use of proscribed weapons is available to both sides, and serves only to increase the body count without providing a strategic advantage. It is a leftover notion from world wars. In the modern world of superpowers, it has no meaning.
I don't think the military will ever accept a weapon that will fire at times and targets they cannot predict.
They're not idiots about things that go boom - not the ones in charge of other peoples' lives, anyway - and they seem to have treated even the rare short circuit or buggy program that throws lead or drops steel with extreme concern.
To tell them the new small arms their troops walk around with might spontaneously shoot things a soldier's subconscious (a thing we do not yet fully understand) considers threatening seems quite unlikely to be well-received if they intend to continue fielding soldiers in groups of more than one and in places where there are non-targets.
So I'm not worried.
'Course, if I see a borg-troop, I'm going to make myself quite scarce.
The problem as I see it
is that there are too many lawyers in the world. Personally, the sooner we can ratify international law to enable the global culling of lawyers, the better. In fact, substitute law students for baby seals; I'm sure Greenpeace would be on board with that.
The sooner we can thin the multitudes of blood sucking ambulance chasing oxygen stealers the better off we will be
re: a modest proposal
Using a criminal brain is a terrible idea, you'll end up with a drug addicted psychopathic killing machine.
As long as we can make a 'good' robot with a dead hero cop's brain, maybe we can turn back the tide of the stain on our good society that is Nuke.
Quick change the law
We need action now. Anyone thinking bad thoughts is committing a crime.
A crime remains a crime despite attempts at obfuscation
There is no way out - as long as we acknowledge the rule of law, that is. If a soldier shoots a noncombatant at present, he commits a war crime. The fact that he is tired, nervous, worried the non-combatant might be about to shoot him in the back - none of that matters.
Give the soldier a clever automated system that reads his mind and shoots for him, and the situation becomes more complicated. It's not immediately clear who is responsible, but it is still obvious that a war crime has been committed by someone. Trying to split up the responsibility so many ways that it kinda fades away altogether is a typical bureaucratic trick - the Met did exactly that with Jean Charles de Menezes' shooting, which was obviously murder but ended up not being pinned on anyone. Likewise Harry Stanley.
But an honest analysis would blame either the soldier for agreeing to wear the kit, the people who designed and manufactured the kit, or (perhaps most reasonably) the high command who procured the kit and ordered the soldier to wear it. Why not just go the whole hog and blame them all? With perhaps a side order of attempting to pervert the course of justice.
will be developed
This technology will be developed and tested in virtual environments. It will also most likely be evaluated and viewed as a great asset by the military. After all the tests and probes it will be put to a first real-world trial in a combat situation. Soon some soldier will in passing have an unfavourable thought about his officer and the weapon will power up (for safety reasons it will probably not be fully activated in its first employment). This experience will scare the boots off the military and they will return the technology to base - delaying any further trials for at least another ten years...
White has done some good
He has started a discussion on this subject. His paper may be total tosh, I do not know. He has however raised some good points. How about a criminal war? I personally believe that the war in Iraq was based upon false information. Why are we not hunting down the people that provided these lies? Why aren't the politicians who started the war being tried to see if their actions were criminal?
The responsibility for a war and the actions that take place within a conflict should go *all* the way to the top.
I'm with the turning the thing on is the conscious act crowd...
Can we have a vote?
>Using a criminal brain is a terrible idea, you'll end up with
>a drug addicted psychopathic killing machine.
Or worse! Someone who drives too fast.
It'd be a fine bit of tech if it just throws a marker onto a HUD (assuming it can be made light enough and discriminate enough). Anything the trooper is looking at and pointing their weapon at has already been assessed and processed by the conscious mind; this'd be super-peripheral vision, and the dogface would have to acquire the sight picture and pull the trigger having been alerted, just as if they caught movement out the corner of their eye.
It's a long way from "detection -> flag it" to "detection -> bring weapon to bear using soldier's neuromuscular reflexes -> discharge weapon" and there are plenty of milliseconds for additional threat-discrimination algorithms to be applied.
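A minimal sketch of that "detect -> flag on the HUD, human fires" split. All function names, labels and confidence values here are invented for illustration; the point is simply that nothing downstream of detection discharges anything:

```python
# Illustrative sketch (all names invented): detections below a
# confidence bar are dropped, the rest are painted on the HUD for the
# soldier to assess; the system itself never fires.

def hud_markers(detections, min_confidence=0.6):
    """Return HUD marker strings for plausible threats.
    `detections` is a list of (label, confidence, bearing_deg) tuples."""
    markers = []
    for label, confidence, bearing in detections:
        if confidence >= min_confidence:
            markers.append(f"{label} @ {bearing:.0f} deg ({confidence:.0%})")
    return markers  # the human still acquires the sight picture and fires

print(hud_markers([("rifle", 0.85, 42.0), ("movement", 0.3, 190.0)]))
# -> ['rifle @ 42 deg (85%)']
```

The confidence threshold is exactly where those extra milliseconds of threat-discrimination would be spent.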
That sounds like a good plan...
There might even be a film in an idea like that. :-)
"If a soldier knows his kit may randomly kill innocent people and yet deliberately puts it on and goes out wearing it, then he is guilty if anyone is killed."
Last time I went to a war zone I was ordered to carry and use all my kit, it would have been nice if I'd had the option of not wearing any of it and getting a nice tan but that would have landed me in Colchester....
I'd like to join your Army Christof, what's the pay like?
@AC - Under orders
>it would have been nice if I'd had the option of not wearing any of it
"I was only following orders" has already been discredited as a defence.
I think that British soldiers actually have a duty not to follow an illegal order?
Destroy all rational thought
"the new small arms their troops walk around with might spontaneously shoot things a soldier's subconscious ... considers threatening"
I have a mental vision of the soldiers bursting into a room, and then firing wildly at spiders, bugs, ennui, childhood fears, "The Exorcist", the scary doll on the test card page etc, and other demons that lurk in the depths of the subconscious. How does the system differentiate between real military threats, e.g. a man with a rifle, and things that a man might also be subconsciously scared of, e.g. birds, sharp bits of glass, frogs, speeding cars etc?
There would have to be a control that prevents action on things the soldier perceives as only mild threats, in which case the baddies could circumvent this by dressing in lovely flowery dresses and painting their guns pink.
"Using a criminal brain is a terrible idea, you'll end up with a drug addicted psychopathic killing machine."
Er.. "Soldier" is shorter.
This is better. In this age of absolute military power, what is less important from a criminality pov is the indiscriminate killing of non-combatants, and what is more important is the indiscriminate choice of enemy.
The real war crime
is committed by the Presidents or Prime Ministers who start the damn things.
No truer words were ever spoken in this regard.
The loser is the only one capable of committing war crimes because by definition the winner has all the spoils and was justified in his actions.
Speaking as someone who recently taught a lesson on the Law of Armed Conflict to soldiers, perhaps I can shed some light on what constitutes a war crime ...
Disproportionate or unnecessary targeting of civilians is a war crime. Mistakenly shooting a civilian on the spur of the moment, in a combat situation, is legal. It's certainly a bad thing and is to be avoided at all costs*, but the LOAC has to vaguely adhere to reality, or people simply can't (and therefore won't) follow it.
* Not only because it's wrong in and of itself, but because of the knock-on effects on the soldier and the greater civilian population
Lets test the system..
.. on CounterStrike!
Rules of Engagement
I am surprised that Lewis Page did not mention these in passing. As I understand it, in UK military doctrine, Grand Strategy is the aims and ambitions of the politicians; Strategy, set by the senior commanders, describes the military activity intended to achieve the (military part of the) GS. Tactics are what the commanders on the ground (air, land, space) employ to deliver the senior commander's intent. The ROE are set by the politicians (in general terms) and are refined into military orders at the Strategy level.
The ROE are set such that acts of delivering military effect can (i.e. 'should') only occur towards achieving Strategy; failure to follow ROE is the 'war crime'.
I think the current UK military would find it difficult to define ROE that could adequately control a weapon of the direct 'thought->shoot' type.
Written by a law student?!
Either American law is *significantly* different from UK law, or he's an extremely poor student.
In general you need to prove two elements beyond reasonable doubt to show a crime has taken place. You need to show mens rea (guilty mind - usually intent, or negligence) and actus reus (the guilty act). If the soldier only falls foul of one of those two things he generally won't be guilty. By definition, if he only thinks something but doesn't carry out the guilty act (i.e. he doesn't pull the trigger) he can't be guilty.
[No third party could distinguish the asymmetrical kill ratio obtained from overwhelming firepower from the asymmetrical kill ratio inflicted by (for instance) a weapon of mass destruction. In the end, either route results in body bags]
The difference isn't the number of body bags, it's the contents of those bags. The rules of war say nothing about how many people you can kill, just about who you can kill and when. There's nothing wrong with killing 100,000 enemy combatants, but killing 1000 civilians is very, very bad.
@Ross and everyone else
Killing 1000 civilians is not a war crime. If it were, then every bomber pilot in WWII would be guilty of a war crime, on both sides. I know that there are some who think so, but so far there have been no convictions (or charges laid, AFAIK).
Standard Operating Procedures
OK, the US forces seem to ignore them (which is odd, because the hunting and fishing brigade ought to know better than to shoot at the moving bush and *hope* it's a deer), but isn't it SOP that if you can't tell whether it's an enemy, DON'T SHOOT?
I can shorten his entire theory into one word..
Even if it were technologically feasible, do you think the officers would let the enlisted run around with sub-consciously activated weapons?
Sir Frank Whittle
... invented the "turbojet", not the jet engine, 20 years after the jet engine was invented.