3 laws of robotics
Asimov's 3 laws of robotics do exist, but, correct me if I'm wrong, I don't believe the US Military has signed up to follow them per se.
There is another set of three laws that I offer tongue in cheek, and hope I'm wrong about:
1. A robot will not harm authorized Government personnel but will terminate intruders with extreme prejudice.
2. A robot will obey the orders of authorized personnel except where such orders conflict with the Third Law.
3. A robot will guard its own existence with lethal antipersonnel weaponry, because a robot is bloody expensive.
There is also a decent enough argument for a 'zeroth law':
"A robot may not harm a human being, unless he finds a way to prove that in the final analysis, the harm done would benefit humanity in general."
Add a pinch of salt and refer to: http://en.wikipedia.org/wiki/Three_Laws_of_Robotics