Google assisting the Pentagon in developing AI for its drones

Google is working with the US Department of Defense to develop AI algorithms that identify objects in videos taken by drones. So far it's unclear exactly which technology is being used and how deeply Google is involved, but it's all part of Project Maven, a programme the Pentagon launched in April last year. The goal is to start …

  1. Ole Juul

    Google and the military

    "The DOD has made public its intentions to ramp up research efforts in AI and machine learning and the need to partner up with industry."

    One would hope that a company with an international reputation for "do no evil" would be more careful to distance itself from military interests.

    1. Rebel Science

      Re: Google and the military

      They've been with the military, DARPA, CIA, etc. from day one. DeepMind was just pretending to be ethical and conscientious to attract top talent. The mainstream AI industry is not to be trusted.

    2. Mark 85 Silver badge

      Re: Google and the military

      You forgot the sarcasm icon... "do no evil" has been buried a long time ago by Google.

  2. Arachnoid
    Facepalm

    Development

    You would have thought Amazon would have been all over this

  3. Sleep deprived
    Happy

    I'm Feeling Lucky

    Google your name to see if death is coming from above.

  4. Denarius
    Happy

    au contraire

    Feel safe. Now that the DoD is involved, the process droids will kill it with the death of a thousand paper cuts. Being military is no defence against magical thinking.

  5. Anonymous Coward
    Anonymous Coward

    Gorilla warfare ?

    just saying.

    1. JohnFen Silver badge

      Re: Gorilla warfare ?

      Why do you want war against gorillas? What have they done to you?

  6. DougS Silver badge
    Terminator

    A company without ethics developing AI for the military

    What could possibly go wrong?

  7. Teiwaz Silver badge

    Google helping Military with object recog A.I?

    I can see the results now, the first couple will be sponsored shopping suggestions....

    Just what the troops need in a hairy furball.....

  8. tiggity Silver badge

    Sheepish

    I hope the AI is better than some of the widely reported visual AI systems of late, which seem to especially struggle with sheep. See the nicely titled "Do neural nets dream of electric sheep?":

    http://aiweirdness.com/post/171451900302/do-neural-nets-dream-of-electric-sheep/amp

  9. Anonymous Coward
    Anonymous Coward

    google AI

    bombing apple hq ring because "hacking"

  10. Anonymous Coward
    Anonymous Coward

    What a disaster

    Google has so many technical issues they can't find their arse with both hands and a road map. You can take it to the bank that Google will FUBAR any AI effort.

    1. ratfox Silver badge
      Angel

      Re: What a disaster

      I'm confused. Do you mean that's a good thing they're involved in this project, or a bad thing?

  11. Czrly

    But TensorFlow is Open Source!

    TensorFlow is Open Source, and Google and the whole machine learning sphere draw extensively from the open source community. That raises a chewy question for those of us who are NOT citizens of the USA and who do not get a democratic vote (degree of democracy and utility of electoral mechanism to be debated elsewhere) with which to take a stand for or against the actions of the US military.

    Essentially, whether one approves or disapproves, if one has submitted a patch to TensorFlow or any upstream component, one is contributing to their effort. If one has helped diagnose and debug an issue, one has played a role in this. Even those innocent and ubiquitous Google Captchas feed into this in some way -- how else will the DoD identify vehicles, shop fronts and street signs with high accuracy?

    This raises an important moral question about Open Source software. Your amusing cat-riding-a-skateboard detector might be used to target bombs in the future -- are you sure you want to give it away on GitHub or Kaggle Notebooks? Sure, this outcome is vanishingly unlikely. Sure, you can invent the "pacifist BSD" license and/or write "may not be used to target bombs" at the top of each Python script. The chance is still there and so the question remains open.

    Targeting bombs may be hyperbole, but the automated and widespread surveillance of private citizens of another sovereign nation -- citizens who have no vote against such actions -- is still wrong in my opinion. Whether some extra-judicial entity on the other side of the world labels those citizens as "terrorist" or "non-terrorist" is entirely irrelevant. Air-strikes are also a reality, and those air-strikes are triggered and guided by such surveillance. Air-strikes are unilateral acts of war (let's call it what it is) and do kill civilians. According to the USA, they also eliminate targets labelled as "terrorists" by the aforementioned extra-judicial entities. According to me, that is debatable at a higher, international level.

    1. Robert Helpmann?? Silver badge
      Childcatcher

      Re: But TensorFlow is Open Source!

      Czrly, what you bring up is really at the heart of what I think the ethics of this are. Not the bit about working on open source because it can be abused - that kind of thinking leads to stagnation as anything can be re-purposed to accomplish goals other than what was originally intended. The real ethical issues as I see them are 1) when is it ethical to develop new weapons and 2) when is it ethical to use them?

      The article brings up the idea that ethics are tied to risk analysis and that not enough has been done, but that is just a matter of spending the time, doing the analysis, and perhaps implementing and proving failsafes. Once we are past that, we are still stuck with the above two questions.

      My feeling is that AI used by a nation for political or military (what difference, really?) goals should be governed by the same rules as any other use or threat of force, but that is just my simple opinion. The first question -- when it is ethical to create a new weapon -- is much more complex, and I don't know where to begin on it.

  12. JohnFen Silver badge

    Evolution

    This seems like the natural evolution of Google's decision to be evil.
