The problem is
Tell all of that to the MSM and get them to change... Good luck...
Artificial intelligence these days is sold as if it were a magic trick. Data is fed into a neural net – or black box – as a stream of jumbled numbers, and voilà! It comes out the other side completely transformed, like a rabbit pulled from a hat. That's possible in a lab, or even on a personal dev machine, with carefully …
If you want to skip the hype and actually see what AI looks like, watch this Google Talk.
https://www.youtube.com/watch?v=u4alGiomYP4
It's not supposed to require a PhD, apparently, although after watching it I think it certainly helps!
It's interesting, although clearly little more than the Hello World of Machine Learning. So it's pretty clear that most "enterprise" developers are unlikely to be dropping low-level ML models into their next project. That's probably why Google's starting to also offer some "AI as a Service" APIs: you can treat the whole thing as a magic black box, and pay Google to do the thinking for you.
Yes. Here are some things I've noticed over the years; offered in case they help.
"A.I. There's a lot of hype..." El Reg (2017).
"A.I. is hard." This dates back to Minsky I think. Everybody keeps forgetting it every ten years.
"A.I. is hard, especially outdoors." Mine. Applies to self-driving cars. If you understand what Minsky means by "hard", then you'd be very cautious about standing in front of self-braking Volvos, or Teslas speeding towards you.
"A.I. outdoors needs senses." Sight, hearing, touch, smell... Self-Driving Cars will need hearing and smell. Plus vibration detectors.
Google Deep Dream suggests a conceptual approach to visualizing exactly what it is that your artificial neural net has *actually* learned. Nobody seems to have noticed this connection.
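The core trick behind Deep Dream is activation maximization: hold the network's weights fixed and run gradient ascent on the *input* until it strongly excites a chosen unit, which reveals the pattern that unit has learned to detect. A minimal sketch of the idea, using a toy single linear "neuron" in NumPy (the weights, step size, and iteration count here are arbitrary illustrative choices, not anything from Google's implementation):

```python
import numpy as np

# Toy "neuron": activation = w @ x, with learned weights w held fixed.
# Activation maximization ascends the gradient of the activation with
# respect to the INPUT, starting from noise.
rng = np.random.default_rng(0)
w = rng.normal(size=16)   # pretend these weights were learned
x = rng.normal(size=16)   # start from random noise

for _ in range(200):
    grad = w              # d(w @ x)/dx for a linear unit is just w
    x += 0.1 * grad       # ascend the activation
    x /= np.linalg.norm(x)  # keep the input bounded

# The optimized input now aligns with w: that direction is the
# feature the neuron "looks for".
cos = (w @ x) / (np.linalg.norm(w) * np.linalg.norm(x))
print(round(cos, 3))
```

In a real convnet the gradient comes from backpropagation through many layers rather than a closed form, and Deep Dream adds regularizers (blurring, octaves) so the result looks like an image rather than adversarial noise, but the optimization loop is the same shape.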
Cheers.