Re: Is it a bug?
If you have to insert an explicit rule, it's not AI. It's a human-written heuristic.
If the machine can't learn on its own, it's not AI either. It's a human-controlled heuristic.
If you have to spend your life telling it "Oh, and look out for this explicit thing that you get wrong", then you may as well just write a list of rules.
And the exact problem with these "deep learning" algorithms is that you can't just say "Oh, take this into account", because they aren't written that way; they learn from the data.
No, you have to go back, create test cases for every imaginable scenario, spend years training it on all of them, and hope it picks up on what it was doing wrong. And then someone comes up with, say, picture-in-picture, which confuses it again. Back to square one, retraining on that too.
So you can't use it for, say... crowd-based facial recognition (often advertised as a use case for such things), or self-driving vehicle cameras, because an attacker could make it flag ANYTHING at any point just by sufficiently distracting it - even with NO knowledge of its underlying training or algorithms. And you can't train it on every possible scenario well enough that someone trying to catch it out can't just make it flip.
Imagine telling even a 2-year-old that they're going to win the toy by getting the video right. And you show them a video with a thousand frames of tigers, and one frame of an Audi. Would you really ever expect them to press "Car" instead of "Animal"? In that case, the system is no better trained than a 2-year-old human - who can do a lot more besides.
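For what it's worth, the "one frame of an Audi" failure is exactly what per-video aggregation is supposed to prevent. Here's a hypothetical sketch (not any real product's pipeline - the labels and function names are made up for illustration) of a simple majority vote over per-frame predictions, where a single outlier frame can't flip the video-level answer:

```python
from collections import Counter

def classify_video(frame_labels):
    """Aggregate per-frame labels by majority vote, so a single
    stray frame cannot flip the video-level prediction."""
    counts = Counter(frame_labels)
    label, _ = counts.most_common(1)[0]
    return label

# 1000 frames labelled "animal", one stray frame labelled "car"
frames = ["animal"] * 1000 + ["car"]
print(classify_video(frames))  # → animal
```

Of course this only papers over the symptom: the vote assumes the per-frame classifier is mostly right, which is precisely what an adversary attacks.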