Or (as I would personally expect) will everybody get off scot-free, and all because it's 'for the children'?
While I'm not one to be especially worried about the Paedogeddon that's forever just around the corner, this system is working to remove actual images of child abuse. You know, the pictures of children being raped and whatnot.
Even if the scheme were put together with no consideration of the possibility that the hash of a given image might raise a false positive somewhere, the very nature of the work means the flagged image would need to be looked at by a live operator, who is quite capable of telling the difference between a tourist photo of the Colosseum and an image depicting child abuse. Remember, the hashes work at the file level; it's not an automated comparison of the actual image content, so you're not going to get odd false positives from a computer failing to work out the context of a given image.
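For anyone curious what "file-level" hash matching amounts to in practice, here's a minimal sketch in Python. The blocklist entry, set name, and function are made up for illustration (the digest shown is just the SHA-256 of the bytes `test`); the point is that the file's raw bytes are hashed, so only an exact byte-for-byte copy of a listed file matches:

```python
import hashlib

# Hypothetical blocklist of known-bad SHA-256 digests (hex strings).
# The sole entry here is simply sha256(b"test"), for demonstration.
BLOCKLIST = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def file_matches_blocklist(path: str) -> bool:
    """Hash the file's raw bytes and check for an exact blocklist match.

    Changing even a single byte (resizing, re-encoding, cropping) produces
    a completely different digest, so this flags exact copies only --
    no analysis of what the picture actually depicts takes place.
    """
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest() in BLOCKLIST
```

That "exact copies only" property is also why the odds of an innocent holiday snap colliding with a listed hash are vanishingly small.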
If the worst comes to the worst, a hypothetical image-sharing site decides it'll simply disallow uploads when a file hits the checklist, and you'll end up not being able to share that photo of Our Margaret with the horrific sunburn. How we laughed when Our Kev swapped the factor 50 for tanning oil.