'letting an AI rip on the unbalanced data simply trains it to be similarly biased. Hiding a field labelled "skin color" does not compensate for anything when the AI's algorithms charge ahead identifying the same patterns of biased social profiling by the justice system anyway.'
I would go so far as to say that the bias is the society the 'AI' was created in, and I put 'AI' in quotes because that is another can of worms.
The bias is there in the media, in government, in people, and so on. Funny how we are seeking a completely neutral (for a given value of neutral) approach to decision making. A neutral decision-making process is easier the simpler the process.
Take a court system.
If you assign a sentence to a particular crime, and that sentence is weighted by previous convictions, the age of the convicted and so on, then it should be applied regardless of anything else.
Now suppose you try to automatically bring in a mercy factor, or a mitigating factor, based on upbringing, lack of opportunities and so on. A person from a wealthy white background will then be penalised, because now we say 'you had every chance, yet still you did X'. That may be true, but in the context of the crime, is it also just?
It will never be a perfect system, just as the existing wetware isn't a perfect system. Human nature: we have consistently shown bias toward the powerful, whether that comes down to money and background/status, or to power awarded within whatever societal construct people happen to fall into (Soviet Russia etc.).
In attempting to leave our gods behind, declaring them either dead or never to have existed, we are trying to create new ones to replace them.
Oh the irony.