Q. If machine learning is so smart, how come AI models are such racist, sexist homophobes? A. Humans really suck

T. F. M. Reader Silver badge

"He was a pimp" ...

... is a statement of fact about one particular individual, not about "all men" or "most men" or anything like that. As such, it cannot be biased.

Might there be a problem with the research methodology?

Speaking of which, I would be curious about two pieces of further research:

1) I assume the "classifier" (after it is fixed, cf. above) can be run on both the training set and the AI output. Is the latter more or less biased than the former? If there is any significant difference, are the machines more politically correct or more in-your-face than the human authors of the original material? (A rough sketch of what such a comparison might look like follows below.)

2) I assume the output of the AI can be passed through some relatively simple software that corrects for the biases (if you have a "classifier" that detects bias, it should be possible to augment it with suggestions of what an "unbiased" equivalent would be). How similar would the outcome be to the "politically correct" speech that various busybodies try to impose on us humans?
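For what it's worth, the comparison in 1) could be as simple as averaging the classifier's score over both corpora, and 2) is just a lookup on top of whatever the classifier flags. The sketch below is purely illustrative: `bias_score` is a hypothetical stand-in for the researchers' (fixed) classifier, the toy lexicon and the corpora are placeholders, and nothing here comes from the actual research.

```python
from statistics import mean

def bias_score(text: str) -> float:
    """Hypothetical stand-in for the researchers' classifier.
    Returns a bias score in [0, 1]; a real version would be a trained model."""
    flagged = {"pimp", "thug"}  # toy lexicon, for illustration only
    words = text.lower().split()
    return sum(w in flagged for w in words) / max(len(words), 1)

def corpus_bias(texts: list[str]) -> float:
    """Mean bias score over a corpus of documents."""
    return mean(bias_score(t) for t in texts)

def debias(text: str) -> str:
    """Toy 'corrector' for 2): swap flagged terms for neutral equivalents,
    the way an augmented classifier might suggest rewrites."""
    neutral = {"pimp": "person", "thug": "person"}  # illustrative mapping
    return " ".join(neutral.get(w.lower(), w) for w in text.split())

# Placeholder corpora; in practice these would be the training data
# and the model's generated output.
training_set = ["human-written training documents go here"]
ai_output = ["text generated by the model goes here"]

delta = corpus_bias(ai_output) - corpus_bias(training_set)
if delta > 0:
    print(f"Model output scores {delta:.3f} more biased than its training data")
else:
    print(f"Model output scores {abs(delta):.3f} less biased than its training data")
```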
