Computers judging how a person is treated

It is fashionable to adopt policies whereby a computer system judges how a certain person deserves to be treated, while "putting a human in the loop" by giving per the job of looking at the computer's recommendations and authorizing them or not. Experiments show that such systems systematically fail, and the article explains why. What it comes down to is that "putting a human in the loop" is ineffective at correcting the computer system's errors; instead, it has the practical effect of excusing those errors.

The article linked to just above displays symbolic bigotry by capitalizing "black" but not "white". (To avoid endorsing bigotry, capitalize both words or neither one.) I denounce bigotry, and normally I will not link to articles that practice it, but I make exceptions for some articles because I consider them important, and I label them like this.

The experience with Israel's machine-learning target-selection system tends to confirm this conclusion.