Posted by Richard Stallman's Political Notes ([info]syn_rms)
@ 2024-12-22 10:38:00


Computers judging how a person is treated

It is fashionable to adopt policies whereby a computer system judges how a certain person deserves to be treated, while the adopters "put a human in the loop" by giving per the job of looking at the computer's recommendations and authorizing them or not.

Experiments show that such systems systematically fail. The article explains why: "putting a human in the loop" is ineffective at correcting the computer system's errors, and in practice serves instead to excuse those errors.

The article linked to just above displays symbolic bigotry by capitalizing "black" but not "white". (To avoid endorsing bigotry, capitalize both words or neither one.) I denounce bigotry, and normally I will not link to articles that practice it. But I make exceptions for some articles because I consider them important — and I label them like this.

The experience with Israel's machine learning target selector system tends to confirm this conclusion.
