Are algorithms building the new infrastructure of racism?
"What gets chosen is usually whatever is easiest to quantify, rather than the fairest," writes Aaron M. Bornstein in Nautilus magazine. "While the architects of the new infrastructure may not have the same insidious intent, they can’t claim ignorance of their impact, either. Big-data practitioners understand that large, richly detailed datasets of the sort that Amazon and other corporations use to deliver custom-targeted services inevitably contain fingerprints of protected attributes like skin color, gender, and sexual and political orientation. The decisions that algorithms make on the basis of this data can, invisibly, turn on these attributes, in ways that are as inscrutable as they are unethical."