1 Matching Annotations
- Jan 2024
- www.eff.org
Images of women are more likely to be coded as sexual in nature than images of men in similar states of dress and activity, because of widespread cultural objectification of women in both images and their accompanying text. An AI art generator can "learn" to embody injustice and the biases of the era and culture of its training data.
Objectification of women as an example of AI bias