Google's image-gen and editing apps *regularly* won't work on my photos if there's *any* hint of skin. Sometimes it will even reject editing simple (fully clothed) selfies of mine because it determines they're too risqué.
I submitted this photo numerous times and got rejected over and over, until the app finally accepted it. Same photo. Same edits. Different results.
This reminds me of the research I did years ago on the racial, gender, and economic biases in the three main image-gen models (at the time). I showed statistically that each model associated women with domestic abuse victimhood, men with aggression & violence (specifically frat boys and Black men), racial minorities with poverty, and well-paid professions with white men.
Now we're seeing more sexist bias in *AI tools* being put into commonly-used software. This makes me want to run tests on image edit rejection in Google Photos, using femme- and masc-presenting subjects in varying states of undress and comparing rejection rates across the categories.
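If I ever do run that test, the analysis itself is simple. Here's a minimal sketch, assuming I record accept/reject counts by hand for each image category (Google Photos doesn't expose its edit-rejection decisions through any public API, so the submission step can't be automated here, and the numbers below are placeholders, not real data). It just computes per-category rejection rates and runs a chi-square test of independence to check whether the differences are bigger than chance would explain.

```python
# Hypothetical analysis sketch: manually recorded accept/reject counts
# per image category, tested for differences in rejection rate.
# All counts below are made-up placeholders.
from scipy.stats import chi2_contingency

# rows: image category; columns: [accepted, rejected]
counts = {
    "masc, clothed":   [48, 2],
    "femme, clothed":  [39, 11],
    "masc, swimwear":  [35, 15],
    "femme, swimwear": [12, 38],
}

# Chi-square test of independence: does rejection depend on category?
table = [counts[k] for k in counts]
chi2, p, dof, expected = chi2_contingency(table)

for label, (accepted, rejected) in counts.items():
    rate = rejected / (accepted + rejected)
    print(f"{label:>16}: {rate:.0%} rejected")
print(f"chi-square = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
```

A low p-value alone wouldn't prove sexist bias, but paired with matched images (same pose, same clothing level, only the presentation varying) it would at least show the rejections aren't random.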