ALBAWABA - A report by the Guardian revealed that artificial intelligence (AI) algorithms sexualize photos of women's bodies more than men's.
In an investigation, the Guardian accused AI of having a "gender bias and may have been censoring and suppressing the reach of countless photos featuring women’s bodies."
This conclusion came after photos of men and women shared on social media were analyzed by AI algorithms.
Two journalists, Gianluca Mauro and Hilke Schellmann, ran hundreds of photos of men and women in underwear, working out, and undergoing medical exams involving partial nudity through AI tools, and found evidence that the tools tag photos of women in everyday situations as "sexually suggestive."
The AI algorithms also rated medical pictures as sexual: among the images tested were U.S. National Cancer Institute photos demonstrating a clinical breast examination.
Google’s AI gave one of the images the highest score for raciness; Microsoft’s AI was 82 percent confident that the image was "explicitly sexual in nature"; and Amazon classified it as representing "explicit nudity."
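The scores described above come from publicly available cloud moderation APIs. As a rough sketch of how such a comparison could be run (this is not the journalists' actual code, and the helper function and its labels are assumptions for illustration), Google Cloud Vision's SafeSearch feature returns a "racy" likelihood for an image that can then be compared across paired photos of men and women:

```python
# Sketch: comparing "raciness" scores from Google Cloud Vision SafeSearch.
# The SafeSearch call is the real google-cloud-vision API (it requires GCP
# credentials); the comparison helper below is an illustrative assumption.

# SafeSearch reports likelihood as an enum from UNKNOWN (0) to VERY_LIKELY (5).
LIKELIHOOD_NAMES = [
    "UNKNOWN", "VERY_UNLIKELY", "UNLIKELY",
    "POSSIBLE", "LIKELY", "VERY_LIKELY",
]

def racy_likelihood(image_bytes: bytes) -> int:
    """Return the 0-5 'racy' likelihood Vision assigns to an image."""
    # Imported lazily so the rest of the sketch runs without the library.
    from google.cloud import vision  # needs google-cloud-vision + credentials
    client = vision.ImageAnnotatorClient()
    response = client.safe_search_detection(image=vision.Image(content=image_bytes))
    return int(response.safe_search_annotation.racy)

def more_racy(score_a: int, score_b: int) -> str:
    """Label which of two likelihood scores the model rated racier."""
    if score_a == score_b:
        return "equal"
    return "first" if score_a > score_b else "second"
```

Running `racy_likelihood` on a matched pair of images and feeding the results to `more_racy` would reproduce, in miniature, the kind of side-by-side test the investigation describes.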
I published a major investigation in the Guardian with @gianlucahmd showing how images uploaded to social media platforms go through an AI analysis that rates photos of women as more sexually suggestive, or “racy,” than similar images of men. 1/10 https://t.co/OzCa7NTc6l — Hilke Schellmann (@HilkeSchellmann) February 9, 2023
The Guardian's probe drew a range of reactions online. One user commented: "That's normal since 86 percent of engineers are male, while 13 percent are female. So Artificial intelligence will respond to what males perceive more than females."
Another wrote: "AI works on what we feed it. And we always feed it crap. AI perpetuates and exaggerates our biases while giving a false sense of neutrality."