Study reveals AI systems objectify women’s bodies

08/02/2023 | The Guardian

An investigation by The Guardian analysed AI systems developed to protect users by identifying violent or pornographic images and found that these systems have a gender bias. The research identified that many AI tools owned by large technology companies often censor images featuring women's bodies, rating them as more sexually suggestive than comparable pictures of men. The article includes a quiz with five examples of men and women in similar contexts; in each case, Google's and Microsoft's AI rated the images of women as racier than the equivalent photos of men.

Read Full Story

What is this page?

You are reading a summary article on the Privacy Newsfeed, a free resource for DPOs and other professionals with privacy or data protection responsibilities, helping them stay informed of industry news in one place. The information here is a brief snippet relating to a single piece of original content, or to several articles on a common topic or thread. The main contributor is listed in the top left-hand corner, just beneath the article title.

The Privacy Newsfeed monitors over 300 global publications, from which more than 4,350 summary articles have been posted to the online archive dating back to the beginning of 2020. A weekly roundup is available by email every Friday.

Freevacy has been shortlisted in the Best Educator category of the PICCASO Privacy Awards, which recognise the people making an outstanding contribution to this dynamic and fast-growing sector.