Images on the internet amplify gender stereotypes

Images on the internet amplify gender bias, to the detriment of women, with a lasting effect on users who are exposed to them, according to a study published Wednesday in Nature.

Images have long taken precedence over text in the consumption of information on the web, a trend amplified by the use of social networks on smartphones, where platforms such as TikTok, Instagram and Snapchat favor images.

The study, carried out by Douglas Guilbeault, professor of computational sociology at the University of California, Berkeley, shows that this hegemony of the image exacerbates “gender bias, to the detriment of women” by imposing a majority of images of men, with a psychological effect on internet users.

An observation he deems “alarming, because of the potential consequence of reinforcing harmful stereotypes, mainly for women, but also for men”, the researcher told AFP.

Solène Delecourt, co-author of the study and professor at Berkeley, cites the “very simple example of a child seeking to learn more about a profession, who would see only images representing a single gender, and could conclude that it is not for them”.

The problem is all the more serious, according to the study, because the “silent effect” of an image is multiplied by its large-scale distribution on the web, and because an image is “more easily remembered and emotionally evocative” than text.

Psychological effect

The team led by Douglas Guilbeault sifted through more than a million images automatically extracted from Google, Wikipedia and the IMDb film database, along with a corpus of more than one hundred billion words from Google News.

They then measured the association of images and texts with nearly 3,000 social categories, including professions (like doctor or carpenter) and roles (like colleague or neighbor).

First result: a gender bias in favor of men, who are more present than women in both images and texts. Second result: this bias is significantly more pronounced in images than in text.

“The gender bias, for example the assumption that a nurse is a woman, is always stronger in images than in textual descriptions of the same category on the same online platforms,” explains Professor Guilbeault. In other words, “these stereotypes are exaggerated, or stronger, in images than in texts.”

The study shows that the phenomenon is neither confined to the United States (the team also extracted images from foreign internet addresses) nor to any particular platform.

This bias in the images is more pronounced than that detected in an opinion survey carried out by the researchers. Above all, it contradicts the statistics of the U.S. Department of Labor on gender and occupations.

But the most disturbing finding came from measuring the psychological effect of images compared with text.

A lasting effect

Participants searched either an image database or a text database for descriptions of professions linked to science, technology and the arts, such as astronaut, harpist, poet or neurobiologist.

Immediately afterward, participants were given a standard test, the IAT (Implicit Association Test), which measures a person’s implicit biases.

“We found that people who searched for images had a more pronounced gender bias” than those who searched for text, Professor Delecourt told AFP. The effect is lasting: it was still detectable three days later.

“Ultimately, images influence people in ways they are not aware of,” adds Professor Guilbeault. According to him, “we do not pay enough attention to this movement towards image-based communication”.

The study raises questions about the responsibility of content platforms in the propagation of gender bias, and the measures they could take to avoid accentuating it.

This is all the more pressing given the development of image-generation tools based on artificial intelligence, whose algorithms draw their raw material from online content.

With the result that “the images generated by these algorithms reflect all kinds of biases”, remarks Professor Guilbeault.