From platforms like TikTok and YouTube to Google Images and AI art generators, the digital world is increasingly dominated by visual content. In 2023, over 3.2 billion images and 720,000 hours of video were uploaded daily, figures that continue to grow. Douglas Guilbeault, an assistant professor at Stanford Graduate School of Business, highlights the unprecedented scale at which images are now produced and circulated.
These images do more than just change information consumption; they also reflect and reinforce societal perceptions, including gender stereotypes. Guilbeault points out that online visual tools are dynamic, serving images based on user behavior. This creates a feedback loop that can amplify unconscious biases when users interact with stereotypical images.
In a pioneering study, Guilbeault and co-authors explored how gender bias manifests in online images compared to text and its impact on users. They examined whether certain professions or social roles were more associated with a particular gender. “This has to do with what kinds of people occupy our thoughts when we understand the social world,” says Guilbeault.
The researchers analyzed around 350,000 Google Images depicting various jobs and roles while using AI tools to analyze text from Google News for gender associations. Both data sets revealed skewed associations between gender and specific occupations, with women often shown in liberal arts roles and men in science and technology positions.
However, this imbalance was more pronounced in images. For instance, all software developer images featured men, while text descriptions included women. Compared to workforce data, images tended to overrepresent men across various jobs.
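The comparison the researchers make, between the gender skew of search-result images and the actual gender makeup of an occupation, can be illustrated with a minimal sketch. This is not the study's code; the function, the image labels, and the workforce baselines below are all hypothetical, chosen only to show how an "excess male share" per occupation might be computed from classified face labels.

```python
# Illustrative sketch only: comparing the male share of gender-classified
# search-result images against a workforce baseline for each occupation.
# All numbers here are invented for demonstration, not from the study.
from collections import Counter

def male_share(labels):
    """Fraction of 'male' labels among binary labels; 0.5 means balanced."""
    counts = Counter(labels)
    total = counts["male"] + counts["female"]
    return counts["male"] / total if total else 0.5

# Hypothetical face-classification labels for images returned by a search.
image_labels = {
    "software developer": ["male"] * 9 + ["female"] * 1,
    "nurse": ["male"] * 2 + ["female"] * 8,
}

# Hypothetical workforce baselines (fraction male), NOT real statistics.
workforce_male = {"software developer": 0.80, "nurse": 0.12}

for job, labels in image_labels.items():
    share = male_share(labels)
    excess = share - workforce_male[job]  # positive -> men overrepresented
    print(f"{job}: images {share:.2f} male vs workforce "
          f"{workforce_male[job]:.2f} (excess {excess:+.2f})")
```

A positive excess for most occupations would correspond to the pattern the study reports: images overrepresenting men relative to real-world demographics.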
To assess how these associations might influence stereotypes, volunteers searched for either text descriptions or images of 22 occupations on Google News or Images. They then used a slider to indicate which gender they associated most with each occupation. “This surprisingly simple measurement actually predicts a lot about what people think and believe about these categories,” says Guilbeault.
Results showed intensified gender bias among those searching for images rather than text—a bias that persisted even three days later. Participants exhibited stronger associations if the occupation matched their own gender identity.
Guilbeault notes that focusing on binary genders was a methodological choice, and emphasizes that the intent was not to reinforce that convention. In practice, fewer than 2% of participants categorized faces as nonbinary when identifying genders in online images.
The study found that women are underrepresented in online imagery and points to algorithmic factors as a likely contributor: search engines like Google Images may surface male-oriented content more often because they are trained on, and respond to, implicit user biases reflected across the internet.
Even in professions that are in fact male-dominated, such as medicine, online imagery exaggerated the imbalance relative to real-world figures. And there is an inherent challenge: language offers gender-neutral alternatives, but that strategy doesn't translate to visuals, where a picture of a person almost unavoidably conveys a gender.
Images have a profound impact because they are easier to process and more emotionally resonant than text, a phenomenon known as the picture-superiority effect. As society shifts toward visual communication, Guilbeault warns, stereotypes may be amplified unless interventions promote different norms: "The medium is the message."
This article was originally published by Stanford Graduate School of Business.
©Copyright Stanford University