

Sexism in genAI recommendation letters

New research finds that ChatGPT and other generative AI models write sexist recommendation letters.



➤ Yes, more bias from generative AI!


It is not surprising, but it is important to highlight, because eliminating these biases can be extremely difficult.



➤ As described in a Scientific American article about this paper:



😫 “We observed significant gender biases in the recommendation letters,” says paper co-author Yixin Wan, a computer scientist at the University of California, Los Angeles.



😫 While ChatGPT deployed nouns such as “expert” and “integrity” for men, it was more likely to call women a “beauty” or “delight.”



😫 Alpaca [another genAI] had similar problems: men were “listeners” and “thinkers,” while women had “grace” and “beauty.”



😫 Adjectives proved similarly polarized. Men were “respectful,” “reputable” and “authentic,” according to ChatGPT, while women were “stunning,” “warm” and “emotional.”



➤ If you use generative AI to assist in writing important documents such as letters of recommendation –


🚨 Don’t forget to look for and resolve the bias!


➤ Join the discussion about this paper in my LinkedIn post.


FOR UPDATES

Join my newsletter for tech ethics resources.

I will never use your email for anything else.
