

AI enforcement case study: EEOC on hiring discrimination

Progress in US AI regulation: the EEOC, a US federal agency, demonstrates how companies can be held accountable for AI discrimination.


➤ Background:


👉Workday makes and sells AI-enabled hiring tools.


👉Derek Mobley is suing Workday for hiring discrimination. He argues that he was turned down for more than 100 jobs that used Workday's platform because of his race, age, and disability status.


👉Workday filed a motion to dismiss. They argue they are not liable because they are not the employer; they just develop the tools.



➤ The new development:


👉 The EEOC helped Mobley’s case by urging the judge to deny Workday’s motion, arguing that Workday's tools function as an employment agency.


👉 The EEOC is the Equal Employment Opportunity Commission, a US federal agency.




➤ Importance 1 - End the finger-pointing


📢There’s a finger-pointing game about responsibility for AI.


📢The companies that develop AI tools shrug off responsibility because they are not the users and don’t control the outcomes.


📢The users shrug off responsibility because they are not the developers and don’t control how the tools work.


📢The EEOC is taking a stand that companies that develop AI do carry responsibility when their AI breaks the law.



➤ Importance 2 - General laws apply to AI


📢 There is much focus on AI-specific laws, such as the EU AI Act.


📢 But enforcing non-AI-specific laws, such as non-discrimination laws, is just as important.


📢Mobley’s case, with the EEOC's help, demonstrates how non-discrimination laws can be enforced against AI.



➤ Bottom line:


Companies that sell AI-enabled hiring tools can be held responsible for hiring discrimination that results from employers' use of those tools.



➤ I'd love to hear other people's take on this development! Join the conversation in the LinkedIn thread.


