Full Stack Recruiter Newsletter

AI Bias on Trial: The Mobley v. Workday Hiring Lawsuit

Explore how the Mobley v. Workday lawsuit highlights the risks of AI bias in recruitment and its major implications for recruiters, vendors, and companies.

Jan Tegze
May 25, 2025

If you're a recruiter using any kind of AI-powered tool — whether it's resume screening, skills matching, or automated assessments — the lawsuit Derek Mobley v. Workday, Inc. should have your full attention.

This isn’t just another legal battle. It’s the first major lawsuit to challenge how artificial intelligence is used in hiring, and it’s already making waves. At the center is Derek Mobley, a Black man over 40 who also lives with anxiety and depression. He says he was automatically rejected from over 100 jobs by companies using Workday’s recruiting software — without even getting an interview.

Mobley’s case argues that Workday’s AI tools screen people out unfairly based on age, race, and disability. And a federal judge recently agreed that his age discrimination claims were serious enough to be considered as a nationwide class action. That’s huge.

But what really makes this case different — and frankly, urgent for all of us in talent acquisition — is that it doesn’t just target an employer. It targets the vendor: Workday itself.

If the courts decide that vendors like Workday can be held responsible for discriminatory outcomes their AI tools produce, it could change the way we all evaluate, buy, and use recruiting tech. It might even redefine what due diligence looks like in our jobs.

What Is Derek Mobley v. Workday, Inc. Really About?

At its core, Derek Mobley v. Workday, Inc. is about one thing: whether AI tools used in hiring can discriminate — and if the companies that build those tools can be held accountable.

Derek Mobley, the plaintiff, is a qualified professional who says he applied for more than 100 jobs through companies using Workday’s AI-driven recruiting software. Not once did he get an interview. In some cases, he says, he was rejected within minutes — even in the middle of the night. That kind of speed doesn’t exactly suggest thoughtful human review.

Mobley is arguing that he was filtered out automatically because of who he is. His claim? That Workday’s software unfairly screens out people like him — people over 40, people with disabilities like anxiety or depression, and Black applicants.

The lawsuit makes a strong case that AI isn’t as neutral as it might seem. It points out how algorithms can learn bias from the data they’re trained on.

For example:

  • If you list a graduation year that hints you’re older, that could work against you.

  • If you went to an HBCU, the system might infer your race.

  • If you take a personality test and answer honestly about your mental health, you might be flagged as a “poor fit.”

Mobley’s legal team isn’t just saying that these things might happen. They’re saying they did — and that Workday’s tools played a key role.

What makes this case even more important is who it's aimed at. Usually, discrimination lawsuits target employers. But here, Mobley is going after the tech company — saying Workday acted like an “agent” of the employers by screening out candidates before hiring managers even got involved.

That opens up a big legal question: if a vendor builds and sells the tech that filters people out, can they be held responsible if that tech causes discrimination?

That’s exactly what the court is starting to explore — and that’s why this case is a potential game changer.




Workday’s Defense and What It Tells Us About AI in Hiring

Workday isn’t taking these allegations lightly — and their defense comes down to one main message: “We don’t make hiring decisions. Our customers do.”

Workday’s argument is that they’re just a software provider. They build tools, and it’s up to each employer to decide how they use them. According to Workday, the hiring choices aren’t made by their AI — they’re made by the people using their platform.

They’ve also emphasized that their tools are configurable. Companies can choose which features to use, how they’re applied, and whether they want AI recommendations at all. In Workday’s view, this flexibility means they shouldn’t be held liable for how any single employer configures their system.

Another key part of their defense is the idea of compliance. Workday says they conduct ongoing “risk-based reviews” of their products to make sure they meet legal standards. They also argue that they don’t screen or reject applicants directly — they just provide rankings or scores to the employer, who then decides what to do.

But here’s where things start to get tricky.

The court has already pushed back on some of these claims. For example, Mobley’s lawsuit includes details showing that candidates were rejected within minutes — sometimes even before a human could reasonably review the application. The judge also noted that Workday’s own website materials describe features that make applicant “recommendations,” which undermines the idea that the platform plays no role in the actual decision.

And there’s another angle here that recruiters need to watch: Workday claimed that the class action shouldn’t move forward because the potential group of affected people was just too large — maybe even in the hundreds of millions. But the judge flipped that point around, saying: “Allegedly widespread discrimination is not a reason to deny notice.”

That’s a sharp reminder. The more widely used a tool is, the bigger the responsibility becomes. And Workday is used by thousands of employers.

So even if you're not using Workday specifically, their defense strategy — and how it’s being challenged — gives you a preview of the new accountability questions that are coming for AI tools in hiring.

This isn’t just about software anymore. It’s about outcomes — and who gets left behind when AI makes the first cut.

Case Milestones: A Timeline

This case reached a whole new level on May 16, 2025. That’s when a federal judge ruled that Mobley’s age discrimination claim could proceed as a nationwide class action. That’s a big deal.

Why? Because now it’s not just about one person. The court certified a group of people — anyone age 40 or older who applied for jobs through Workday’s platform and was rejected from September 2020 onward.

This shifts the lawsuit from a personal complaint to a collective legal challenge involving potentially thousands, maybe even millions, of job seekers. And it raises the stakes dramatically — not just for Workday, but for any company using similar tech.


Here’s what the judge focused on:

  • The central issue is whether Workday’s AI tools have a “disparate impact” on older applicants.

  • Applicants don’t have to prove they were the most qualified. What matters is whether the system itself filtered them out unfairly before they ever had a real chance.

Disparate impact means that even if you didn’t intend to discriminate, your process or tool still had that effect on a protected group. In other words, it’s not about what you meant; it’s about what actually happened.
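If you want to see what "disparate impact" looks like in practice, the most common screen is the EEOC's "four-fifths rule": if a protected group's selection rate falls below 80% of the highest group's rate, that's a red flag for adverse impact. Here's a minimal sketch with entirely made-up numbers (the applicant counts are hypothetical, not from the case):

```python
def selection_rate(selected, applicants):
    """Fraction of applicants from a group who advanced."""
    return selected / applicants

def four_fifths_check(rates):
    """Compare the lowest group's rate to the highest group's rate.

    Returns (impact_ratio, flagged): flagged is True when the ratio
    falls below 0.8, the EEOC four-fifths threshold.
    """
    highest = max(rates.values())
    lowest = min(rates.values())
    ratio = lowest / highest
    return ratio, ratio < 0.8

# Hypothetical pool: 1,000 applicants under 40 and 1,000 aged 40+.
rates = {
    "under_40": selection_rate(selected=200, applicants=1000),  # 20%
    "40_plus": selection_rate(selected=100, applicants=1000),   # 10%
}

ratio, flagged = four_fifths_check(rates)
print(f"impact ratio: {ratio:.2f}, flagged: {flagged}")
# impact ratio: 0.50, flagged: True
```

The point: nobody in this hypothetical typed "reject older applicants" anywhere. The numbers alone are enough to trigger scrutiny.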

This ruling also rejected Workday’s argument that the class was too big or too varied. The judge said that a large class doesn’t make the problem go away — it actually highlights the scale of potential harm.

Another important point: the court emphasized that automated rejection alone can be discrimination — even if no human ever made a biased choice. If the algorithm creates an unfair barrier for older candidates, that’s enough to support a claim.

So what does this mean for recruiters?

It means that even early-stage filtering tools — resume scoring, skills matching, knockout questions, automatic ranking — can trigger legal exposure if they screen people out in a way that disproportionately affects a protected group.

And now that a court has said this type of claim can move forward on a large scale, other lawsuits could follow. This ruling could become the legal blueprint for future AI discrimination cases.

If you’re in talent acquisition, this is a loud signal. AI tools that shape who gets seen — and who doesn’t — are now front and center in court. That makes transparency, fairness, and accountability not just nice-to-have, but mission-critical.

The Tools Under Fire: What Workday’s AI Allegedly Did

Let’s talk specifics. What exactly did Workday’s tools do that raised red flags?

Mobley’s lawsuit points to several features in Workday’s recruiting software that, he argues, contributed to bias. And if you’re using similar tools — even from other vendors — this is where things get very real.

Candidate Skills Match

This tool scans job descriptions and compares them to the skills listed on a candidate’s application. It then scores the match using labels like strong, good, fair, or low. Sounds helpful, right?

The concern is how those scores are calculated. If the algorithm has been trained on past hiring data that favored certain types of candidates — say, those with recent graduation dates or specific tech backgrounds — older applicants might score lower even if they’re fully qualified.

Workday Assessment Connector

This one uses machine learning to adjust its recommendations based on how employers behave. If hiring teams consistently prefer certain profiles, the AI learns from that — and starts recommending similar candidates.

That can quickly spiral. Let’s say a hiring manager tends to favor younger applicants or graduates from certain schools. The AI notices this pattern and starts boosting those traits in future rankings — even if those traits don’t actually predict job success.

This creates a feedback loop that reinforces bias. Over time, the AI may quietly deprioritize candidates who don’t fit that learned “ideal,” including older workers, people with disabilities, or those from underrepresented backgrounds.
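To make the feedback loop concrete, here's a deliberately tiny sketch, not Workday's actual system: a ranker that "learns" only by counting which traits past hires shared. All names and data are hypothetical. Notice that once historical hiring skews toward recent graduates, the ranker reproduces that skew for equally qualified candidates:

```python
from collections import Counter

def learn_preference(past_hires, trait):
    """Count how often each value of `trait` appears among past hires."""
    return Counter(hire[trait] for hire in past_hires)

def rank(candidates, prefs, trait):
    """Order candidates by how often their trait value matched past hires."""
    return sorted(candidates, key=lambda c: prefs[c[trait]], reverse=True)

# Historical hiring skewed 8-to-2 toward recent graduates...
past_hires = [{"grad_decade": "2010s"}] * 8 + [{"grad_decade": "1990s"}] * 2
prefs = learn_preference(past_hires, "grad_decade")

# ...so an equally qualified 1990s graduate now ranks below a 2010s one,
# even though graduation decade says nothing about ability to do the job.
candidates = [
    {"name": "A", "grad_decade": "1990s"},
    {"name": "B", "grad_decade": "2010s"},
]
print([c["name"] for c in rank(candidates, prefs, "grad_decade")])  # ['B', 'A']
```

Every cycle of "hire from the top of the ranking, then retrain on those hires" widens the gap. That's the spiral described above.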

Personality and Cognitive Tests

Mobley also raised concerns about assessments that evaluate personality or cognitive traits. These tests may seem neutral, but if the questions or scoring mechanisms disadvantage people with anxiety, depression, or other mental health conditions, they can violate disability protections.

Hidden Signals and Inferred Data

What all of these tools have in common is this: they don’t need to directly ask about age, race, or disability to treat people differently. They can infer those things through proxy data:

  • Graduation dates (age)

  • School attended (race)

  • Gaps in work history or test responses (disability)

And because these algorithms operate in the background, candidates often have no idea why they were rejected — or that AI even played a role.
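A quick illustration of how proxy inference works, using hypothetical fields and a made-up school list (this is a sketch of the concept, not any vendor's logic). None of these fields asks about age, race, or disability, yet each can stand in for one:

```python
# Tiny illustrative set; a real list would be much longer.
HBCUS = {"Howard University", "Morehouse College"}

def inferred_signals(application, current_year=2025):
    """Derive proxy signals a model could pick up without ever
    asking about a protected characteristic directly."""
    signals = {}
    if "graduation_year" in application:
        # Age proxy: assume graduation around age 22.
        signals["approx_age"] = current_year - application["graduation_year"] + 22
    if application.get("school") in HBCUS:
        # School attended can correlate with race.
        signals["likely_race_signal"] = True
    if application.get("employment_gap_months", 0) > 12:
        # Long gaps can correlate with disability or caregiving.
        signals["gap_flag"] = True
    return signals

app = {"graduation_year": 1998, "school": "Howard University",
       "employment_gap_months": 18}
print(inferred_signals(app))
# {'approx_age': 49, 'likely_race_signal': True, 'gap_flag': True}
```

Nothing here is explicitly "age" or "race" in the input, which is exactly why this kind of bias is so hard for candidates, and even recruiters, to spot.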

This is the heart of Mobley’s claim: that the system created automatic rejection at scale. He’s saying the AI was the gatekeeper, and it filtered him out before a human ever read his resume.

For recruiters, this is a critical moment of reflection.

  • Do you know how your AI tools are ranking and scoring people?

  • Are you sure those rankings are based on job-relevant factors — and not on patterns that reflect past bias?

  • And if the AI suggests someone isn’t a “fit,” do you challenge it — or just move on to the next resume?

Understanding what your tools are doing under the hood isn’t just a technical concern anymore. It’s an ethical and legal one.


Are Vendors Now Liable? A Legal Shift Recruiters Can’t Ignore

One of the biggest reasons the Mobley v. Workday case matters so much is because it’s not just aimed at the companies doing the hiring — it’s aimed at the company that built the hiring tools.

Mobley’s legal team is arguing that Workday isn’t just a software vendor. They say it acted like an agent of the employers who used its platform. Why? Because its AI tools had such a strong influence on who made it past the first step — and who didn’t.

This question of whether vendors can be held legally responsible is huge. Traditionally, employers are the ones liable for discrimination. But this case is challenging that by asking:

If an AI system screens someone out unfairly — and it was designed, maintained, and marketed by a vendor — shouldn’t that vendor share some responsibility?

The judge in this case didn’t dismiss that idea. In fact, the court said that Mobley “plausibly alleged” that Workday acts as an agent when employers rely on its AI to handle key parts of the hiring process, like screening or ranking candidates.

This is where things start to shift for the whole industry. Because if vendors like Workday can be held accountable, every company that builds or sells screening technology will have to answer for the outcomes its tools produce, not just the features it ships.

And it doesn’t stop there.

The Equal Employment Opportunity Commission (EEOC) also jumped into the conversation, supporting Mobley’s case. They argued that companies like Workday might qualify as “employment agencies,” because their tools screen, score, and influence which applicants move forward — kind of like a modern, automated recruiter.

Workday, of course, disagrees. They say they don’t procure employees for clients, so they shouldn’t be treated as an employment agency under the law. For now, the court has rejected that specific label — but left the door open on others, like agent or indirect employer.

So what does this mean for recruiters?

It means that the tools we rely on are under serious legal and ethical scrutiny — and that scrutiny could change how vendors are regulated and how we evaluate their products.

We’re moving into a world where “we just use what the vendor gave us” won’t be a good enough answer. If your AI tool is causing harm, someone will be held responsible. And now, it might be both the builder and the buyer.

So… Will Derek Mobley Win This Case?

That’s the big question — and if we’re being realistic, probably not in the way many headlines suggest.

Mobley’s lawsuit raises urgent and important issues. But when it comes to proving personal harm, he may face an uphill battle. Part of his claim is that he was rejected from jobs going back as early as 2017 — a time when Workday hadn’t yet rolled out the kind of AI hiring features that are now in the spotlight. In fact, most of Workday’s major AI capabilities didn’t take shape until 2024 and beyond.


To put it in context, ChatGPT didn’t even launch to the public until late 2022. The real surge of generative AI and machine learning in HR tech is still very recent. That time gap could weaken Mobley’s personal claims, especially if he’s trying to tie his rejections to features that didn’t exist when he applied.

Still, that doesn’t mean the case doesn’t matter — far from it.

What Mobley v. Workday is really doing is testing the legal boundaries of AI in hiring. The court’s willingness to grant class action status and explore whether AI tools can create a disparate impact is the real story here.

Even if Mobley’s individual case doesn’t succeed, the industry-wide impact could be massive. If the courts decide that things like knockout questions, ranking algorithms, or scoring systems in applicant tracking software create unfair barriers — that opens the door for future lawsuits. Especially if candidates can show that they were filtered out by automated processes without a fair shot.

For recruiters, this means one thing: Whether or not Mobley wins, this case has already changed the conversation.

Bias in hiring isn’t just a human issue anymore. It’s a systems issue. And it’s now your responsibility to understand how your tech works — and whether it’s working fairly for everyone.

Because AI might be the future of recruiting — but fairness is the future of accountability.


Do you know who could benefit from this? Share this article with them.



Building/Implementing AI the Right Way: Best Practices for Recruiters

If you’re using or evaluating AI tools in your hiring process, this part is for you. The Mobley v. Workday case has made one thing clear — using AI in recruiting isn't just about speed and scale anymore. It’s about responsibility.

So let’s talk about what doing it right actually looks like:

This post is for paid subscribers

© 2025 Jan Tegze