Will AI Hiring Discrimination Become The Next Major Wave Of Employment Litigation?
When AI is not as unbiased as it may seem
Posted on 03-03-2025, Read Time: 6 Min
Highlights:
- The Mobley v. Workday ruling establishes that AI hiring tools can legally act as an employer’s agent, making businesses liable for discriminatory screening decisions.
- AI-driven hiring systems inherit biases from historical data, raising the risk of discrimination lawsuits and increased legal scrutiny from regulatory bodies.
- HR leaders must conduct bias audits, scrutinize vendor contracts, and implement human oversight in AI-driven decisions.

A court ruling last summer has put AI-driven hiring tools under the legal microscope, raising concerns that employers and vendors could soon face a surge of discrimination lawsuits. In Mobley v. Workday, Inc., the Northern District of California held that Workday could be held liable for hiring discrimination carried out through its AI-powered screening tool—despite the company acting as a third-party provider rather than an employer. This decision serves as a wake-up call for businesses relying on AI to make hiring decisions, underscoring the urgent need to assess legal exposure, algorithmic fairness, and compliance strategies.
The Legal Framework: AI as an Employer’s Agent
At the heart of the Mobley decision is the principle of agency. The court ruled that when employers delegate traditional hiring functions—such as applicant screening and rejection—to an AI-powered tool, that tool effectively acts as an agent of the employer. As such, employers and AI vendors alike may be held accountable under federal anti-discrimination laws, just as if a human hiring manager had made the decision. This interpretation significantly broadens the potential liability for both employers utilizing AI tools and the technology providers developing them.

The ruling emphasized that "Workday’s role in the hiring process [was] no less significant because it allegedly happen[ed] through artificial intelligence rather than a live human being." In other words, whether hiring decisions are made by humans or algorithms, the legal scrutiny remains the same. Courts will focus on the function delegated to AI—not the method—when determining liability.
AI Bias: A Well-Documented Concern
AI hiring tools are only as fair as the data on which they are trained. Studies have shown that AI models can inherit biases present in historical hiring data, potentially leading to discriminatory outcomes. For example, past hiring patterns that reflect gender, racial, or age-based disparities may be inadvertently codified into AI models, perpetuating discrimination under the guise of objectivity. In high-profile cases, AI recruitment tools have been found to systematically disadvantage certain groups, underscoring the risk of algorithmic bias.

Despite efforts to refine AI models and introduce bias-mitigation techniques, these tools remain vulnerable to unintended discrimination. The Mobley case signals that plaintiffs’ attorneys are increasingly scrutinizing AI’s impact on employment decisions, opening the door for a new wave of litigation.
Regulatory Uncertainty: A Moving Target for Employers
At the state and local levels, jurisdictions such as New York City have taken proactive steps to regulate AI in hiring. New York City’s Local Law 144, effective since July 2023, imposes stringent requirements on employers using automated employment decision tools (AEDTs). Key provisions include:
- Mandatory annual bias audits conducted by independent evaluators.
- Public disclosure of audit results and AI tool usage.
- Advance notice to job applicants subjected to AI-driven screening.
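The bias audits Local Law 144 requires center on comparing selection rates across demographic categories and reporting an impact ratio for each group. As a rough illustration only (not legal guidance), the core arithmetic can be sketched as below; the group names and counts are hypothetical, and the 0.8 flag reflects the EEOC's separate "four-fifths" rule of thumb rather than any threshold set by Local Law 144 itself, which requires reporting the ratios rather than passing a fixed cutoff.

```python
# Sketch of the impact-ratio arithmetic behind a Local Law 144-style bias
# audit. All categories and counts below are hypothetical illustrations.

def impact_ratios(selected, applied):
    """Selection rate per group = selected / applied; the impact ratio
    divides each group's rate by the highest group's rate."""
    rates = {g: selected[g] / applied[g] for g in applied}
    top = max(rates.values())
    return {g: rate / top for g, rate in rates.items()}

# Hypothetical screening outcomes by demographic category
applied = {"group_a": 400, "group_b": 300}
selected = {"group_a": 120, "group_b": 60}

for group, ratio in impact_ratios(selected, applied).items():
    # 0.8 is the EEOC "four-fifths" rule of thumb, not an LL144 threshold
    flag = "review" if ratio < 0.8 else "ok"
    print(f"{group}: impact ratio {ratio:.2f} ({flag})")
```

In this hypothetical, group_a's selection rate is 0.30 and group_b's is 0.20, giving group_b an impact ratio of about 0.67 and flagging it for closer review.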
At the federal level, recent developments further complicate the regulatory landscape. President Biden’s 2023 Executive Order on AI directed federal agencies to ensure the responsible development of AI, particularly in hiring practices. However, the current administration has rolled back key elements of that initiative, contributing to growing uncertainty around compliance expectations. Additionally, the Equal Employment Opportunity Commission (EEOC) and the Department of Labor have rescinded prior guidance on AI discrimination, leaving human resources (HR) professionals with fewer official resources for navigating these complex issues.
Legal Developments and Growing Liability Risks
In August 2023, the EEOC settled its first lawsuit involving alleged discriminatory AI hiring practices in EEOC v. iTutorGroup, Inc., where an AI system automatically rejected older job applicants in violation of the Age Discrimination in Employment Act. The $365,000 settlement underscored the EEOC’s willingness to pursue AI-related bias claims aggressively.

Similarly, in November 2024, a $2.2 million settlement was reached in a class-action lawsuit against SafeRent Solutions, where an AI-based tenant screening tool allegedly discriminated based on race and income. While this case focused on housing, it demonstrates the expanding legal scrutiny of AI decision-making across industries and provides insight into how courts may approach AI hiring discrimination cases in the future.
How HR Professionals Can Mitigate Risk
In light of the Mobley ruling and increasing legal scrutiny, HR professionals should adopt a proactive approach to AI governance. Here are key steps to future-proof hiring practices and minimize legal exposure:
1. Conduct Regular AI Audits: Routine bias audits and impact assessments are essential to ensure that AI hiring tools do not inadvertently disadvantage protected classes. Organizations should partner with independent auditors to assess algorithmic fairness and transparency.
2. Scrutinize Vendor Agreements: Employers utilizing third-party AI hiring solutions should carefully review contracts and terms of service with vendors. Legal counsel should assess whether agreements include indemnification clauses or liability-shifting provisions that could protect against potential discrimination claims.
3. Enhance AI Literacy in HR Teams: HR professionals must develop a strong understanding of AI hiring tools, including how algorithms are trained, what data sources are used, and what measures are in place to mitigate bias. Engaging directly with AI developers and asking critical questions about algorithmic transparency can help identify and address potential risks.
4. Stay Ahead of Regulatory Changes: With AI governance laws rapidly evolving, HR leaders must stay informed about new compliance requirements at the federal, state, and local levels. Establishing an AI compliance task force or partnering with legal experts can help organizations navigate shifting regulatory landscapes.
5. Adopt a Human-in-the-Loop Approach: While AI can enhance hiring efficiency, it should not replace human oversight. Employers should implement processes where human decision-makers review AI-generated hiring recommendations, ensuring that algorithmic outcomes align with organizational diversity and inclusion goals.
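The human-in-the-loop approach in step 5 can be sketched as a simple gate: the AI tool never issues a final rejection on its own, and every candidate is routed to a human reviewer, either for confirmation or for mandatory review. This is a minimal illustration; the class names, fields, and threshold are all hypothetical, and a real system would integrate with an applicant tracking system.

```python
# Minimal human-in-the-loop gating sketch. The dataclass fields and the
# threshold are hypothetical; no candidate is ever auto-rejected.

from dataclasses import dataclass, field

@dataclass
class Candidate:
    name: str
    ai_score: float  # hypothetical 0-1 score from a screening model

@dataclass
class ReviewQueue:
    advance: list = field(default_factory=list)       # human confirms advancement
    human_review: list = field(default_factory=list)  # mandatory human review

def route(candidate: Candidate, threshold: float, queue: ReviewQueue) -> str:
    """Route every candidate to a human: high scorers for confirmation,
    low scorers for mandatory review instead of automatic rejection."""
    if candidate.ai_score >= threshold:
        queue.advance.append(candidate)
        return "advance"
    queue.human_review.append(candidate)
    return "human_review"
```

The design point is that the algorithm's output is a recommendation feeding a human workflow, not a terminal decision, which preserves the oversight the Mobley ruling makes legally significant.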
The Road Ahead: Balancing Innovation and Compliance
The Mobley case is likely just the beginning of a broader legal reckoning over AI-driven hiring practices. As courts and regulatory bodies grapple with the implications of AI in employment decisions, employers and HR professionals must strike a balance between leveraging technology for efficiency and safeguarding against discrimination risks.

In an era where AI is reshaping the workforce, organizations that proactively address bias, enhance transparency, and maintain compliance with emerging laws will be best positioned to navigate the future of hiring. The legal landscape surrounding AI hiring discrimination is still evolving, but one thing is clear: ignoring these risks is no longer an option. Employers and HR leaders must act now to ensure that AI remains a tool for innovation, not a liability for litigation.
Author Bio
Devon Mills is an Associate at Michelman & Robinson, LLP. He represents employers in single-plaintiff litigation involving discrimination, harassment, retaliation, and wrongful termination claims, as well as wage and hour class and representative actions in state and federal courts, and provides a range of additional employment-related counsel to clients across industries.