Top 10 Employer Takeaways as New Jersey Cracks Down on AI Discrimination
Insights
1.21.25
New Jersey recently put employers on notice: AI-driven bias is illegal discrimination. The state’s January 9 guidance on algorithmic discrimination makes it clear that the New Jersey Law Against Discrimination (LAD) applies to AI-powered decision-making in hiring and beyond. At the same time, the state launched a Civil Rights Innovation Lab to monitor AI compliance, enforce violations, and educate businesses on AI risks. This means that New Jersey employers using AI-driven tools – including those you purchased from third-party vendors – must now proactively ensure these systems don’t create discriminatory outcomes. Here are the top 10 takeaways employers need to know to stay compliant under New Jersey’s new AI enforcement push.
1. AI Bias is Illegal Under New Jersey’s Law Against Discrimination
According to the guidance, the LAD prohibits algorithmic discrimination just as it prohibits traditional forms of discrimination. This applies to employment, housing, public accommodations, credit, and contracting. Employers will have a difficult time arguing that an AI system’s “black box” decision-making shields them from liability. If an AI system produces biased outcomes, the guidance – which you can find here – says the employer will be held responsible.
2. Employers Are Liable for AI Discrimination—Even if They Use Third-Party Vendors
The guidance emphasizes that employers cannot escape liability by outsourcing AI hiring, screening, or evaluation tools to third parties. Nor can they simply shift blame to a vendor if a biased outcome leads to a lawsuit. If an AI tool used by an employer results in disparate treatment or a disparate impact, the guidance says the employer is still legally responsible.
3. You Should Audit AI Systems for Bias Before and During Use
The guidance says that New Jersey regulators expect businesses to evaluate AI tools before deployment and continuously monitor them. As a result, employers should:
- Conduct bias audits before implementing AI-driven hiring or decision-making tools.
- Perform regular assessments to ensure AI tools don’t disproportionately impact protected groups (a simple disparate-impact check is sketched after this list).
- Establish compliance protocols for mitigating AI-driven bias.
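The guidance does not prescribe a particular audit methodology, so what a “bias audit” looks like will vary by tool and vendor. As one illustration, a basic disparate-impact check compares selection rates across demographic groups. The Python sketch below uses hypothetical applicant data and the EEOC’s “four-fifths” rule of thumb as a flag threshold; both are assumptions for illustration, not anything the New Jersey guidance requires.

```python
# Illustrative only: a simple adverse-impact check on AI screening outcomes.
# The 0.80 threshold reflects the EEOC's "four-fifths" rule of thumb;
# New Jersey's guidance does not prescribe a specific methodology.

from collections import defaultdict

def selection_rates(records):
    """Compute the selection rate (selected / applied) for each group."""
    applied = defaultdict(int)
    selected = defaultdict(int)
    for group, was_selected in records:
        applied[group] += 1
        selected[group] += int(was_selected)
    return {g: selected[g] / applied[g] for g in applied}

def adverse_impact_ratios(rates):
    """Compare each group's selection rate to the highest-rate group."""
    benchmark = max(rates.values())
    return {g: rate / benchmark for g, rate in rates.items()}

# Hypothetical screening outcomes: (group, passed_ai_screen)
records = [
    ("group_a", True), ("group_a", True), ("group_a", False), ("group_a", True),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]

rates = selection_rates(records)
for group, ratio in adverse_impact_ratios(rates).items():
    flag = "REVIEW" if ratio < 0.80 else "ok"
    print(f"{group}: selection rate {rates[group]:.2f}, impact ratio {ratio:.2f} -> {flag}")
```

A real audit would go further, adding statistical significance testing, intersectional breakdowns, and validation that the selection criteria are job-related; treat a sketch like this as a conversation starter with your vendor and counsel, not a compliance determination.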
4. Automated Hiring and Promotion Tools Face Heightened Scrutiny
The New Jersey Division on Civil Rights will focus on AI bias in employment decisions, including hiring, promotions, and terminations. Employers using AI-driven resume screening, personality assessments, or video interview analysis should ensure these tools do not unintentionally exclude candidates or rank them unfairly. “We will continue to make sure that people and companies do not use innovative new technologies to discriminate against and exclude our state’s residents,” said the state Attorney General in an announcement accompanying the guidance.
5. Disability & Reasonable Accommodation Compliance Applies to AI
According to the guidance, employers cannot use AI tools that screen out or penalize candidates or employees with disabilities, and reasonable accommodation obligations still apply. Employers should:
- Ensure AI hiring tools account for accessibility needs (e.g., alternative keyboards, speech-to-text tools, etc.).
- Prevent productivity-tracking AI from flagging medical-related breaks as performance issues.
6. The Civil Rights Innovation Lab Will Monitor AI Enforcement
Alongside the guidance, the state announced the creation of New Jersey’s Civil Rights Innovation Lab, a new initiative within the Division on Civil Rights that will:
- Develop AI tools to detect discrimination in hiring, housing, and credit.
- Enhance enforcement of AI-related discrimination complaints.
- Offer compliance training to businesses on AI risk management.
7. Public Transparency Requirements May Expand
While New Jersey hasn’t yet mandated disclosure rules, employers should prepare for possible regulations or statutory changes that would require them to:
- Notify candidates when AI is used in hiring.
- Disclose AI’s role in employment decisions.
- Offer explanations for AI-driven hiring rejections.
8. State Launches Training & Education Programs on AI Bias
The guidance says that New Jersey will offer public education sessions and compliance training on AI-related discrimination. Employers should take advantage of these state-backed resources once they are announced to stay ahead of enforcement trends.
9. The National AI Compliance Landscape is Changing
New Jersey is just one of many states moving to regulate AI discrimination. The Garden State now joins a growing list:
- New York City’s Local Law 144 was the nation’s first law to create obligations for employers when AI is used for employment purposes – including obligatory bias audits.
- Colorado was the first state to pass a law requiring AI bias prevention measures.
- Illinois became the second state to pass AI workplace legislation that will require employers to provide notice to applicants and workers if they use AI for hiring, discipline, discharge, or other workplace-related purposes.
- New York and California recently took steps toward regulating AI use in the workplace.
- Several other states – including Texas and Connecticut – have AI bias legislation teed up for 2025.
10. Employers Should Take Immediate Action
To reduce legal risk, employers should consider taking these steps:
✅ Audit AI tools for potential bias.
✅ Build an AI governance protocol in your workplace. Here are the first 10 steps you should take.
✅ Train your HR teams on AI compliance.
✅ Engage with AI vendors to ensure bias-mitigation protocols are in place. Here are the essential questions you should ask your AI vendor before deploying their systems at your business.
✅ Monitor enforcement trends to anticipate regulatory shifts. Read our Comprehensive Review of AI Workplace Law and Litigation as We Enter 2025 for the full picture.
Conclusion
We’ll continue to monitor developments in this ever-changing area and provide the most up-to-date information directly to your inbox, so make sure you are subscribed to Fisher Phillips’ Insight System. If you have questions, contact your Fisher Phillips attorney, the authors of this Insight, any attorney in our New Jersey office, or any attorney in our AI, Data, and Analytics Practice Group.
Related People
- Sarah Wieselthier, Partner