Staffing Agencies Will Be Using AI in 2024 – Your Overview of the Benefits and the Risks
Insights
12.04.23
Artificial intelligence is already changing the landscape of the workplace – and has the potential to change the very business model used by staffing agencies. What benefits can staffing agencies realize by deploying AI tools? And what are the legal risks? This Insight will not only review the ways that AI can be used to your advantage and the legal challenges that staffing agencies might face, but will also provide some best practices so you can harness the power of AI technology for your organization. And we’ll dive deeper into this topic during the Fisher Phillips PeopleLaw Conference from January 24-26, so make sure you register today.
The Benefits
It is estimated that some 83% of employers, including 99% of Fortune 500 companies, already use some form of automated tool as part of their hiring process. So how can AI take this automation to the next level? By introducing “smart” automated processes that learn as they grow and more closely replicate human behavior. Some specific ways AI can aid your staffing agency:
- Recruiters spend a substantial amount of time searching for candidates and culling through hundreds of resumes. By leveraging AI, recruiters and staffing firms alike can quickly source the best candidates without wasting time performing repetitive, administrative tasks such as reviewing resumes or setting up candidate meetings.
- It takes time to write job descriptions that get the attention of candidates while also adhering to the patchwork of salary disclosure laws and other requirements. Generative AI products can assist with the development of engaging job descriptions – at least solid first drafts – giving you a head start so you can focus on legal compliance.
- AI-fueled “smart” chatbots can answer questions about the job or application process and provide candidates with immediate help at any time.
- When used correctly, generative AI can also reduce instances of bias. For example, it can help identify masculine-leaning terms in a job description that may dissuade women from applying (e.g., “driven,” “objective,” “determined”). These objective checks can help staffing agencies advance their diversity hiring goals.
- AI can also use predictive analytics to analyze candidate data, resumes, social media, online behavior, and other data sources to predict which candidates are most likely to be successful in the role.
- The technology can also broaden the talent pool by identifying potential candidates who may not have applied to the job or who didn’t use the “right” keywords in their resumes but have the requisite skills and qualifications – and even make first contact with hidden gems.
Overall, AI technology can enable recruiters to focus on the more human aspects of recruiting. Though it is important for recruiters to maintain a human touch and oversee the work product – as robots are nowhere close to replacing human judgment – the efficiency gains created by AI allow recruiters to better focus their efforts and ultimately increase overall job placement numbers.
The Risks
The advancement of technology and the associated benefits are not without potential risk. The use of AI to assist with employment-related tasks is under a microscope thanks to some high-profile missteps.
- As early as 2022, the Equal Employment Opportunity Commission (EEOC) ramped up its enforcement of federal anti-discrimination laws by targeting staffing firms for alleged workplace bias, filing no fewer than 10 lawsuits against staffing agencies alleging hiring discrimination in FY2022. This followed the agency’s warning that using AI technology to make staffing decisions can unintentionally lead to discriminatory practices.
- The EEOC stepped up its efforts in 2023, launching a broad initiative to ensure AI workplace tools comply with anti-discrimination laws and releasing technical assistance warning employers that deploy AI that it will apply long-standing legal principles to this evolving technology when assessing possible Title VII violations.
- In August, the EEOC settled its first AI discrimination lawsuit, in which an AI-powered hiring selection tool automatically rejected female applicants over 55 and male applicants over 60. This case is not expected to be an outlier; we anticipate a significant increase in AI-related legal actions from the EEOC and plaintiffs’ attorneys alike.
This lawsuit illustrates the risk that AI systems can inadvertently be biased, either because of bias in the underlying data or in how the algorithm processes that data. This can result in candidates with disabilities, foreign-born candidates, and others being unintentionally screened out in discriminatory ways. For example, a chatbot can inadvertently collect information about a candidate’s disability, which could lead to discrimination claims.
To avoid such risk, it is essential to ensure that the use of AI supplements, rather than replaces, the human aspect of hiring. It is key that you retain a healthy dose of human judgment in workplace decision-making. If not audited and tested regularly, AI can introduce bias and expose employers to discrimination claims. For this reason, you should regularly conduct bias audits to help root out any unintentional discrimination in the workplace or hiring process.
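While this Insight does not prescribe a particular audit methodology, one common starting point is the EEOC’s long-standing “four-fifths rule,” which compares each group’s selection rate to the rate of the most-selected group. The sketch below is a minimal, hypothetical illustration of that calculation only; the function name and figures are made up for illustration, and a real bias audit should be scoped and interpreted with legal counsel and qualified analysts.

```python
# Minimal sketch of a "four-fifths rule" adverse-impact check.
# The numbers below are hypothetical; a real bias audit would use the
# AI tool's actual selection data and be designed with legal counsel.

def impact_ratios(selections: dict[str, tuple[int, int]]) -> dict[str, float]:
    """For each group, return selection rate divided by the highest group's rate.

    `selections` maps group name -> (number selected, number of applicants).
    """
    rates = {group: selected / applicants
             for group, (selected, applicants) in selections.items()}
    top_rate = max(rates.values())
    return {group: rate / top_rate for group, rate in rates.items()}

if __name__ == "__main__":
    # Hypothetical screening outcomes from an AI resume-ranking tool.
    data = {
        "Group A": (48, 100),   # 48% selection rate
        "Group B": (30, 100),   # 30% selection rate
    }
    for group, ratio in impact_ratios(data).items():
        flag = "  <-- below 0.80, warrants review" if ratio < 0.80 else ""
        print(f"{group}: impact ratio {ratio:.2f}{flag}")
```

An impact ratio below roughly 0.80 is the traditional red flag under the EEOC’s Uniform Guidelines, but it is a screening heuristic, not a legal conclusion: sample size, job-relatedness, and business necessity all matter, which is why audit results should be reviewed with counsel.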
So what does the future hold? Congress is considering federal legislation that would regulate the use of AI in the hiring process, including by requiring employers to perform an impact assessment on any automated decision-making system. And while one Congressman who spoke at a recent FP AI Conference doesn’t believe that federal legislation is on the horizon anytime soon, you won’t necessarily be off the hook from complying with AI-related regulation. Last year, New York City became the first jurisdiction to require employers using AI in the employment context to conduct bias audits. It surely won’t be the last municipality (or state) to adopt similar requirements, and we expect more in 2024.
And, as noted above, the EEOC has announced that it intends to increase oversight and scrutiny of AI tools used to screen and hire workers, using existing laws to ensure compliance. Your AI practices will be under review in the new year and beyond.
Learn More!
We invite you to join us at the Terranea Resort in Rancho Palos Verdes, California, for the annual Fisher Phillips PeopleLaw Conference on January 24-26, where we’ll discuss these – and many other topics – in depth. We’ll bring together thought leaders in the PEO, staffing, and gig economy industries to discuss common legal challenges and solutions, and you’ll hear from our firm’s PEO and Staffing Group lawyers, along with industry association executives, on the latest trends in a series of engaging and interactive sessions. Each session will provide you with practical skills you can put to use right away. You’ll also have plenty of time to network with your peers, gain invaluable insights, and learn from each other. You can learn more about the conference and register here.
The Takeaway
AI is poised to change the entire workplace, and, as illustrated above, the impact on staffing agencies will be significant. While it has the potential to vastly increase overall productivity, it also carries risk.
That said, given the complexities of AI and its intersection with workplace law, you should partner with legal counsel who understands the many issues that need to be considered – data privacy, confidentiality, trade secrets, bias audits, copyright law, labor law, and overall best practices, just to name a few. Whether you educate existing in-house counsel, recruit new talent, or retain outside lawyers with a focus on AI, you should take steps to have experienced counsel at your side.
If you have questions about the best ways to maximize the value of AI in your workplace while reducing legal, ethical, and reputational risks, contact your Fisher Phillips attorney, the authors of this Insight, any attorney on our Artificial Intelligence Practice Group, or any attorney on our PEO and Staffing Team. We will continue to monitor further developments and provide updates on this and other workplace law issues, so make sure you are subscribed to Fisher Phillips’ Insight System to gather the most up-to-date information.
Related People
- Benjamin M. Ebbink, Partner
- Emily N. Litzinger, Partner
- Erica G. Wilson, Associate