A lawsuit against Workday could make history for artificial intelligence in HR

This is a trial that could make history in human resource management software (HRIS/HCM). It is taking place in the United States, where a candidate applied, in vain, to a long series of job postings. He accuses the artificial intelligence built into Workday's recruiting solutions, which handled most of the ads he responded to, of being "biased."

These accusations of discrimination and "hiring bias" (for that is indeed what is alleged) will have ramifications beyond the United States, in that the verdict could establish, for the first time, the respective liability of each party (software vendor, client company) in the handling of applications when AI is used in the combustible field of HR.

Software accused of bias and of unlawful inquiries

But let's start from the beginning. In 2023, a man named Derek Mobley filed a lawsuit against Workday in U.S. District Court in Oakland, California.

He says he applied to roughly 100 job postings from various employers that, according to him, use Workday's recruiting software.

Although he holds degrees in finance and network systems management, his applications were systematically rejected, which he attributes to bias in the software.

Mr. Mobley, described in the filing as an African-American man in his 40s who suffers from anxiety and depression, therefore argues that Workday's software is discriminatory.

For many positions, he was required to take "an assessment and/or a personality test [administered via software] of the Workday brand." The lawsuit alleges that these tests "are unlawful disability-related inquiries," designed to detect mental disorders that have no bearing on a person's ability to succeed at work.

Workday, indirect employer and HR gatekeeper?

In January, Judge Rita Lin dismissed the complaint against Workday.

However, she also gave the plaintiff leave to file an amended complaint to assert other legal theories raised at the hearing and in the briefs. Seizing this opportunity, Mr. Mobley's attorneys broadened their case: while the original complaint ran 16 pages, the amended version presents a more detailed 37-page argument.

Among the new legal arguments raised by Mr. Mobley's lawyers is the claim that Workday acts as an indirect employer.

Its recruiting tools "discriminatorily interfere" with hiring, a claim that also suggests Workday's dominance in HR makes it act, in effect, as a "gatekeeper" (an access controller) in HR, much as Google or Facebook do on the internet.

Workday defends itself against a "baseless" suit

In a new motion to dismiss filed this month, Workday rejects this "gatekeeper" characterization and flatly denies the allegations.

According to the company, it has no control over its clients' day-to-day actions and cannot force them to make decisions, whether in the hiring process or in any other HR process.

Bottom line: it is the clients who configure and use the software to screen candidates.

In a statement sent to the US editorial team of TechTarget (owner of MagIT), Workday also asserted that the legal action appears "baseless."

"We deny the claims and allegations made in the [new] amended complaint. We remain committed to responsible AI (Editor's note: unbiased and explainable AI)," the vendor told our American colleagues.

The hot potato

For analysts, Workday's line of defense highlights the position an HR manager can find themselves in when using technology in general, and artificial intelligence in particular.

In the United States, labor law is very clear: "the responsibility ultimately lies with the employer," emphasizes Paul Lopez, a specialist attorney at Tripp Scott, a Florida firm.

A similar question has already arisen in the HR world: with payroll.

For Dean Rocco, co-chair at the Los Angeles law firm Wilson Elser (which specializes in employment and labor law), the payroll precedent could therefore offer a preview of how AI liability will be divided.

When it comes to payroll, vendors are off the hook, Mr. Rocco believes. Most include clauses in their contracts that transfer full responsibility for legal compliance to the employer (the user). This arrangement has been validated by US courts, which, according to him, have consistently refused to hold technology suppliers liable.

In contracts with payroll tool vendors, clients "recognize from the outset that all the vendor is doing is providing them with a technology platform," insists Mr. Rocco.

Three ways to minimize HR legal risk

This does not mean, however, that vendors are entirely off the hook, Paul Lopez qualifies. The employer/customer could well counter by arguing that the software is "defective" and that the vendor bears liability toward its customers.

Another tip from the Tripp Scott attorney: clients are advised to include an indemnification clause in their license contracts in case of problems.

The advice sounds like common sense, but the lawyer also knows that vendors are reluctant to change their standard contracts. And "small" clients simply won't be able to impose such clauses on large international vendors.

In that case, the lawyer simply recommends not adopting a platform until other companies have been seen using it without problems for several months, or even several years. "Prudence is the mother of safety," as the saying goes.

HR will have to learn to understand artificial intelligence

Will liability be assigned differently for AI than it was for payroll? The upcoming decision in Derek Mobley v. Workday will provide a first element of an answer.

But in any case, for employers, the risks of AI in recruiting, hiring (and other HR functions) "are significant enough to warrant ongoing oversight, audits and evaluation of HR solutions" as artificial intelligence is rolled out, warns Helen Poitevin, Distinguished VP Analyst at Gartner.

Jamie Kohn, another Gartner analyst, agrees, noting that hiring managers "don't always know how vendors are using AI in their products." However, growing legal requirements "mean they will need to develop a deeper understanding of the technologies they use and know what questions to ask."

"If you put [a technology] into use, you are responsible for the impact it will have on your hiring decisions," the analyst insists.

This understanding will become even more important as AI, which also has benefits for HR (notably in combating human bias), is increasingly integrated into employee management tools, predicts Forrester Research analyst Katy Tynan.

But the more artificial intelligence is built into HRIS, the greater the "likelihood that the technology will be used in a way that leads to litigation," she warns.
