
AI’s role in hiring practices in the spotlight amid bias concerns


Artificial intelligence (AI) may well have helped some applicants land their current jobs, or screened them out of positions they never got. Human resources departments have been using automated tools to assist in recruitment for some time. In February 2023, however, one of the first allegations of discrimination based on an employer’s use of AI in the hiring process emerged.

Derek Mobley has become the face of a class action lawsuit against Workday, Inc. Mobley claims that since 2018 he has applied for positions at nearly 100 companies through Workday’s application platform and has yet to secure a job.

“All things being equal, you know, he’s qualified,” Mobley’s attorney Rod Cooks said.

According to Cooks, Mobley alleges that he has been discriminated against because he is African American, over 40 years old and has a disability.

“We believe this lawsuit is without merit,” a Workday spokesperson said in an email. “At Workday, we’re committed to responsible AI. Our decisions are guided by our AI ethics principles, which include amplifying human potential, positively impacting society, and championing transparency and fairness.”

This is not the first time AI-based recruitment tools have faced bias concerns.

In 2015, Amazon discontinued its own recruiting engine after discovering that the system had taught itself to prefer male applicants over female ones.

John Rood is the founder of Proceptual, a company that works with HR departments using automated tools to ensure they comply with emerging legislation. Rood emphasized the substantial harm done when members of protected classes, such as racial, gender, or sexual orientation minorities, are systematically denied jobs.

Rood explains that when using an automated employment decision tool, the technology picks which applicants should advance to the next phase based on predetermined criteria. Hiring managers often rely on these system recommendations, especially in large-scale hiring processes.

“The computer has not like decided who to hire, but at a part of that funnel, it’s made a very specific decision about who should be advancing to the next phase,” Rood said. “And most hiring managers are not going to, like look at that and be like, ‘oh, well, I need to carefully go through every single resume,’ especially when it’s, you know, an enterprise company, and they’re hiring for hundreds of roles.”
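The mechanics of that funnel are simple enough to sketch. Below is a minimal, hypothetical Python example of the kind of step Rood describes: a tracking system assigns each of 100 applicants a score and advances only the top 20. The scoring scheme and field names are illustrative assumptions, not Workday’s or any other vendor’s actual logic.

    from dataclasses import dataclass

    @dataclass
    class Applicant:
        name: str
        score: float  # score assigned by the tool's (assumed) criteria

    def advance_top(applicants, n=20):
        """Return the n highest-scoring applicants; everyone else is screened out."""
        ranked = sorted(applicants, key=lambda a: a.score, reverse=True)
        return ranked[:n]

    # 100 applicants in, 20 out: the tool has not decided who to hire,
    # but it has decided which resumes a hiring manager will ever see.
    pool = [Applicant(f"applicant_{i}", score=(i * 37) % 100) for i in range(100)]
    shortlist = advance_top(pool)
    print(len(shortlist))  # 20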

To address bias concerns, Rood’s company conducts “bias audits” for organizations using automated tools. These audits evaluate whether an algorithm selected applicants at a lower rate based on race, gender, national origin, or other factors.
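At its core, that comparison is a selection-rate calculation, sketched below in a few hypothetical lines of Python. The sample numbers, group labels, and the 0.8 threshold (the federal “four-fifths” rule of thumb for adverse impact) are illustrative assumptions, not the specifics of Proceptual’s audits or of Local Law 144.

    from collections import defaultdict

    def selection_rates(records):
        """records: (group, advanced) pairs -> share of each group that advanced."""
        counts = defaultdict(lambda: [0, 0])  # group -> [advanced, total]
        for group, advanced in records:
            counts[group][0] += int(advanced)
            counts[group][1] += 1
        return {g: adv / total for g, (adv, total) in counts.items()}

    def impact_ratios(rates):
        """Each group's selection rate divided by the highest group's rate."""
        best = max(rates.values())
        return {g: rate / best for g, rate in rates.items()}

    # Hypothetical outcomes: group_a advances 30% of the time, group_b 15%.
    records = ([("group_a", True)] * 30 + [("group_a", False)] * 70 +
               [("group_b", True)] * 15 + [("group_b", False)] * 85)
    ratios = impact_ratios(selection_rates(records))
    print(ratios)  # group_b's ratio is 0.5, well under the 0.8 rule of thumb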

New York City passed Local Law 144 in 2021, which requires employers to conduct bias audits and disclose the results publicly. Enforcement of the law will commence on July 5, 2023.

However, Rood says the law is far from perfect. While New York City was the first to spell out what an audit should entail, he notes that essentially no one is satisfied with the criteria it settled on.

Critics include Matt Scherer, senior policy counsel for Workers’ Rights and Technology at the Center for Democracy and Technology. Scherer said that the audit requirement is redundant, pointing out that the law references federal regulations that already mandate companies to collect information on race, sex and national origin.

“The law is not doing anything for this bias audit that companies should not already be doing anyway,” Scherer said.

New York City’s law also requires employers to disclose when and how they are using AI in the hiring process. However, legislators narrowed the law’s scope before it passed: companies must follow these rules only if their automated employment decision tools completely replace human decision-making.

Because many companies use such technology to assist human decision-making rather than replace it entirely, they may not be affected by Local Law 144.

“I do not think that New York Law 144 is a very good law, I hesitate to call it regulation, or certainly new regulation,” Scherer said. “And I am not at all keen on the idea of that being kind of a model for the rest of the country for how these tools are regulated.”

As New York’s regulation takes effect, it shines a spotlight on a sector that has been utilizing AI products for years, potentially impacting the demographic composition of the workforce.


LONGWORTH: CHANCES ARE, YOU HAVE A-I TO THANK FOR YOUR CURRENT JOB. OR MAYBE IT’S TO BLAME FOR THE JOB YOU DIDN’T GET.

HR DEPARTMENTS HAVE BEEN USING AUTOMATED TOOLS TO AID IN RECRUITMENT FOR A WHILE NOW.  BUT AS OF FEBRUARY, WE’RE SEEING ONE OF THE FIRST ALLEGATIONS OF DISCRIMINATION BASED ON AN EMPLOYER’S USE OF A-I IN THE HIRING PROCESS.

DEREK MOBLEY IS THE FACE OF A CLASS ACTION LAWSUIT AGAINST WORKDAY.  MR. MOBLEY ALLEGES THAT, SINCE 2018, HE’S APPLIED FOR POSITIONS AT NEARLY 100 COMPANIES USING WORKDAY’S APPLICATION, AND HAS YET TO FIND A JOB.

ROD COOKS: “ALL THINGS BEING EQUAL, YOU KNOW, HE’S QUALIFIED, HE’S, YOU KNOW, EDUCATIONALLY QUALIFIED, QUALIFIED EXPERIENCE WISE, AND THE ONLY CONCLUSION HE CAME TO AND WHAT HE SHARED WITH US AND WHAT WE CAME TO INVESTIGATE HIS CLAIM IS THAT HE’S BEING DISCRIMINATED AGAINST, BECAUSE HE’S AFRICAN AMERICAN, HE’S OVER 40, OR HE SUFFERS FROM A DISABILITY.”

LONGWORTH: A WORKDAY SPOKESPERSON HAS SAID IN AN EMAIL THAT THE LAWSUIT IS “WITHOUT MERIT.”

THERE IS A HISTORY OF BIAS ISSUES WITH THIS KIND OF TECH.

IN 2015, AMAZON SHUT DOWN ITS OWN RECRUITING ENGINE, WHICH HAD TAUGHT ITSELF TO PREFER MALE APPLICANTS OVER WOMEN.

JOHN ROOD: “IF SYSTEMATICALLY PROTECTED MINORITY CLASSES, ESPECIALLY, YOU KNOW, RACIAL MINORITIES, GENDER MINORITIES, SEXUAL ORIENTATION MINORITIES DON’T GET JOBS SYSTEMATICALLY, LIKE THAT’S A VERY SUBSTANTIAL HARM.”

JOHN ROOD FOUNDED PROCEPTUAL–A COMPANY THAT WORKS WITH HR DEPARTMENTS USING AUTOMATED TOOLS TO ENSURE THEY’RE COMPLYING WITH EMERGING LEGISLATION.

ROOD: “LET’S SAY YOU PUT UP A JOB POST, YOU GET 100 APPLICANTS FOR IT. AND YOUR APPLICANT TRACKING SYSTEM SAYS, YOU KNOW, HERE’S THE 20 TOP SCORES, RIGHT, ASSIGN SOME KIND OF SCORE BASED ON WHATEVER CRITERIA // THE COMPUTER HAS NOT LIKE DECIDED WHO TO HIRE, BUT AT A PART OF THAT FUNNEL, IT’S MADE A VERY SPECIFIC DECISION ABOUT WHO SHOULD BE ADVANCING TO THE NEXT PHASE. AND MOST HIRING MANAGERS ARE NOT GOING TO, LIKE LOOK AT THAT AND BE LIKE, ‘OH, WELL, I NEED TO CAREFULLY GO THROUGH EVERY SINGLE RESUME, ESPECIALLY WHEN IT’S, YOU KNOW, AN ENTERPRISE COMPANY, AND THEY’RE HIRING FOR HUNDREDS OF ROLES.’”

LONGWORTH: ROOD’S COMPANY PROCEPTUAL RUNS WHAT ARE CALLED “BIAS AUDITS” FOR ORGANIZATIONS USING AUTOMATED TOOLS. A BIAS AUDIT EVALUATES WHETHER AN ALGORITHM SELECTED APPLICANTS AT A LOWER RATE BASED ON RACE, GENDER, NATIONAL ORIGIN, OR OTHER FACTORS.

IN 2021, NEW YORK CITY PASSED A LAW THAT REQUIRES EMPLOYERS TO RUN BIAS AUDITS AND PUBLICLY DISCLOSE THE RESULTS. LOCAL LAW 144 WILL BE ENFORCED STARTING JULY 5TH.

BUT ROOD SAYS THAT LAW IS FAR FROM PERFECT.

ROOD: “WHAT NEW YORK CITY HAS DONE IS THAT IT’S BEEN THE FIRST BODY TO SAY, HERE’S WHAT AN AUDIT LOOKS LIKE. AND I THINK THAT BASICALLY, NO ONE IS SATISFIED WITH WHAT THEY’VE SAID AN AUDIT IS.”

LONGWORTH: CRITICS, INCLUDING MATT SCHERER, THINK THE AUDIT REQUIREMENT IS REDUNDANT.

MATT SCHERER: “THE BASIS FOR THOSE THREE CATEGORIES, RACE, SEX, NATIONAL ORIGIN, IT’S NOT ACTUALLY LAID OUT IN LOCAL LAW 144 ITSELF. WHAT LOCAL LAW 144 DOES, IS IT REFERENCES FEDERAL REGULATIONS THAT ALREADY REQUIRE COMPANIES TO MAINTAIN INFORMATION ON THOSE CATEGORIES OF DEMOGRAPHICS. // SO REALLY, THE LAW IS NOT DOING ANYTHING FOR THIS BIAS AUDIT THAT COMPANIES SHOULD NOT ALREADY BE DOING ANYWAY.”

LONGWORTH: NEW YORK CITY’S LAW ALSO REQUIRES EMPLOYERS TO DISCLOSE WHEN AND HOW THEY’RE USING AI IN THE HIRING PROCESS.

BEFORE THE LAW PASSED, LEGISLATORS NARROWED ITS SCOPE–GIVING MANY COMPANIES THE OPPORTUNITY TO SAY IT DOESN’T APPLY TO THEM.

AS OTHER STATES CONSIDER WAYS TO REGULATE AI IN RECRUITMENT, SCHERER SAYS HE HOPES THEY PRODUCE LEGISLATION THAT’S STRONGER THAN NEW YORK’S FINAL VERSION OF THE LAW.

SCHERER: “I DO NOT THINK THAT NEW YORK LAW 144 IS A VERY GOOD LAW, I HESITATE TO CALL IT REGULATION, OR CERTAINLY NEW REGULATION. AND I AM NOT AT ALL KEEN ON THE IDEA OF THAT BEING KIND OF A MODEL FOR THE REST OF THE COUNTRY FOR HOW THESE TOOLS ARE REGULATED.”

ROOD: “NEW YORK IS THE FIRST OF ITS KIND FOR NOW. BUT IN THE NEXT 18 MONTHS, WE’RE GOING TO SEE A NUMBER OF NEW LAWS ACROSS THE COUNTRY.”

COOKS: “IT’S THE WILD WEST IN THIS AREA NOW. IF WE DON’T GET A HANDLE ON THIS, AND, AND GET A HANDLE ON IT QUICKLY, I THINK IT CAN HAVE A FAR RANGE OF EFFECTS ON EVERY SECTOR OF WORK.”

LONGWORTH: AS NEW YORK’S REGULATION IS ENFORCED, IT CALLS ATTENTION TO A SECTOR THAT’S BEEN USING A-I PRODUCTS FOR YEARS, POTENTIALLY AFFECTING THE DEMOGRAPHIC MAKEUP OF OUR WORKFORCE.