Automation and the Great Resignation: Sociologist takes on hiring systems

In case you haven’t heard, there’s a worker shortage in the United States. As the country recovers from the pandemic, companies are trying to bring their employees back into the workplace but are finding that many of those employees have resigned – a so-called “Great Resignation.”

There are many factors behind this worker shortage, but Noelle Chesley thinks one may be going overlooked: the use of automated hiring systems to fill those open positions.

Chesley is an associate professor of sociology at UWM, and her research focuses on the intersection of technology, work, and family. She’s noticed researchers becoming increasingly concerned that automated hiring systems may actually be filtering out qualified candidates. These systems include a variety of software, but people might be most familiar with platforms like Indeed, ZipRecruiter, or Monster that use algorithms and keyword screening to automatically sort and match job seekers with employers. Other systems are custom-made for companies and may include applicant tracking, resume screening and ranking, custom analytics, assessment tools, and even personality tests.
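For readers curious how that screening can misfire, here’s a minimal sketch in Python of naive keyword matching – the job keywords, resumes, and threshold are invented for illustration, and real platforms are far more elaborate:

```python
# Minimal sketch of keyword-based resume screening (illustrative only;
# the keywords, resumes, and threshold are invented for this example).

REQUIRED_KEYWORDS = {"photoshop", "marketing", "crm"}  # hypothetical job requirements

def keyword_score(resume_text: str) -> float:
    """Return the fraction of required keywords found in the resume."""
    words = set(resume_text.lower().split())
    return len(REQUIRED_KEYWORDS & words) / len(REQUIRED_KEYWORDS)

def screen(resumes: dict[str, str], threshold: float = 0.67) -> list[str]:
    """Keep only candidates whose keyword score clears the threshold."""
    return [name for name, text in resumes.items()
            if keyword_score(text) >= threshold]

resumes = {
    "Candidate A": "Experienced in Photoshop and CRM tools for marketing teams",
    "Candidate B": "Ten years leading brand campaigns and customer outreach",
}
print(screen(resumes))  # ['Candidate A'] -- B is filtered out despite relevant experience
```

Candidate B may be just as qualified, but because their resume uses different words than the filter expects, the system never surfaces them.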

“If these algorithms work, terrific. They scale up really, really quickly and can reach millions of people. That can work really well,” Chesley said. “But the opposite is true, too. I’m a lot more worried about the opposite case, which is that the algorithms actually aren’t working that well and we’re scaling them up without attending to what the consequences of that might be.”

Chesley has identified four potential problems with automated hiring systems, along with some solutions that may address their harms.

1. Algorithms may not use the best criteria for matching

Before job seekers can apply, employers have to make sure their job ad actually reaches those looking for work. But research has shown that using an automated system to do that can have some unexpected pitfalls, Chesley noted.

“The algorithms in Google or LinkedIn, or several other platforms, have been tested on how that information gets sent out to different types of users. One of the things we know is that it doesn’t work the same way for all users,” Chesley said.

The machine learning behind these algorithms can draw conclusions that the programmers never intended. For instance, one audit of Google showed that, due to privacy settings, men were more likely than women to be shown certain job ads. An algorithm at Amazon searching for internal candidates to fill senior positions targeted only male candidates – because male candidates had been overwhelmingly hired for those positions in the past.

“You might think that qualified women or qualified men might be equally likely to view that job ad,” Chesley said. “The fact of the matter is they’re not. The algorithms work in such black-box ways that how people set their profile settings, for instance, can influence whether or not they’ll even see the ad.

“To the extent that, you know, women are half of the labor force, you’re shrinking your applicant pool. There’s the connection to the worker shortage,” Chesley added.
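The Amazon case illustrates a feedback loop that is easy to reproduce: a model trained on past decisions simply learns the past. A toy sketch with invented numbers makes the mechanism plain:

```python
# Toy illustration of how learning from biased historical data reproduces
# that bias. The records are invented, and real systems are far more
# complex, but the feedback loop works the same way.

historical_hires = (
    [("man", True)] * 90 + [("man", False)] * 60 +      # men: 90 of 150 hired
    [("woman", True)] * 10 + [("woman", False)] * 140   # women: 10 of 150 hired
)

def learned_score(gender: str) -> float:
    """A naive 'model' that scores candidates by the past hire rate for their group."""
    outcomes = [hired for g, hired in historical_hires if g == gender]
    return sum(outcomes) / len(outcomes)

print(learned_score("man"))    # 0.60 -- favored because men were hired before
print(learned_score("woman"))  # ~0.07 -- penalized by the same history
```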

2. Using automated hiring systems can create problems at scale

By some estimates, 99% of Fortune 500 companies use automated hiring systems in some capacity. Over the last two years, many more companies have turned to automation in their human resources departments, thanks to the shift towards online and remote work during the pandemic. While automated systems are necessary to wade through the vast pool of applications that companies receive, Chesley worries that their filters may be disqualifying perfectly good candidates – and they’re doing it across the board.

“Part of it is the same algorithm issue with the job (ads), which is that there can be things that happen in these algorithms that lead to sort of weird outcomes,” she noted. “The other issue is also that there are some human tendencies that are getting reapplied, and these algorithms are setting some really key filters.”

For instance, the algorithms could be told to screen out candidates who have more than a six-month gap in employment on their resume. That’s a bad idea when society is trying to emerge from a pandemic in which millions of people lost their jobs, Chesley said.
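Here’s a minimal sketch of what such a hard filter might look like; the work history and the six-month rule below are hypothetical, but the mechanical rejection with no human review is the point:

```python
from datetime import date

# Illustrative sketch of a hard employment-gap filter. The records and
# the six-month cutoff are hypothetical, mirroring the example above.

MAX_GAP_DAYS = 183  # roughly six months

def longest_gap_days(jobs: list[tuple[date, date]]) -> int:
    """Longest gap, in days, between consecutive jobs given as (start, end) pairs."""
    jobs = sorted(jobs)
    gaps = [(nxt[0] - prev[1]).days for prev, nxt in zip(jobs, jobs[1:])]
    return max(gaps, default=0)

# A candidate laid off early in the pandemic: an 8-month gap in 2020
work_history = [
    (date(2015, 1, 5), date(2020, 3, 31)),
    (date(2020, 12, 1), date(2023, 6, 30)),
]

if longest_gap_days(work_history) > MAX_GAP_DAYS:
    print("Auto-rejected: employment gap exceeds six months")  # no human ever sees this resume
```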

And because these automated systems are so ubiquitous, applying those criteria broadly has a tremendous effect on the labor market.

“If you’re rejecting automatically with no human discretion, you might not even realize that your system is set up to do that,” Chesley said. “That’s the idea of rejecting qualified applicants at scale.”

3. More recruiters want ‘purple squirrels’

In recruiter parlance, a ‘purple squirrel’ is a candidate who not only meets the qualifications but exceeds them – the kind a recruiter dreams of hiring. In other words, a candidate as rare as a purple squirrel.

Chesley said that recruiters are searching for them more and more.

“Because it’s possible to filter and get so much information about all of (a candidate’s) different attributes, we’ve had a raising of expectations in hiring. There’s some research to show that job ads are becoming more and more complex,” Chesley said. “There’s this uptick in expectations on the part of hiring managers.”

For example, employers may want to hire someone fully proficient in the Adobe Suite, but the job only requires using Photoshop. That can eliminate many qualified candidates and could contribute to the labor shortage.

4. Automated hiring systems could lead to alienation

Much of the “Great Resignation” has been fueled by employees seeking better treatment and wages, but Chesley wonders if there’s an additional reason: Oftentimes, applying for jobs can feel like throwing your resume into a black hole.

“I wonder a lot about the role of a very alienating job-seeking experience that is, at least in part, being fueled by automation,” Chesley said. “Public opinion research suggests a general distrust of automation in hiring, and some more targeted studies of job seekers suggest that potential workers find interacting with these systems alienating.”

She added that automated systems are designed with the employer as the client, with little regard for applicants.

“To me, one of the things that is really unjust … is that you’re actually contributing free labor and information that’s going to be used by systems (like LinkedIn or Indeed) whether or not you get a job, so they profit from your information,” she said.

Solutions

1. Companies should audit their automation

“If they know that they’re launching ads on certain platforms, (companies) should have some of their technical people look and do a bit of testing to see who’s seeing these ads,” Chesley said. The same holds true for the internal systems that companies use to screen resumes and rank candidates.

“Let’s just make sure that we’re not scaling up some things that are problematic,” she added, referring to the sexism, racism, and other discrimination that algorithms can inadvertently introduce into a candidate search. Because organizations have control over the platforms they use to distribute job ads and over their own automated systems, this is a natural place to think about making changes.
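What might that testing look like? One simple starting point is to compare delivery rates across groups, borrowing the spirit of the EEOC’s “four-fifths” rule for adverse impact in hiring. The impression counts below are invented, and a real audit would use controlled test profiles and proper statistical tests:

```python
# Rough sketch of a disparate-impact style check on job-ad delivery.
# The counts are invented; the four-fifths threshold is borrowed from
# the EEOC's adverse-impact heuristic for selection rates.

def delivery_rate(shown: int, eligible: int) -> float:
    return shown / eligible

def four_fifths_check(rate_a: float, rate_b: float) -> bool:
    """The lower rate should be at least 80% of the higher rate;
    otherwise the disparity warrants human review."""
    lo, hi = sorted([rate_a, rate_b])
    return lo / hi >= 0.8

men = delivery_rate(shown=900, eligible=1000)     # 0.90
women = delivery_rate(shown=450, eligible=1000)   # 0.45

if not four_fifths_check(men, women):
    print("Flag for review: ad delivery rates differ beyond the 4/5 threshold")
```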

2. The job ad matters

Research shows that how companies craft a job ad can impact the kind of candidate who applies. Using more masculine language in an ad can inadvertently discourage women from applying, Chesley said. Employers should also think about the qualifications they’re requiring: Is that qualification truly necessary for a person’s job performance, or is the hiring manager searching for a “purple squirrel”?
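A lightweight way to act on this is to scan an ad against lists of masculine- and feminine-coded words, in the spirit of published research on gendered job-ad language. The short word lists here are invented samples, not validated research lists:

```python
# Illustrative gendered-language check for job ads. The word lists are a
# small invented sample; a real tool would rely on validated lists from
# the research literature.

MASCULINE_CODED = {"dominant", "competitive", "rockstar", "ninja", "aggressive"}
FEMININE_CODED = {"collaborative", "supportive", "nurturing", "interpersonal"}

def audit_ad(ad_text: str) -> dict[str, list[str]]:
    words = set(ad_text.lower().replace(",", " ").split())
    return {
        "masculine": sorted(words & MASCULINE_CODED),
        "feminine": sorted(words & FEMININE_CODED),
    }

ad = "We want a competitive, aggressive rockstar to own the market"
print(audit_ad(ad))
# {'masculine': ['aggressive', 'competitive', 'rockstar'], 'feminine': []}
```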

3. Treat job seekers like customers

Developers of automated hiring systems should reconsider who their product is for.

“The idea is to harness the current fascination with improving customer experiences online (sometimes called ‘UX’ research) and translate that to thinking of job seekers as ‘customers,’” Chesley said.

Because hiring organizations are the actual clients, any push for change would have to come from them – a demand for a product that provides a better experience for job seekers.

“What would that experience entail? More transparency, better communication with the hiring organization, timely decision making, etc.,” Chesley added.

4. The government might have a role

“Government and other stakeholders, such as non-profits that work with job seekers, may need to develop interventions and policies that directly support job seekers in learning how to best navigate automated hiring practices, as well as policies that better regulate the use of automated hiring tools,” Chesley said.

