WATCH | How AI is filtering millions of qualified candidates out of the workforce

  • There is a well-documented dark side to AI technology used to screen job applications - sometimes being qualified is not enough to get your application seen by human eyes. 
  • But, according to one expert, there is actually very little that a person can do if they continue to be screened out of the hiring process.  


Scroll through social media for long enough, and it won’t take long to find influencers spouting tricks and tips on how their viewers can land their dream jobs. They just have to get past the AI screening of their applications first. 

Their advice is the byproduct of a real-life concern - that qualified candidates could be filtered out of the hiring process before their applications are seen by human eyes. 

The use of technology like an ATS, or applicant tracking system, is prevalent. According to the study ‘Hidden Workers: Untapped Talent’ by Harvard Business School, 99% of Fortune 500 companies use an ATS when looking for new hires.

And 63% of surveyed employers across Germany, the United States and the United Kingdom do the same. 

According to Manjari Raman - one of the researchers behind that study and Senior Program Director of the Managing the Future of Work Project at Harvard Business School - companies turn to these automated systems because they are sometimes flooded with applications.  

"But when that automated system has the responsibility of taking thousands of candidates and filtering it down to the top five choices … Well, then what happens is now the technology is hiding workers who could work in that position, high skills or middle skills, and that's a problem," she explains. 

The qualified workers left behind 

There is also a well-documented dark side to this technology - sometimes being qualified is not enough to land a job interview. 

In 2018, Amazon realised the hiring software it had been developing for four years was scoring qualified female candidates below their male counterparts. 

The reason was simple. The AI was trained on the company’s previous hiring track record, and since men dominate the tech industry, it learned to prefer male candidates over female ones. 
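The mechanism is easy to reproduce in miniature. The sketch below is illustrative only, not Amazon's system: the records and keywords are invented, and the toy screener simply weights each resume keyword by how often it appeared in past hires, so a keyword that merely correlates with who was hired historically ends up penalising otherwise identical candidates.

```python
# Toy "learned" screener that reproduces historical bias.
# All data and the scoring rule are invented for this sketch.

# Hypothetical past hiring records: (resume_keywords, was_hired)
history = [
    ({"python", "lacrosse", "mens_chess_club"}, True),
    ({"python", "sql"}, True),
    ({"python", "womens_chess_club"}, False),
    ({"sql", "womens_chess_club"}, False),
]

# "Training": weight each keyword up for every past hire it appears
# in, down for every past rejection.
weights = {}
for keywords, hired in history:
    for kw in keywords:
        weights[kw] = weights.get(kw, 0) + (1 if hired else -1)

def score(resume):
    """Sum the learned weights of a candidate's keywords."""
    return sum(weights.get(kw, 0) for kw in resume)

# Two candidates identical except for a gendered keyword are ranked
# differently, because the correlate of past hires got a high weight.
print(score({"python", "sql", "mens_chess_club"}))    # higher score
print(score({"python", "sql", "womens_chess_club"}))  # lower score
```

The bias never has to be programmed in; it falls out of training on outcomes that were themselves biased, which is exactly the failure mode the researchers describe.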

That same year, auditors of another screening tool found that the software ranked people with the name Jared and a history of playing lacrosse in high school more favourably than other applicants. 

According to Kerry McInerney, a Research Fellow at the Leverhulme Centre for the Future of Intelligence at the University of Cambridge, AI can even perpetuate discrimination when its developers design it to do the opposite.  

"One of the claims that companies make about AI-powered hiring tools is that, unlike a human recruiter, an AI-powered tool doesn't see gender and doesn't see race or other characteristics about us," McInerney told Euronews.  

McInerney added:

But I'm really sceptical of this idea that technologies are inherently more objective than human recruiters because ultimately they're trained on the same biased data produced by human recruiters.

She added that because of this, many companies are "putting their resources into purchasing tools that don't work rather than investing in tried and true diversity and inclusion strategies that we know do work."  

How smart is AI? 

According to the previously mentioned Harvard study, there are also millions of people across Europe and the US classified as 'hidden workers', or qualified people who are filtered out of the application process because of things like large gaps in their resumes.  

ATS can also reject applicants because of lengthy and wordy job postings.  

"ATS systems, like almost all forms of artificial intelligence, don’t think. They don’t reason. They're not smart in the way humans think of intelligence," Joseph Fuller, a Professor of Management Practice at Harvard Business School, told Euronews.  

"Quite a lot of the problems with artificial intelligence in hiring actually fall at the feet of the employer, not the technology.  

"Job descriptions are ingested the way they are written, and the technology takes the language in that job description and more or less treats it as scripture.” 

For example, we at Euronews recently tried to see how hireable one of our journalists actually was when applying for a job comparable to a position they were already doing. 

We ran their resume through Jobscan, a website that claims to help people get past ATS screeners, and asked it to rank that person as a possible candidate for an actual job posting. 

But they were ranked as a weak candidate because the job asked for international experience, and the ATS screener decided the journalist didn’t meet this requirement, despite them having previously worked in five different countries. 

"In this instance, I think the AI was confused. It doesn't view living in a country as the same as travelling," Fuller said. "So if the candidate had said, 'I have travelled extensively while being based in five different countries', my guess is it would have come to a different conclusion."  
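Fuller's explanation maps onto how simple keyword screeners behave. The sketch below is an assumption about the mechanism, not the actual tool Euronews used: it checks for required phrases verbatim, so a CV that demonstrates international experience without using that exact phrase fails the check.

```python
# Hypothetical exact-phrase screener; the matching rule and the CV
# text are invented for illustration.

def keyword_match(cv_text, required_phrases):
    """Report which required phrases appear word-for-word in the CV."""
    cv = cv_text.lower()
    return {phrase: phrase.lower() in cv for phrase in required_phrases}

cv = ("Journalist with a decade of journalism experience, "
      "based in five different countries.")
required = ["international experience", "journalism"]

# 'journalism' matches literally; 'international experience' does not,
# even though working in five countries clearly implies it.
print(keyword_match(cv, required))
```

An exact-match rule has no notion of implication, which is why rewording the CV to echo the job posting's own phrases, as Fuller suggests, changes the outcome.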

Does ‘white fonting’ actually work? 

Some people are trying to bypass this hurdle by ‘white fonting’, or copying and pasting a job post into their resume in small font and hiding it from the human eye by changing the colour to white. 

The idea behind this is that while the recruiter won’t be able to see it, the AI screening software will, and the CV will then be put forward as a possible contender for the role. 

But despite videos touting 'white fonting' as a winning tactic, it is more "myth" than fact. 

"This is now a bit of an urban legend," Fuller said. "More recruiters in large companies are scanning a whole application and then changing all the text [to a different colour] so 'white fonting' will be exposed.  

"Also, if your actual job history doesn't fit very well with the requirements of the job, and you're bluffing your way into the interview process, you're likely, in fact, not to succeed."  
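Both halves of this can be sketched in a few lines. The HTML snippet and the tag-stripping rule below are invented for illustration: a naive text extractor indexes the hidden white text (which is why the trick ever worked), while the styling that hides it survives in the markup, making it trivially detectable, as Fuller describes.

```python
import re

# Invented CV fragment with a 'white fonted' keyword block.
html_cv = (
    "<p>Jane Doe, Journalist</p>"
    '<p style="color:white;font-size:1px">python sql kubernetes</p>'
)

# A naive extractor strips tags and keeps ALL text, including the
# hidden keywords, so a keyword screener would "see" them.
plain_text = re.sub(r"<[^>]+>", " ", html_cv)
print("kubernetes" in plain_text)

# But the styling that hides the text is plainly visible in the
# markup, which is how a recruiter or a second automated pass can
# expose the trick.
print("color:white" in html_cv)
```

In practice, as Fuller notes, recruiters simply recolour or select all the text, and the smuggled keywords become visible immediately.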

Instead of ‘white fonting’, Fuller suggests that hopeful applicants look at the LinkedIn profiles of people already doing their desired job at the relevant company and mirror how those people describe their skills and position. 

According to Gracy Sarkissian, the Executive Director at the Wasserman Center for Career Development at NYU, "candidates may also take advantage of new tools like ChatGPT to support their job search."

Sarkissian said:

ChatGPT can enable candidates to identify potential job titles and opportunities, analyse job postings to help them determine what skills to highlight, predict interview questions, translate application materials to different languages, and provide nuanced salary information.

And for the roughly 27 million 'hidden workers' in the US and a further five million in the UK and Germany, Fuller recommends trying to get past the AI bots by closing the gaps in their resumes. 

For example, he suggested they could return to the workforce by finding gig work, part-time employment or by studying for a course while they look for a new job. 

Trying to regulate AI 

But, according to Raman, beyond that, there is actually very little that a person in the position of a 'hidden worker' can do if they continue to be screened out of the hiring process.  

"Employees have very little ability or power or agency; they are the ones who are suffering because of this problem," she said.  

"But in the case of employers, it's a self-inflicted wound and employers are the only ones who can make any changes."  

Some regions and countries are trying to address this power imbalance by moving to regulate this ever-evolving technology.  

European Union officials are working on groundbreaking rules to regulate AI that could become a de facto global standard because of the size of the 27-nation bloc and its market. 

In the United States, New York City is working on a law that would require companies to inform candidates that their applications are being screened by AI. And Illinois enacted a law requiring companies to notify people that their applications will be screened by AI and obtain their consent. 

China is also drafting regulations requiring security assessments for any products using AI, while the UK's competition watchdog has opened a review of the market.


