The Emerging Challenges of AI in Australian Recruitment

By Megan Ortiz

In 2024, recruitment in Australia is changing fast: 43 per cent of organisations now use AI to improve their hiring processes, and 19 per cent of these use it extensively, raising concerns about biases inherent in the algorithms that drive these technologies. A recent study from Big Commonplace, spearheaded by Natalie Sheard, points to key shortcomings in training data sets large and small, and examines the opaque algorithms underlying these AI models.

Sisterworks, a social enterprise based in Melbourne, has been helping migrant and refugee women enter the workforce for more than ten years. Today, the organisation is still grappling with the consequences of AI-driven recruitment. Sisterworks CEO Ifrin Fittock said the organisation ran into unforeseen trouble when its participants first encountered AI-based recruitment methods in late 2024. Their experience is a microcosm of the larger problem of how AI can be biased against marginalised groups in the workforce.

As Sheard found in her research, AI models can be biased in ways that drastically affect hiring outcomes. Some systems may discriminate against women, while others are inaccessible to job seekers with disabilities. Sheard highlighted how AI recruitment systems commonly filter out candidates with gaps in their employment history, a practice that disproportionately affects women, who are more likely to take leave for childcare.

“In my research I heard of systems not being accessible to job seekers with disabilities, heard of CV screening systems using things like gaps in employment history to screen candidates out,” – Natalie Sheard

Sisterworks’ experience underscores these concerns. Fittock recalled one striking outcome: nine participants from their program attended interviews, yet none were hired. It later emerged that these behavioural interviews had been conducted by AI-powered, pre-recorded avatars rather than real people.

“We actually got caught on surprise, when towards the end of last year, we sent about nine sisters to an interview and they didn’t make it, they didn’t pass the interview,” – Ifrin Fittock

Fittock underscored that many of these participants faced additional barriers of language and digital literacy. The difficulties stemmed not only from unfamiliarity with the technology but also from each candidate’s individual circumstances.

“The challenges with these AI recruitment or AI interview for some of our sisters is really, first of all, English is not their first language, but also the level of digital literacy that they may or may not have,” – Ifrin Fittock

Fatemeh Zahra Hazrati, a participant from Iran in Sisterworks’ job course, emphasised the significant role AI plays in contemporary life. She recognised the need for people to adapt to new technologies.

“AI has a real big effect on our lives nowadays,” – Fatemeh Zahra Hazrati

Despite the challenges of AI in recruitment, Hazrati is optimistic about navigating these obstacles. She says developing the skills to operate within these new systems will be key to long-term success.

“All of us have to learn it, and we just need to adapt ourselves for new things and accept challenges to learn new things,” – Fatemeh Zahra Hazrati

The Australian Government’s Responsible AI report adds further detail to the picture of AI’s deepening penetration into different sectors, raising important issues of fairness and accessibility. Andreas Leibbrandt, an expert on algorithmic bias, noted that training data sets often reflect the historical biases of the organisations they come from.

“These AI algorithms are fed with training data sets or data sets from corporations. But their training data in itself may be biased. It may come from an organisation where there was, for instance, a bias against women,” – Andreas Leibbrandt

A meta-study, however, complicates the picture: women and non-Anglo candidates tend to be more motivated to apply for positions once they learn that AI tools are being used. Leibbrandt explained this seemingly counterintuitive finding, observing that candidates tend to perceive bias in AI algorithms as less severe than the bias they face from human recruiters.

“It’s not that both women or ethnic minorities feel there’s no bias in the AI algorithm, but they feel that this bias is less so than when they’re faced with a human recruiter,” – Andreas Leibbrandt
