LinkedIn’s new algorithm is taking heavy fire from users, who report sudden and massive decreases in engagement and impressions that appear to fall disproportionately on female users. This scrutiny follows accusations of sexism and bias against Black, LGBTQ+ and other creators affecting content visibility on the business-oriented social media service. In perhaps the most jarring example, Michelle, a pseudonym for one user, changed her profile gender field from female to male; within a single day she saw a 238% increase in impressions.
That outrage is a symptom of a much more serious problem. These algorithms, developed to maximize positive user experience, can unintentionally amplify existing biases. Tim Jurka, LinkedIn’s vice president of engineering, acknowledged that the platform employs Large Language Models (LLMs) to surface relevant content. These models are not without fault, however: because they are trained on content created by humans, they can perpetuate sexism and racism.
Recent research on LLMs, including those used to train chatbots, has demonstrated the presence of such biases, calling into question the equity of LinkedIn’s algorithm. The platform recently denied that demographic information plays any role in the amplification of content. Yet users who shared their stories reported that changing the gender on their profiles led to more impressions.
Gender Bias Experimentation
Cindy Gallop and Jane Evans, together with other women, created an experimental effort called #WearthePants to test LinkedIn’s algorithm for gender bias. Gallop described posting the exact same content that a male peer had just posted: her post was seen by only 801 people, while the male user’s post received 10,408 impressions, roughly thirteen times as many. This dramatic contrast adds fuel to worries that gender affects the level of engagement on the platform.
This has been the experience of many users, who feel that there is a bias against content made by women. Shailvi Wakhulu stated, “It’s demotivating for content creators with a large loyal following.” These sentiments have inspired public outcry and demands for greater transparency and accountability from LinkedIn over its algorithmic practices.
Michelle’s experience wasn’t an isolated incident. She amended her profile and posting strategy, writing in a plainer, easier-to-digest style, and became “Michael.” As a result, she saw a phenomenal boost in engagement. This trend raises important questions about how one’s gender identity affects visibility on the platform.
“I’d really love to see LinkedIn take accountability for any bias that may exist within its algorithm,” – Marilynn Joyner
Algorithm Mechanics and User Behavior
LinkedIn’s algorithm is designed to consider various signals when determining which content is prioritized in users’ feeds. According to a spokesperson for the company, “Member behavior shapes the feed. What people click, save, and engage with changes daily.” This creates a scenario where users’ engagement with content directly shapes the content they are shown.
Despite claims that demographic details do not factor into visibility decisions, some researchers argue that demographics can still affect both sides of the algorithm—what users see and who sees their posts. Sarah Dean, an assistant professor of computer science, noted that the algorithm may amplify existing signals based on user behavior.
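The amplification dynamic Dean describes can be made concrete with a toy model. The sketch below is purely hypothetical and is not LinkedIn’s actual system: it assumes a ranker that splits a fixed impression budget between posts in super-linear proportion to their engagement (the `gamma` exponent is the invented "rich-get-richer" lever). Under that assumption, two posts with identical content and identical engagement rates, where one merely starts with a 20% head start in distribution, end up several-fold apart after only a few ranking rounds.

```python
# Hypothetical sketch -- NOT LinkedIn's code. It illustrates how an
# engagement-driven feedback loop can amplify a small initial disparity.

def allocate(budget, clicks, gamma=2.0):
    """Split a fixed impression budget in proportion to clicks ** gamma.

    gamma > 1 is an assumed super-linear boost for already-popular posts;
    it is the lever that turns a small gap into a large one.
    """
    weights = [c ** gamma for c in clicks]
    total = sum(weights)
    return [budget * w / total for w in weights]

def simulate(initial_impressions, engagement_rate=0.1, rounds=3):
    """Run several ranking rounds; each round reallocates impressions
    based on the clicks earned in the previous round."""
    impressions = list(initial_impressions)
    budget = sum(impressions)  # total impressions available each round
    for _ in range(rounds):
        clicks = [i * engagement_rate for i in impressions]
        impressions = allocate(budget, clicks)
    return impressions

# Identical content and engagement rate, but post B starts with a 20%
# head start (e.g., from a biased cold-start signal):
final_a, final_b = simulate([100, 120])
print(f"final ratio B/A: {final_b / final_a:.2f}")  # prints 4.30
```

Because the allocation weight is quadratic in clicks, the impressions ratio squares every round (1.2 → 1.44 → 2.07 → 4.30), which is the point researchers make: the ranker never looks at demographics directly, yet a biased initial signal compounds on its own.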
Brandeis Marshall pointed out that social media algorithms often “innately have embedded a white, male, Western-centric viewpoint.” This bias produces inequities in engagement between users of different demographics, allowing creators’ reach to be restricted by identity rather than expertise.
“What we don’t know is all the other levers that make this algorithm prioritize one person’s content over another. This is a more complicated problem than people assume,” – Brandeis Marshall
The Call for Accountability
With increasing awareness of algorithmic biases, advocacy groups have begun demanding accountability from LinkedIn, and users are calling on the platform to acknowledge and rectify any discrimination built into its algorithm. Chad Johnson remarked on the importance of clarity in writing, stating that the algorithm “cares whether your writing shows understanding, clarity, and value.” This highlights why equitable visibility on LinkedIn matters: building a platform where a range of new voices can thrive requires confronting these biases.
LinkedIn’s user base continues to expand: posting is up 15% year over year, while comments have surged by 24%. As activity grows, so does the pressure on the platform to reassess how its algorithms operate. The company maintains that it runs ongoing tests “to understand what helps people find the most relevant, timely content for their careers.”
As more users come forward with experiences reflecting systemic bias, it becomes increasingly vital for LinkedIn to take proactive measures. Transparency around how the algorithm is designed and implemented would be key to regaining the trust of creators, particularly those from marginalized groups.