Why women on LinkedIn are masquerading as men
Megan Cornish had spent months puzzling over her waning reach on LinkedIn when she decided to run a test: She recast her profile to seem more like a guy.
Within a week, her impressions on the careers website quadrupled.
Cornish’s initial post about the experiment went viral, eliciting hundreds of comments from people voicing frustration about their own reach. Some reacted with incredulity that changing pronouns and rewording posts could make such a difference. Since then, LinkedIn has been inundated with posts from women who said their visibility on the platform grew after they took measures to obscure their gender or appear masculine. The episode is sparking questions about how bias — in humans or machines — dictates who must work harder to be heard in professional spaces, online and otherwise.
In response, LinkedIn said in a news release that its AI systems and algorithms build off “hundreds of ... signals” but don’t use demographic information — such as age, race or gender — to determine the “visibility of content, profile, or posts in the Feed.”
“Changing gender on your profile does not affect how your content appears in search or feed,” Sakshi Jain, LinkedIn’s head of responsible AI and AI governance, said in a statement to The Washington Post.
The wave of experiments comes as the world’s largest professional network records big traffic gains, making it an ever-larger hub for job hunters and business owners. Posting is up 15 percent year-over-year, while comments have surged 24 percent, the company said. That growth, on a platform connecting roughly 1 billion users with millions of job opportunities, is fueling intense competition for users’ attention.
Cornish’s experience raises questions “about the way that language or characteristics that have been traditionally associated with women are more devalued and embedded into our structural systems,” said Allison Elias, assistant professor of business administration at the University of Virginia.
Gender bias still affects how occupations and contributions are perceived, she said. Higher-earning professions in such sectors as finance, technology and engineering remain male-dominated, while women continue to be overrepresented in many low-wage roles, including education, caregiving and retail. That divide is a major driver of the widening wage gap in the United States, where as of 2024 women earned 80.9 cents for every dollar earned by men, Census Bureau data shows.
Technology “reflects societal values, and a lot of times those values are not even apparent to us as we hold them,” Elias said. “LinkedIn thinks its algorithm is very neutral … but if the people using LinkedIn implicitly have gender bias, then that might be shaping the algorithm.”
The gender downgrade
The online gender-swap move is just “one very specific example” of gender stereotypes influencing expectations of people’s professional abilities, said Carol Kulik, a professor in the Center for Workplace Excellence at the University of South Australia.
“LinkedIn is a professional platform, and business language is very male,” Kulik said. While she doesn’t doubt LinkedIn’s assertion that its algorithm isn’t designed to suppress certain identity groups, “is it going to be sensitive to gendered language? Of course it is!”
A 2025 systematic review of academic literature on female leadership across multiple disciplines, including management, psychology, women’s studies and economics, concluded that persistent stereotypes continue to hinder women’s advancement in the upper ranks.
Whether “the product is a poem, a computer program, a résumé, a scientific article, online teaching … as soon as people know it was a male author or male person who created it, it’s just seen as better quality,” said Kulik, who is also a scholar in the Academy of Management.
Others say the issue goes beyond gender and instead reflects how layers of human and machine biases collide.
Cass Cooper, a freelance writer and inclusion strategist, grew frustrated after reading posts from several White women who replicated Cornish’s experiment without considering race or other factors. She suspected many users’ experiments would not yield the same results.
So Cooper, who is Black, adjusted her profile to appear as though it belonged to a White man. Doing so for as little as a day drastically reduced the reach of her posts, she found, while presenting as a Black man produced even worse results. Some users, both women and men, took up the experiment with no meaningful change in their visibility, a review of hundreds of posts and comments on the platform shows.
The wide range highlights the complexity of the factors that determine one’s reach, Cooper said, and the fact that “technology is not a passive experience.”
“If we’re going to talk about bias, visibility, and influence online, we cannot pretend we all start from the same default settings,” she wrote on LinkedIn about her results.
Cornish, a mental health professional who works with the tech industry, said she had hoped to engage a largely male audience while retaining her voice.
“I don’t want to have to use words like ‘scale’ and ‘drive,’” Cornish said. “Barf.”
She went back to posting as herself to her more than 50,000 followers after a week.
Cornish said she was inspired by Cindy Gallop, who has been raising concerns about potential algorithmic bias on LinkedIn after her impressions “fell off a cliff” this year.
The marketing executive turned entrepreneur and advocate says she has used LinkedIn for 20 years because she values it as a place where “you can absolutely accomplish your business goals.” She uses the platform to share funding opportunities for small-business owners, especially women, with her 140,000-plus followers.
But Gallop contends that got harder this year after LinkedIn adjusted the algorithm used to classify posts and distribute them to users’ feeds. In August, LinkedIn’s vice president of engineering, Tim Jurka, wrote in a blog post that the platform had “more recently” implemented large language models into these systems. In recent months, the site has been peppered with posts from people strategizing about how to beat the new algorithm and expressing frustration about feeling silenced.
Gallop began wondering whether the topics she tended to speak about as a women’s business advocate weren’t connecting as well on the platform as they once had.
“I know women are desperate for these opportunities,” she told The Post. “This is about building businesses, careers and livelihoods.”
Gallop repeatedly asked the company whether there was an algorithmic reason for the sudden drop-off. Dissatisfied with the response, she decided to test her theory over the summer by recruiting some friends, including men with far fewer followers, to make the same post and see how each performed. Her post drew about 800 impressions, or less than 1 percent of her follower count; in comparison, both men netted several thousand impressions, including one whose post reached more than 100 percent of his followers.
Her experiment stuck in the minds of women such as Rachel Maron, co-founder of the AI company Trustable, as she considered her own experience on the platform this fall. After weeks of following the gender-swap trend with interest, Maron dove in herself, and was surprised to find that reposting the same content about AI governance that had previously gotten fewer than 150 impressions now netted “30,717 impressions.”
“Last week, I removed my pronouns. This week, I changed my LinkedIn gender marker to male. And suddenly, the platform can see me,” Maron posted on LinkedIn about her results.
Maron says she understands that today’s systems, which are often powered by LLMs, are a “black box” that gives companies little insight into how and why they produce certain results. She plans to continue experimenting with changing different aspects of her profile to see how results vary.
“I’m going to keep messing with it,” Maron said.