Kat Kibben

By far the most fascinating trend to witness over the last year has been watching AI become a part of everyday life in general, and of recruiting in particular. The last time we saw this much tech enter the industry at once was in the '90s, when recruiting went online. Tools that once felt futuristic are now products everyone has, helping hiring teams conduct interviews, optimize job postings, and even predict a candidate's fit. These tools are making recruiters faster, processes more scalable, and hiring more efficient – but the reality is that they're not perfect.

AI is only as good as the data it's trained on and the instructions it's given. Without guidance from humans who are trained to recognize bias, AI can unintentionally amplify biases that were present in past hiring practices. Instead of opening doors to more candidates, these tools might accidentally reinforce outdated stereotypes or, without a human in the loop, leave entire groups out. That's not just bad for diversity initiatives. It's bad for business.

This is where intentionality is mandatory if you really want to remove bias. You can't just plug AI into your process and expect magic – these tools are helping decide who gets hired. It takes thoughtful input, and that starts with understanding bias: not just the obvious kinds like gender or age, but the subtle bias that creeps into job postings without you even realizing it.

Why AI Alone Can’t Remove Bias From Job Postings

In the '90s, when the internet was first emerging as a widely accessible technology, there were many misconceptions about its use. From people who thought the internet was just a passing fad to those who believed it would only ever be used by tech experts, we have come a long way – in many ways, much farther than most people ever imagined. I mean, I certainly never thought I'd be able to get anything delivered – and I mean *anything.*

The adoption of AI is no different. As a relatively new piece of technology, no one is sure what’s ahead, and there are plenty of misconceptions floating around about artificial intelligence. 

One of the most common misconceptions is that AI can remove bias all on its own. In fact, over 59% of recruiters today believe AI will remove unconscious bias from hiring, but that oversimplifies things. It takes more than pasting a job post into ChatGPT and prompting it with a simple question like: "What bias does this have?"

The reality of bias in data and algorithms is more complicated. Bias will always be present when using AI because the technology reflects and amplifies any pre-existing biases in the information used to train it. Plain and simple: if the technology was trained on data, it is always going to be biased in some way.

One of the most well-known examples of this liability in action happened at Amazon back in 2015. For years, the recruiting team trained a hiring tool using the resumes of successful engineering hires as their data set. They thought by including the data from successful applicants, they would more easily be able to predict which candidates would be successful in the future. When they allowed the tool to work autonomously, they quickly saw the bias in their hiring practices front and center: their tool was favoring men. 

For example, their technology penalized resumes with terms like “women’s” on them, because the majority of successful hires were male candidates. The tool could only perpetuate their hiring bias based on past data, not correct the behavior. 

How to Train Your Team to Address Bias in Job Postings 

That doesn’t mean your AI tools are incapable of removing bias, though. It just means we have to get more thoughtful about how humans work in this loop with AI to identify bias that already exists. If you want to use AI to remove bias from job postings, it starts with a combination of training your team on those biases and how to use AI and new technology, not just utilizing that tech alone. Your team can’t address a bias they aren’t even aware of, whether it’s demonstrated by the technology they use or within themselves. 

Training should cover two topics. The first session trains your team on how to effectively write a job post that is inclusive. This provides you an opportunity to establish as a group what a good job post looks like, considering things like format, style, tone, and structure. 

Then, seek out training about how unconscious bias shows up in job postings. This should include education on how to address bias during the hiring manager intake.

Often, hiring managers default to biased tactics because that's the only way they have been trained to write a job post. Their bias is often brushed off by junior recruiters as something that can be addressed later, and that's a mistake. Not only will it create chaos when you're making a hiring decision, it now has a greater impact: it trains your data.

To keep your data clean and avoid teaching your AI tools to be biased, it's critical that bias in job postings is addressed as early as possible in the process. That starts with the simple act of being consistent – in both formatting and how you remove bias. Technology platforms like Ongig can help with that consistency by offering customizable templates that control how each post looks and which fields are available to complete, removing the opportunity to add bias or edit templated information.
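The idea of locking down structure can be sketched with a plain string template: writers fill in fixed fields but can't reorder sections or bolt on extra requirements. This is only an illustration of the concept – the field names here are hypothetical, not Ongig's actual schema.

```python
from string import Template

# Hypothetical fixed template: writers can only fill these fields,
# which keeps every posting's structure consistent.
JOB_POST_TEMPLATE = Template(
    "Title: $title\n"
    "What you'll do: $responsibilities\n"
    "What you'll need: $required_skills\n"
    "Pay range: $pay_range"
)

def render_post(fields: dict) -> str:
    # substitute() raises KeyError if a required field is missing,
    # and ignores extra keys -- so no ad-hoc sections sneak in.
    return JOB_POST_TEMPLATE.substitute(fields)

post = render_post({
    "title": "Recruiter",
    "responsibilities": "Run full-cycle searches with hiring managers.",
    "required_skills": "Sourcing, structured interviewing.",
    "pay_range": "$60,000-$75,000",
})
print(post)
```

Because every post comes out of the same template, downstream tools (and any AI trained on your postings) see one consistent format instead of each writer's personal style.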

Common Biases Found in Job Postings and How to Avoid Them

Bias doesn’t just show up in hiring decisions, though; it shows up in the words we choose from the very first day we advertise the role. In job postings, it’s often hiding in plain sight — gendered terms, phrases that imply age, or requirements that favor certain backgrounds over others. These are subtle signals that can discourage great candidates from applying. 

Some of the most common biases you’ll see in job postings include: 

  1. Years of experience. Ageism is often reflected in years of experience. While “one year of experience” may signal that you’re looking for a junior candidate, it also often implies the best candidate should be young. Ageism is also present in buzzwords like “energetic” and “excited.” 
  2. College degrees. Getting a college degree is a privilege. Degrees should not be required for roles where there is no certification or education pipeline needed to be successful. Doctors, lawyers, and accountants learn how to do their jobs at school. Recruiters? Not so much. This unnecessary requirement also contributes to pay inequities of up to 30% between candidates with and without degrees who can do the same role. 
  3. Preferred requirements. The importance of preferred requirements in job postings is not universally understood. While one candidate may read a job posting with a list of preferred requirements and think, “I shouldn’t even bother applying,” another candidate with the same qualifications may feel confident enough to believe they are qualified – and apply anyway. Remove all preferred requirements. 
  4. Buzzwords. The same goes for those buzzwords organizations love to include in job postings. If there’s no universal understanding – which is the case most of the time when it comes to buzzwords – skip this language altogether. Some examples include “fast-paced environment” and “team player,” terms that don’t mean the same thing at different companies.
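As a rough illustration, the patterns above can be caught with a simple keyword scan before a posting ever reaches an AI tool. The term list here is a small hypothetical sample, not an exhaustive lexicon – a real review needs a curated list plus human judgment.

```python
import re

# Hypothetical sample of flag-worthy phrases drawn from the list above;
# a real review would use a much larger, curated lexicon.
BIAS_PATTERNS = {
    "ageism": [r"\benergetic\b", r"\bdigital native\b", r"\byoung\b"],
    "degree requirement": [r"\bbachelor'?s degree required\b"],
    "vague buzzword": [r"\bfast[- ]paced environment\b", r"\bteam player\b"],
}

def scan_for_bias(posting: str) -> list:
    """Return (category, matched phrase) pairs found in a job posting."""
    hits = []
    lowered = posting.lower()
    for category, patterns in BIAS_PATTERNS.items():
        for pattern in patterns:
            match = re.search(pattern, lowered)
            if match:
                hits.append((category, match.group(0)))
    return hits

posting = "We need an energetic team player for our fast-paced environment."
for category, phrase in scan_for_bias(posting):
    print(f"{category}: '{phrase}'")
```

A checker like this is only a first pass – it flags known phrases but can't judge tone or context, which is exactly why the human training described above still matters.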

We can’t leave it all to technology when it comes to removing bias. That’s why training your team to know what a good job posting looks like and to recognize these common biases is so important. Starting with content that accurately reflects the role without all the cliches and biases will only help the AI reach its maximum potential when it comes to helping you remove bias from job postings. 

Tips on Writing AI Prompts for Inclusive Job Postings 

Now that your team is trained to recognize bias and can create a first draft of a job posting that accurately reflects the role and what you’re looking for, use AI prompts to remove bias and improve the quality of your writing. 

Remember, the quality of an AI tool’s output is only as good as the input it receives. That’s why writing clear, inclusive prompts is critical when you’re using AI to remove bias from job postings.

Focus on the change you want the AI to make while being explicit about what to avoid in your prompts. For instance, instead of saying something like “edit this job post to make it more inclusive,” you might say something like “review this job post for gendered language, exclusionary terms, and industry jargon. Rewrite it to focus on skills and qualifications while using gender-neutral and accessible language” instead.

The first prompt is too broad, leaving the AI to interpret what “inclusive” means, which can lead to uneven or ineffective results. The second prompt clearly outlines what the AI should look for and how it should adjust the content, ensuring a more focused and impactful revision.

Good vs. Bad AI Prompts: Tips for Writing Bias-Free Job Descriptions

Here are some examples of bad vs. good AI prompts for removing bias from a job post:

  1. Bad Prompt: Edit this job posting for bias.
    Good Prompt: Identify and remove biased language in this job posting, such as gendered phrases, stereotypes, or age-specific terms. Rewrite the text to emphasize skills and experience in a way that appeals to a diverse audience.
  2. Bad Prompt: Make this more inclusive for diverse candidates.
    Good Prompt: Rewrite this job post to welcome candidates from nontraditional backgrounds, emphasizing transferable skills and avoiding unnecessary degree or years of experience requirements.
  3. Bad Prompt: Improve this job post for clarity and fairness.
    Good Prompt: Review this job post for phrases that may exclude certain groups, such as “native English speaker” or “young team.” Adjust the language to focus on the role’s responsibilities and required skills without unnecessary qualifiers.
  4. Bad Prompt: Remove bias.
    Good Prompt: Identify every instance of gendered, age-related, or otherwise exclusionary language in this posting, quoting the exact phrase and explaining why it might deter candidates from applying.
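One lightweight way to keep prompts this specific and repeatable is to assemble them from a shared checklist instead of free-typing them each time. A minimal sketch, assuming a plain text-in/text-out interface to whatever AI tool your team uses – the checks below are examples, not an official list:

```python
# Hypothetical checklist-driven prompt builder; the resulting string
# would be pasted into (or sent to) whichever AI tool your team uses.
BIAS_CHECKS = [
    "gendered phrases or stereotypes",
    "age-specific terms such as 'energetic' or 'digital native'",
    "unnecessary degree or years-of-experience requirements",
    "vague buzzwords like 'fast-paced environment' or 'team player'",
]

def build_bias_review_prompt(job_posting: str) -> str:
    # Turn the checklist into a bulleted list inside the prompt.
    checks = "\n".join(f"- {check}" for check in BIAS_CHECKS)
    return (
        "Review the job posting below for the following issues:\n"
        f"{checks}\n"
        "Rewrite it to focus on skills and responsibilities, using "
        "gender-neutral, accessible language. Quote each phrase you "
        "change and explain why.\n\n"
        f"Job posting:\n{job_posting}"
    )

print(build_bias_review_prompt("Energetic rockstar wanted for our fast-paced team!"))
```

Because the checklist lives in one place, every recruiter on the team sends the same specific instructions, and updating the checks updates everyone's prompts at once.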

When working with AI, specificity in your prompts will help you improve the inclusivity of your post. Using clear, actionable prompts allows you to remove bias and create content that truly aligns with your hiring and inclusion goals.

Getting Buy-In for Bias-Free Job Postings

Even after you’ve done everything to remove bias from job postings, people might still push back. Remember, most hiring managers have never seen job postings without bias before. They might be concerned or confused about the requirements you put out there. You might get questions like, “Are they specific enough? Will they attract the right people?”

Get buy-in for your new process by including individuals who have a stake in the hiring process and a strong understanding of inclusivity in pilot programs as you make the transition from the old way of doing things to this method of removing bias. HR professionals, diversity and inclusion leaders, and hiring managers are great candidates for this role. Their different perspectives help identify biases the AI might miss and ensure the language resonates with diverse audiences, too. 

When introducing people to this process, you’ll also need to teach them how to provide constructive feedback. Vague comments like “this feels off” or “make this sound better” don’t help refine any written outputs – human or machine. The more specific the feedback, the easier it becomes to not only teach the AI but also improve future outputs. It also means we’re not unintentionally reincorporating bias by including suggestions from the team. 

The Human-AI Partnership for Better Hiring Practices

AI is not here to replace the human element – it’s here to enhance it. While specific AI prompts can save time by quickly generating job postings and flagging obvious biases, AI can’t replace the thoughtfulness and empathy that people bring to the hiring process. We need humans to lead this technology revolution. 

Training teams on how to provide AI with better content and prompts to remove bias from your job postings is where to start. Using AI to remove bias from job postings is not about just writing content faster. It’s about using technology to create a more thoughtful and inclusive hiring process from start to finish. Bonus: it’s more efficient, too. 

As you implement these practices, remember that small changes can lead to big results. Every prompt you write, every feedback loop you create, and every human review you complete moves your organization closer to a hiring process that is more equitable and effective. By bringing together AI and humans to remove bias, we’re not just improving job postings. We’re setting the stage for a future where hiring is fairer, more inclusive, and more successful for everyone – and it can all start with a prompt. 

Why I Wrote This

Bias in hiring starts with job postings, and AI alone won’t fix it. We want to show how thoughtful prompts and tools like Ongig can help you create truly inclusive posts. Want to see how it works? Request a demo today.
