5 Ways HR Tech Supports Your Diversity Recruiting Strategy

Is bias in recruiting still a problem in today’s workforce? Well, yes and no. If you’ve been paying attention to recruiting trends over the past decade or so, you know that diversity is a key consideration for any hiring strategy. Cultivating a diverse workforce from top to bottom delivers higher financial returns over time and supports growth and innovation.

Beyond that, pursuing diversity in recruiting and hiring is just the right thing to do. 

The problem is that we don’t always know what we don’t know. In other words, unintentional bias can creep into even the most conscientious of companies. It can happen because our networks tend to consist of people like us, and we don’t always recognize that. It can happen because of the wording used in job postings, or because we “go with our gut” in an interview without truly understanding why. 

The good news, though, is that technology can help us overcome unconscious bias and create a stronger diversity recruiting strategy based on data and merit. 

5 Ways to Improve Your Diversity Recruiting Strategy With HR Tech

At last year’s HR Tech conference, Ryan Browning of consulting firm Mercer and Richard Lopez of Dell Technologies talked about how design thinking can help you implement technology solutions that address unconscious bias. It’s important to start from the ground up so that you aren’t just applying technology band-aids to deep cultural problems. But if you start from a problem-solving perspective, technology tools can be essential resources for eliminating bias in the recruiting and hiring process.

Here are five ways to get started.

  1. AI for Job Descriptions – Job descriptions are often a candidate’s first contact with your company, so they must speak to all qualified candidates equally. For example, using gendered pronouns to describe an applicant (he/him) can discourage female candidates from applying. Beyond specific word choices, the structure of your description may also discourage some candidates from applying. Studies show, for example, that listing too many requirements can turn excellent female candidates away: women tend to apply only when they meet 100% of the requirements, while men will apply if they meet about 60%.

    New AI tools like Textio help you optimize your job descriptions based on research. Textio evaluates your description and recommends changes to wording and structure so job posts appeal to all candidates, including women, minorities, and older individuals.

  2. Expanded Talent Pool – Limiting your talent pool to candidates in your immediate network can also create diversity challenges. It happens because networks tend to consist of people like us, with similar backgrounds and experiences. Technology can help you reach more diverse candidates by extending your database. For example, Blendoor uses crowdsourcing, strategic partners, and talent events to connect companies with candidates through their jobs app. 

  3. Blind Resume Screening – Companies routinely use tools embedded in applicant tracking systems (ATS) to do an initial resume screen. HR tech vendors often include this capability in their software, saving HR managers the time and effort of reading each resume personally. Ascentis, for example, offers resume parsing through their career portal. This feature helps hiring managers evaluate resumes based on how well they meet job requirements.

    New tools like TalVista take this even further by redacting identifying information from resumes. Once managers are ready to narrow down candidate lists and schedule interviews, they can evaluate each candidate based on credentials and skills only. This prevents unconscious bias from creeping in based on a name, educational institution, or photo. 

  4. Blind Pre-Hire Assessments – Once they have selected a short list of candidates, many companies will ask them to complete a skills assessment to evaluate specific abilities. These skills assessments can introduce another layer of bias, as demonstrated by the “Orchestra Study.” In the study, female musicians were 50% more likely to advance out of preliminary rounds when blind auditions were used. This happened even though evaluators believed they were assessing each musician based on talent alone.

    GapJumpers has taken this knowledge and applied it to the pre-hire assessment. Using their technology, companies can eliminate bias by creating anonymous assessments to test candidate skills. 

  5. Predictive Analytics for Candidate Selection – Predictive analytics helps managers assess how a candidate will perform in the role over time. Will he or she be a good cultural fit? What can you expect in terms of retention? Which candidates are likely to be most successful? Rather than hiring based on gut feelings about these questions, managers can use predictive software to get real answers based on data. 

    Ultimate Software, for example, uses both predictive analytics (what is the most likely outcome?) and prescriptive analytics (what should I do to achieve the best outcome?) to hire top candidates and support them after they join the team.
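To make the blind-screening idea from step 3 concrete, here is a minimal sketch of what a redaction step might look like. This is purely illustrative and not how TalVista or any vendor actually works; the function name, field patterns, and sample resume are all hypothetical:

```python
import re

def redact_resume(text, candidate_name):
    """Strip obvious identifying fields (name, email, phone) from resume text
    so a reviewer sees only credentials and skills."""
    redacted = text.replace(candidate_name, "[CANDIDATE]")
    # Replace anything that looks like an email address
    redacted = re.sub(r"[\w.+-]+@[\w-]+\.[\w.]+", "[EMAIL]", redacted)
    # Replace common US-style phone number formats
    redacted = re.sub(r"\(?\d{3}\)?[-.\s]\d{3}[-.\s]\d{4}", "[PHONE]", redacted)
    return redacted

resume = "Jane Doe\njane.doe@example.com\n(555) 123-4567\n10 years in sales."
print(redact_resume(resume, "Jane Doe"))
```

Real redaction tools go much further (photos, school names, graduation dates), but the principle is the same: remove the cues that trigger unconscious bias before a human reads the resume.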

Can Technology Really Eliminate Bias From the Hiring Process?

Unfortunately, technology can’t eliminate all bias everywhere. There is still an essential human element in the hiring process. That human touch is vital for making final decisions, promoting candidate engagement, and creating a positive candidate experience. While this interaction is important and necessary, it also introduces the possibility of unconscious bias along the way. Humans may also inadvertently introduce bias to technology at the coding level or during the data collection process. If the data itself is not inclusive, then the technology will produce skewed results.

That’s why you also need to intentionally develop a culture of diversity and inclusion in your workplace, starting with training. Google Vice President Dmitri Krakovsky addressed this issue at the 2019 HR Tech Conference. In his session, he stressed the importance of collecting diverse, representative data so that AI algorithms will not reflect bias in their outcomes. 

Technology is not perfect. Still, it’s a step in the right direction. If we can create AI that “learns” based on unbiased data samples, said Krakovsky, we will “have the potential to be transformational in promoting inclusivity and diversity in recruitment.” 

And that’s the ultimate goal.
