
The number one conversation in recruiting is the effect of AI on the recruitment process. As the CEO of a management recruitment firm, I am approached every few business days by an outside AI firm offering a tool or resource that promises to dramatically support our recruiting effort and improve my firm's productivity. The AI phenomenon has been building for the past few years but has accelerated rapidly this past year.

We have implemented some AI in our process; we would be foolish not to. But it is hardly the miracle cure it has been touted to be. We are all fascinated by the idea that AI will change things, and in time, if used correctly, it will. The potential for human resources departments and recruiting agencies to increase recruitment efficiency is not a pipe dream. Still, it doesn't necessarily deliver exponential improvement, and it is not without implementation challenges. No question, there are lots of manual steps in the recruitment process that could be automated, but does that improve the quality of the hire? Does it improve the candidate experience? Is it proven enough to embrace wholeheartedly?

While there are plenty of pros to using AI in recruitment, here are some issues to consider seriously before implementing it in your process:

Erosion of the candidate experience: Do candidates want their first conversation about one of the most important things in their life, their career, to be with a chatbot? Chatbots are fine for starting a conversation in the recruitment process, but they can't answer more challenging questions, and if you've ever used one when you had a real need to connect with a person, you know how frustrating the experience can be.

Automated bias in the process: Does AI inadvertently introduce bias into the process?  

If not kept in check, hiring algorithms can negatively affect an organization's hiring practices and cause inclusion problems on a grander scale than the problems they were attempting to solve.

For example, Amazon stopped using a hiring algorithm after finding it favored male applicants: the model rewarded words like 'executed' and 'captured' that appeared more often on men's resumes. There are plenty of other examples, and one wonders how many well-qualified women or minorities have missed out on opportunities because AI rejected their applications based on baked-in bias. There is bias in the human process as well, but there it is usually limited to individuals. When discrimination is built into an AI selection process, whether intentional or not, it can affect an organization on a massive scale.

Overlooking soft skills and culture fit: Candidates may match keywords well, but that doesn't mean they have the right soft skills or are even close to an excellent cultural fit for the organization. Teaching a machine to assess soft skills is difficult but not impossible, and even the best AI screening systems are not yet able to evaluate them well. Determining cultural fit presents the same challenges.

This all goes back to the common-sense adage: garbage in, garbage out. Who is developing the AI model? What are their ethics and biases, and how are they kept separate from the code? Were their preferences inadvertently inserted into the process? Unless the model is monitored and tested by people who understand employment law, those biases can be scaled up across larger organizations that use AI to make hiring decisions.

Fairness in the hiring process 

There is also the issue of 'fairness,' which is a matter of interpretation. Who decides what's fair and builds it into the algorithm? There is a whole field growing up around ethics in AI, and CEOs need to understand it before they drive deep implementation of AI in their organizations.

What happens to all the data collected? 

Also, since AI is so new to recruitment, the legal framework around the data large organizations collect on individuals is lagging, and this could create serious repercussions for organizations that don't protect the data they collect with integrity.

AI is in its infancy in recruitment but is here to stay 

AI use in recruitment isn't going away, and it adds real productivity value, particularly for larger organizations that receive a massive number of applications for a single position. AI can speed up an otherwise overwhelming screening process and save hours, if not days, of work.

AI should be used to inform hiring decisions, not make them. While I believe in using technology to improve productivity, AI has a long way to go and should be applied cautiously as an aid to the process. It should not replace human judgment; it should be one of several inputs when choosing the best candidates for your organization.
