
Don't delegate your recruitment to an AI (if you want to do it right).

What if I told you that relying on AI for hiring can be extremely problematic?

By:
Sergio Nouvel

Imagine software that takes 100 candidates for your job opening and, in a matter of minutes, returns only 5, carefully selected for their skills, experience, and ideal fit for the position. It's the dream, right?

The idea of efficiency and accuracy in candidate selection - a well-oiled machine - sounds like the holy grail of every recruiting team, especially in a world where time and resources are limited. The promise of streamlining the hiring process and reducing the workload through Artificial Intelligence is certainly tantalizing.

But what if I told you that relying on AI for hiring can be extremely problematic, even - and especially - when it seems to work perfectly?

Bringing people into a team is an inherently human process. Relying on an algorithm to make such crucial decisions on behalf of your company can have undesirable consequences on the quality of talent and your employer brand.

Let's see why.


Problem 1: The match is often superficial and doesn't measure what matters.

Many AI systems rely on poor-quality data, such as CVs or LinkedIn profiles, which are often not good indicators of talent, experience, or performance. They won't tell you how a person impacted business objectives, dealt with uncertainty, communicated, or reacted to feedback when working in a team.

Even assuming you could actually capture someone's experience, talent, and values, you'd still only have half of the equation. A person's potential matters as much as, or more than, accumulated experience in a good hiring decision. And potential, precisely because it hasn't materialized yet, is something you'll never find in a LinkedIn profile.

Many of the best hiring decisions I've made have been counter-intuitive: people with no relevant experience on their CV, but with potential, who, once placed in a good team and work culture, went on to excel.

If a CV or LinkedIn tells you so little, why would you use them as your first filter?

Other AI solutions employ tests, games, or analysis of voice and facial expressions to make up for these shortcomings. The problem is that none of them can capture passion, rapport, or value alignment, which tend to be much more powerful indicators of retention and are not correlated with personality or behavioral factors.

You can see this for yourself: your mood, your disposition and even your personality can change radically depending on who surrounds you. Around the right people, you shine. With the wrong people, you sink. Evaluating people through games, or having them express themselves in artificial social contexts, will give you no insight into how that person will behave on your team.


Problem 2: Perpetuating existing biases

On top of that, many AI tools will only bring you clones of past hires, selected with the same biases as the originals.

Just as an example, a 2018 study by researchers at MIT and Stanford University showed that AI algorithms, even those developed by large companies such as Amazon, exhibited gender bias when screening job candidates. Other, more recent studies point in the same direction: AI replicates, amplifies, and perpetuates the distortions of the organizations that use it to recruit.

It makes sense: these algorithms were trained on the past, and in the past, business culture in both the U.S. and Latin America has not been characterized by much diversity.
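
To make that mechanism concrete, here is a minimal Python sketch (the data, the features, and the model are synthetic assumptions for illustration, not a description of any vendor's real system): train a screener on past hiring decisions that were partly biased, and the bias comes back out as a learned weight.

```python
# Illustrative sketch only: synthetic data and a hypothetical "proxy" feature.
# It shows how a model trained on past hiring decisions inherits their bias.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 2000

skill = rng.normal(size=n)            # what we would actually like to measure
proxy = rng.integers(0, 2, size=n)    # e.g. a CV keyword that correlates with gender
# Past decisions: partly skill, partly plain bias against the proxy group.
past_hired = (0.8 * skill - 1.2 * proxy + rng.normal(scale=0.5, size=n)) > 0

X = np.column_stack([skill, proxy])
model = LogisticRegression().fit(X, past_hired)

print("learned weights [skill, proxy]:", model.coef_[0])
# The proxy gets a strong negative weight: the "efficient" screener
# simply reproduces yesterday's bias at scale, for every future opening.
```

The toy numbers aren't the point; the point is that a model like this can only ever ratify the decisions it was shown.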

And why would you want clones of your successful 2018 or 2020 talent anyway? The economy, the market, and your users are all changing. If your company is going through a transformation, looking for photocopies of what worked 2, 3, or 5 years ago doesn't sound like the best idea.


Problem 3: You don't know who you are leaving out.

Let's assume that none of the above happens, and that you have a recruitment system that - through some kind of magic and clairvoyance - offers you a shortlist of diverse, talented candidates with an absolute guarantee of loyalty and retention.

Even under these assumptions, you still have a big problem: you have no idea who you ruled out.

How do you know there wasn't someone even better among the discarded? Even if the recommendations are decent, how would you know they were the best, and not just a coin flip? Or that the system doesn't recommend the same shortlist of three to everyone?

Recruiting with AI, and in general with any solution that only presents you with a shortlist of candidates, induces a false sense of calm and of "doing it right", especially when the candidates it suggests seem to be a good match.

What would you say if, after hiring an applicant recommended by the AI, you found out there was another person far more qualified for the position, and willing to work for the salary offered, who never got the opportunity to participate because you had already closed the process? What do you think that person would think of your selection process?

When you recruit through intermediaries or closed systems (including artificial intelligence), you don't even get the chance to find out. And you lose the chance to capture candidates for other positions, or to build long-term bonds with someone who may be ready to work with you in the future.



Problem 4: You can't explain your hiring decisions.

Even with a hypothetical system that reads your mind, makes ideal hiring decisions and shows you who it discarded, you'll run into the following problem: How do you know you ran an open, transparent and fair process if you have no idea why the algorithm selected the shortlist of candidates you received?

The problem of explainability in Artificial Intelligence is serious, and most vendors have incentives to keep their algorithms as black boxes. In most cases, as I mentioned above, the algorithms simply look for correlations with "ideal" candidates from the past. Many of these correlations are spurious or superficial (recall Problem 1), and the algorithm will not be able to explain them at all.
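
As a hedged illustration of how superficial those correlations can be (the resume keywords, labels, and model below are entirely made up for the example), even a toy screener trained on a handful of past "ideal" hires will happily reward an irrelevant hobby keyword:

```python
# Illustrative sketch only: hypothetical resume keywords and synthetic labels.
# A screener trained on past "ideal" hires can latch onto spurious signals.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

resumes = [
    "python data pipelines lacrosse",
    "sql dashboards lacrosse",
    "python statistics sql",
    "excel reporting",
    "sql python lacrosse",
    "customer support excel",
]
hired = [1, 1, 0, 0, 1, 0]   # who happened to be hired before

vec = CountVectorizer()
X = vec.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# Rank keywords by learned weight: "lacrosse" floats to the top,
# a correlation that says nothing about future performance.
weights = sorted(zip(vec.get_feature_names_out(), model.coef_[0]),
                 key=lambda kv: kv[1], reverse=True)
for word, w in weights:
    print(f"{word:15s} {w:+.2f}")
```

None of those weights come with a reason a recruiter could defend in front of a rejected candidate.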

Hiring involves accountability, and if you can't justify why you ruled out certain candidates, you can't learn from your good or bad decisions, thus depriving yourself of the opportunity to refine your search criteria. "The machine did it," or "this is what the recruiting agency brought me" are not acceptable explanations.


Problem 5: It's dehumanizing (and undermines your employer brand).

Let's go back to fantasizing, and suppose you manage to find an effective AI that also explains exactly why it left out all the other candidates. Even in that case, you'll get an impersonal hiring process and a robotic candidate experience.

People who are turned away will walk away with a cold image of your company and the feeling that they weren't given a fair chance; and even those who are hired will want to feel genuinely valued for their skills and experience, not for matching the expectations of an algorithm. What does it say about a company that it doesn't want to deal with 95% of the candidates who apply, leaving that task to a robot? What does it suggest about its culture and its priorities?

Think about the times you've had a complex issue with a purchase or with a service you requested through an app and, wanting to talk to a human at the company, you were funneled instead through a litany of chatbots trying to deflect you and dissuade you from reaching a real person. Was it a good experience, or did it feel like the uncanny valley?

Why would you think that such a mechanized experience is a good idea for recruiting the person who will later be working with you?


Photo by Waldemar on Unsplash

These problems stem from underestimating the complexity and consequences of hiring decisions. Starting with overlooking the fact that...


Hiring is a strategic decision; don't underestimate it.

Let's go back to the beginning of this article:

Imagine software that takes 100 candidates for your job opening and, in a matter of minutes, returns only 5, carefully selected for their skills, experience, and ideal fit for the position. It's the dream, right?

Wait a moment: why would that be your dream? Why, of all the things in your organization that could be made more efficient, is it precisely the interaction with potential talent that you want to get rid of?

Imagine that your country's national soccer team has a new coach who, as soon as they take office, decides to delegate the roster of selected players to an AI to "save time and be more efficient". What would you think of their judgment (and their salary)?

Hiring is not about selecting parts for a machine, but about integrating humans into a team, and that decision is probably the most important one your organization must make. These humans are not a collection of jobs and skills; they have priorities, values, culture, issues, motivations, beliefs, dreams, fears and ways of functioning. Moreover, they are not just a filtered list of first and last names; they are people who must get along with each other.

Strategic decisions are inherently human, and no decision is more strategic (and therefore more human) than the decision to hire.


Humanize Human Resources

Assess values and mission alignment through genuine conversations with applicants. Invest time in getting to know people. Run a recruitment process that is transparent, rational, and that you can explain to anyone; that discipline will help you keep bias out.

Many times, dehumanizing recruiting solutions are implemented because HR leadership set unrealistic hiring goals based on quantity rather than quality. Don't turn your recruiting team into a human processor; avoid assigning them volume targets or KPIs that aren't counterbalanced with candidate-experience and depth-of-knowledge metrics.

AI can help you process and understand candidates, but it should not make the decisions for you. You need to have all the information, and all the candidates, in front of you. Rule people out based on multiple data points, and use rational processes and structured interviews to make informed and fair decisions.
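
For instance, a structured interview doesn't require sophisticated tooling. Here is a minimal sketch (the criteria, names, and 1-5 scale are illustrative assumptions, not a prescribed format) of scoring every candidate against the same explicit criteria so each decision can be explained afterwards:

```python
# Minimal structured-interview scorecard (illustrative format, not a prescribed tool):
# every candidate is scored on the same explicit criteria, so decisions are explainable.
from dataclasses import dataclass
from statistics import mean

CRITERIA = ["problem solving", "communication", "values alignment", "potential"]

@dataclass
class Scorecard:
    candidate: str
    interviewer: str
    scores: dict[str, int]   # 1-5 per criterion, backed by written evidence
    notes: str = ""

    def average(self) -> float:
        return mean(self.scores[c] for c in CRITERIA)

cards = [
    Scorecard("Ana", "interviewer_1",
              {"problem solving": 4, "communication": 5,
               "values alignment": 4, "potential": 5}),
    Scorecard("Ana", "interviewer_2",
              {"problem solving": 4, "communication": 4,
               "values alignment": 5, "potential": 4}),
]

# The aggregate number matters less than the fact that every interviewer
# evaluated the same things and can justify each score afterwards.
overall = mean(card.average() for card in cards)
print(f"Ana: {overall:.2f} across {len(cards)} structured interviews")
```

Whatever format you use, the explanation for a decision then lives in the scorecards, not in a black box.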

Where AI can support more humane recruitment

Don't use Artificial Intelligence to discard or filter; use it to broaden and enrich.

Well applied and well dosed, Artificial Intelligence offers many opportunities to enrich selection processes and make them more human instead of mechanizing them. Here are some ideas we see on the horizon:

  • Data enrichment and structuring: use Artificial Intelligence to know more, not less, about your candidates. Detect keywords and technology, consolidate information from various sources, clean and format information. Leverage AI to create an applicant dossier, highlight data about candidates to get to know them better, and suggest better interview questions.
  • 🔎 Segmentation and search: use AI to organize and filter your candidates according to keywords, perceived experience, etc. Also for intelligent searches, such as "show me bootcamp graduates first".
  • 👀 "Soft" prioritization: instead of pretending that an AI will give you 3 candidates and you will never know about the rest, use it to understand which applicants to look at first. But look at them all (see the sketch after this list).
  • 💎 Undetected potential and unobvious connections: use AI to identify traits you may be overlooking, or connect dots (perhaps there is something in common between a Data Science course and having a hobby of music production).
  • ⚙️ Better team management: AI can deliver quality metrics on the recruiting team, suggest improvements in the communication tone or sourcing strategy of recruiters, or generate intelligent alerts to avoid losing valuable candidates.
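
To show what "soft" prioritization might look like in practice, here is a small sketch (the scoring function is a stand-in; in reality the score would come from whatever model you use): the score only sorts the list, it never truncates it.

```python
# "Soft" prioritization sketch: an assumed relevance score orders candidates,
# but nobody is discarded -- the recruiter still sees the full list.
from typing import Callable

def prioritize(candidates: list[dict], score: Callable[[dict], float]) -> list[dict]:
    """Sort candidates by a model score without dropping anyone."""
    return sorted(candidates, key=score, reverse=True)

# Placeholder scoring function (in practice this would come from your model).
def keyword_score(candidate: dict) -> float:
    wanted = {"python", "sql", "communication"}
    return float(len(wanted & set(candidate["keywords"])))

applicants = [
    {"name": "Luisa", "keywords": ["python", "sql"]},
    {"name": "Marco", "keywords": ["excel"]},
    {"name": "Dana",  "keywords": ["python", "communication", "sql"]},
]

for rank, person in enumerate(prioritize(applicants, keyword_score), start=1):
    print(rank, person["name"])   # everyone stays visible, just in a suggested order
```

The design choice is the point: ranking is advice, filtering is a decision, and the decision stays with a human.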

In short: don't use Artificial Intelligence to screen out or filter; use it to broaden and enrich. Don't use it to reduce the number of applicants you look at; use it to discover new talent. Don't use it to avoid human contact; use it to encourage it.

Hiring is a strategic and human decision that requires careful consideration and evaluation of candidates, something that should never be completely replaced by artificial intelligence.

And I say should, not could, because as leaders and creators of culture in our organizations, the line between what can be done with technology and what is right, fair, and desirable to do with it must always be clear. At the end of the day, we work with, by, and for humans.

Sergio Nouvel is Co-Founder and CEO of Get on Board.

This article originally appeared on the Get On Board blog.

Edited by Raquel Rojas
