When AI Selects Candidates, the CNIL Draws the Line
Faced with an ever-growing number of applications for the same position, recruiters are increasingly turning to artificial intelligence. Automated selection of the most relevant applications is one of the promises of this technology. In this context, to what extent can these professionals delegate their responsibilities?
Under what conditions can selection algorithms be used?
While the number of HR recruitment solutions grows every year, not all software includes selection algorithms. To determine whether an AI is making the decision to eliminate a candidate, we must first agree on what a fully automated decision is: one produced by algorithmic processing with no human intervention in the process, to the point where the data controller cannot justify the reasons behind the proposed decision.
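The criterion above combines two conditions: no human in the loop, and no explanation the data controller can provide. A minimal sketch, with entirely hypothetical names, might express that test as:

```python
from dataclasses import dataclass

@dataclass
class ScreeningDecision:
    """Hypothetical record of how one application was screened."""
    human_reviewed: bool   # did a person review the outcome?
    reasons: list          # explanations the data controller can provide

def is_fully_automated(decision: ScreeningDecision) -> bool:
    # Fully automated in the sense described above: no human intervention
    # AND no justifiable reasons for the proposed decision.
    return not decision.human_reviewed and not decision.reasons
```

Under this reading, a rejection reviewed by a recruiter, or one the controller can explain, would fall outside the "fully automated" category.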
When a company uses software whose AI sorts applications, it must justify that use to the CNIL (National Commission on Informatics and Liberty) on the grounds of an exceptionally high number of applications for a single position. To do so, the company must provide the authority with a Data Protection Impact Assessment (DPIA). According to the CNIL guide on data protection in recruitment, software publishers are obliged to clearly explain how their software works and what its limitations are, leaving recruiters free to verify its relevance.

It should be noted that personal data is collected during the first phase of selection. While searching for candidates, the company is prohibited from requesting financial data (bank details), identity documents (ID card, residence permit, passport), or the health insurance card. Only after a job offer has been made may the employer request this information to formalize the contract.
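These restrictions amount to a per-phase allowlist: some data may only be requested once an offer has been made. A simple sketch of that rule, with field names chosen for illustration only:

```python
# Which personal data may be requested at each recruitment phase,
# following the restrictions described above. Field and phase names
# are assumptions for this sketch, not CNIL terminology.
ALLOWED_BY_PHASE = {
    "selection": {"cv", "cover_letter", "diplomas"},
    "post_offer": {"cv", "cover_letter", "diplomas",
                   "bank_details", "id_document", "health_insurance_card"},
}

def may_request(phase: str, data_field: str) -> bool:
    """Return True if the given data may be requested at this phase."""
    return data_field in ALLOWED_BY_PHASE.get(phase, set())
```

For example, `may_request("selection", "bank_details")` would be refused, while the same request after the offer would be allowed.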
Who is responsible for my data?
In matters of responsibility, the CNIL designates the data controller as the organization or person that defines the purposes and means of the processing. If a recruitment agency merely follows the specifications of its client companies, it cannot be held responsible for the data processing. If, however, it participates in defining the process (for example by sorting applications or implementing an evaluation system), the agency is responsible. Once the agency transfers personal data to the company so that it can continue the process, the client company becomes responsible.
As for the sorting, evaluation, and selection software itself, responsibility falls on recruiters. According to the CNIL, recruiters must ensure that:
- the data fed into the management system is relevant and allows the algorithm to be trained ethically,
- pre-selection choices are not based on discriminatory criteria such as age, gender, or skin color,
- evaluation tools, such as those assessing micro-expressions or the candidate's personality, are scientifically validated.
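The second point, at least in part, lends itself to an automated pre-deployment check: verifying that the features a scoring model consumes include none of the protected attributes listed above. A minimal sketch, with attribute and function names that are illustrative rather than drawn from any real HR product:

```python
# Protected attributes drawn from the discriminatory criteria and
# pseudo-scientific methods mentioned in the CNIL guidance; the exact
# list here is an illustrative assumption.
PROTECTED_ATTRIBUTES = {
    "age", "gender", "skin_color", "date_of_birth", "astrological_sign",
}

def audit_feature_set(features):
    """Return, sorted, any protected attributes present among the
    model's input features (case-insensitive match)."""
    return sorted(PROTECTED_ATTRIBUTES & {f.lower() for f in features})

violations = audit_feature_set(["experience_years", "diploma", "age"])
if violations:
    print(f"Discriminatory criteria detected: {violations}")
```

Such a check only catches direct use of a protected attribute; proxy variables that correlate with one would still require a human audit.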
Techniques claiming to deduce a candidate's qualities from facial features (morphopsychology), astrological sign, or date of birth (numerology), and other methods whose scientific validity is contested in academia, cannot be considered to have a direct and necessary link with the purpose of recruitment.
CNIL, Recruitment Guide: Fundamentals on the Protection of Personal Data (page 61)
Protecting personal data to prevent technological abuse
Given that artificial intelligence needs to be fed a substantial pool of applications to be effective (and to genuinely select talent), it should not be used for positions that attract few candidates. In this respect, recruiters' responsibility may come into question, as their profession now requires fairly strong computer skills. Indeed, understanding how algorithms and AI work should gradually become part of their university curriculum or continuing education.
Such training is now essential, especially since false positives in pre-selection can hurt candidates' chances: the program may present a candidate who, according to it, meets the recruiter's expectations... except that they don't.
[Cover Photo: Rombo]