“Hi Siri, find a candidate.” More and more companies are using software to find and screen employees. That sounds convenient, but there is also a downside, says women’s rights organization WOMEN Inc., which recently conducted research on the subject. Several experts discussed the drawbacks this week.
“The algorithms behind many tools and applications are human creations,” said Cécile Wansink, spokeswoman for WOMEN Inc. “And that is why they can absorb existing prejudices.” As a result, women may have less chance of getting a job, for example.
Discriminatory algorithms? The problem is not new: in recent years, for example, face recognition software has been criticized for failing to identify ethnic minorities and women correctly. Amazon has also come under fire because the company’s recruiting tool favored men.
These men traditionally dominated the company, and that was reflected in the software and algorithms. “An algorithm in itself is not biased,” says Remy Gieling, owner of the AI platform Ai.nl and author of ‘Discover AI’s Growth Opportunities.’ It all comes down to the data that feeds it.
‘Raise a child’
Through artificial intelligence, software and machines can independently solve problems and, for example, select specific people. But they do it based on data that people enter. “It’s a bit like raising a child,” Gieling says. “The system learns from the examples you give it.”
For example, Amazon’s systems learned from the resumes of the white men who already worked for the company, and that data became the basis for selecting new staff. Amazon has since adapted its algorithms, but many other companies are not aware of the limitations of their data, Gieling says, and should be more conscious of them.
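Gieling’s point about data can be illustrated with a minimal, hypothetical sketch: a toy “model” fit on skewed historical hiring decisions simply reproduces that skew. All numbers and groups below are invented for illustration; this is not Amazon’s actual system.

```python
# Toy illustration: a "model" trained on skewed historical hiring
# data reproduces the skew. All data here is invented.

# Historical decisions as (group, hired) pairs -- mostly "m" were hired.
history = [("m", True)] * 80 + [("m", False)] * 20 \
        + [("f", True)] * 5 + [("f", False)] * 15

def hire_rate(records, group):
    # Fraction of past candidates in this group who were hired.
    outcomes = [hired for g, hired in records if g == group]
    return sum(outcomes) / len(outcomes)

def recommend(group):
    # A naive "learned" rule: recommend a candidate if their group's
    # historical hire rate exceeds 50%.
    return hire_rate(history, group) > 0.5

print(hire_rate(history, "m"))  # 0.8
print(hire_rate(history, "f"))  # 0.25
print(recommend("m"), recommend("f"))  # True False
```

The rule never looks at anyone’s actual qualifications: it only echoes past decisions, which is exactly how historical bias ends up in a seemingly neutral system.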
‘Much goes wrong’
According to privacy lawyer and ICT professor Lokke Moerel, things go wrong in many places, although she would rather not name names. “Today there is a tool for every step of an application process,” she says, such as tools that make job postings more inclusive, or job boards that try to put ads in front of the right person. “Each tool has its own pitfalls.”
A job site’s algorithm, for example, is often aimed at generating as many clicks as possible; that is how such websites make their money. As a result, an ad for a taxi driver is shown mainly to men, and stereotypes can be magnified. Tools that screen candidates during interviews are trained on language and facial expressions, but these differ between population groups. “That is why they can simply get it wrong.”
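The click-chasing effect Moerel describes can be sketched as a simple feedback loop. The click-through rates below are invented: if one group clicks a stereotyped ad slightly more often, an allocator that reassigns impressions toward whatever got clicked ends up showing the ad almost exclusively to that group.

```python
# Toy feedback loop (invented numbers): an ad allocator that chases
# clicks ends up showing a stereotyped ad mostly to one group.

click_rate = {"men": 0.05, "women": 0.02}  # assumed click-through rates
share = {"men": 0.5, "women": 0.5}         # start with an even ad split

for _ in range(10):
    # Expected clicks per group this round, given the current split.
    clicks = {g: share[g] * click_rate[g] for g in share}
    total = sum(clicks.values())
    # Reallocate next round's impressions proportionally to clicks.
    share = {g: clicks[g] / total for g in share}

print(share["men"])  # close to 1.0 after ten rounds
```

A modest initial difference in click behavior is compounded every round, which is the “magnified stereotype” the article refers to.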
The problem, Moerel says, is that those tools are not transparent about how they work, and therefore cannot be audited. “While the chance that they contain prejudices is high.”
‘Also good tools’
There are also good tools, says the professor, such as a Randstad chatbot that asks candidates about their competencies. “People from different backgrounds sometimes describe their skills differently.” Those descriptions then make it easier to find a genuine match in the database.
According to Moerel, much recruitment software is still in its infancy, and it is difficult to remove all prejudices. “My assessment is that many of the tools do not meet the strict requirements of European privacy law (the GDPR).” For example, because the software automatically rejects people, which is not simply allowed.
“I do not think it is a good thing in a time of labor market shortages.”
‘People alone are not enough’
According to Rina Joosten, founder of the software company Seedlink, companies need algorithms precisely in times of scarcity, to gain new insights and to find the people they would otherwise overlook. And according to her, it can be done without prejudice. “We look solely at competencies, not at things like gender or ethnic background,” she says.
Seedlink helps companies recruit people. For this purpose it has developed an interview tool in which applicants answer questions. The answers are run through a large database by a “non-biased” algorithm, says the entrepreneur. This should eventually lead to more diverse and better-performing teams.
According to Joosten, people will always have to check the computer’s work for prejudices and mistakes. But human judgment alone is not enough. “Computers and algorithms are more flexible than humans,” says Joosten. “And they can be retrained.”