Additional surveillance should put an end to “life-threatening algorithms.”

The Dutch Data Protection Authority has begun monitoring algorithms, for example those that determine whether applicants are invited to a job interview, or that assess whether someone should be singled out for additional fraud screening.

The additional oversight was promised in the coalition agreement. The government has made one million euros available for this year, rising in the coming years to a structural 3.6 million euros from 2026. The new oversight applies to both government bodies and businesses. According to chairman Aleid Wolfsen, stronger supervision of how algorithms work is badly needed, because much can go wrong.

There have been several examples of algorithm misuse in recent years. One was the childcare allowance scandal, where all kinds of indicators, such as a person’s nationality, family structure and salary, determined who would be singled out for manual checks. Another example is the controversial anti-fraud system SyRI.

“Algorithms are increasingly being used to select people,” Wolfsen says. “Who may or may not become a company’s customer, who is or is not invited to an interview, and who is or is not subjected to additional fraud screening.”

Monitoring of all algorithms

According to Wolfsen, an algorithm can be “life-threatening” if it goes wrong. “An algorithm can be badly programmed or incorrectly trained,” he says. “It can contain discriminatory elements, and it can affect many people at once. That is something we must all try to prevent.”

The new oversight should enable the regulator to act more quickly against faulty algorithms. “By looking at this more closely, we can intervene faster, warn faster and stop things faster,” he says. “We will do this together with other regulators.” Citizens can also file complaints with the Dutch Data Protection Authority, which may then decide to open an investigation.

Wolfsen’s warning

Wolfsen warns companies that the Dutch Data Protection Authority can monitor any algorithm. “We can access everything, everywhere. There are no secrets from us in this area. Trade secrets are no obstacle either, because we are not permitted to disclose them and will not do so.”

It is not yet clear whether the funding will be sufficient. Amnesty International says the resources fall short of what is needed to effectively combat discriminatory algorithms. The human rights organization argues that, among other things, extra money should be made available for supervisory tasks.

Source: NOS