[2502.01713] Auditing a Dutch Public Sector Risk Profiling Algorithm Using an Unsupervised Bias Detection Tool
Summary
This article presents an audit of a Dutch public sector risk profiling algorithm, using an unsupervised bias detection tool to surface disparities affecting students with non-European migration backgrounds.
Why It Matters
As algorithms increasingly influence critical decisions, understanding their biases is essential for ensuring fairness, especially in public sector applications. This study highlights the importance of bias detection tools in promoting transparency and accountability in algorithmic decision-making.
Key Takeaways
- The study audits a risk profiling algorithm used in Dutch education.
- An unsupervised bias detection tool was employed because data on demographic groups was unavailable under privacy legislation.
- Findings revealed disparities affecting students with non-European migration backgrounds.
- The tool is made available as an open-source resource for further audits.
- The research underscores the need for human oversight in algorithmic decision-making.
Computer Science > Computers and Society
arXiv:2502.01713 (cs)
[Submitted on 3 Feb 2025 (v1), last revised 15 Feb 2026 (this version, v4)]
Title: Auditing a Dutch Public Sector Risk Profiling Algorithm Using an Unsupervised Bias Detection Tool
Authors: Floris Holstege, Mackenzie Jorgensen, Kirtan Padh, Jurriaan Parie, Krsto Prorokovic, Joel Persson, Lukas Snoek
Abstract: Algorithms are increasingly used to automate or aid human decisions, yet recent research shows that these algorithms may exhibit bias across legally protected demographic groups. However, data on these groups may be unavailable to organizations or external auditors due to privacy legislation. This paper studies bias detection using an unsupervised bias detection tool when data on demographic groups are unavailable. We collaborated with the Dutch Executive Agency for Education to audit an algorithm that was used to assign risk scores to college students at the national level in the Netherlands between 2012-2023. Our audit covers more than 250,000 students across the country. The unsupervised bias detection tool highlights known disparities between students with a non-European migration background and students with a Dutch or European migration background. Our contributions are two-fold: (1) we assess bias in a real-...
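The core idea behind unsupervised bias detection, as described in the abstract, is to look for groups that are treated differently by the algorithm without ever using demographic labels: records are grouped by their non-protected features, and each group's outcome rate is compared with the overall rate. The sketch below illustrates that idea only; it is not the paper's tool or its clustering method. The `detect_deviating_group` function, the single-feature binning (a stand-in for full clustering), and the synthetic `distance` feature are all illustrative assumptions.

```python
# Illustrative sketch: find a feature-defined group whose risk-flag rate
# deviates most from the overall rate, using no demographic data.
# NOTE: this is not the audited tool; real tools use bias-aware clustering.
from statistics import mean


def flag_rate(records):
    """Fraction of records the (hypothetical) algorithm flagged as risky."""
    return mean(r["flagged"] for r in records)


def detect_deviating_group(records, feature, n_bins=3):
    """Partition records into n_bins equal-frequency bins on one feature
    (a simple stand-in for unsupervised clustering) and return the bin
    index whose flag rate deviates most from the overall rate, with the
    size of that deviation."""
    values = sorted(r[feature] for r in records)
    # Bin edges at equal-frequency quantile positions.
    edges = [values[len(values) * i // n_bins] for i in range(1, n_bins)]
    bins = [[] for _ in range(n_bins)]
    for r in records:
        idx = sum(r[feature] >= e for e in edges)  # count edges passed
        bins[idx].append(r)
    overall = flag_rate(records)
    deviations = [(abs(flag_rate(b) - overall), i)
                  for i, b in enumerate(bins) if b]
    dev, idx = max(deviations)
    return idx, dev


# Synthetic students: a 'distance' feature and a binary risk flag.
records = (
    [{"distance": d, "flagged": 1} for d in (1, 2, 3, 4)]
    + [{"distance": d, "flagged": 0} for d in (5, 6, 7, 8, 9, 10, 11, 12)]
)
idx, dev = detect_deviating_group(records, "distance")
# The low-distance group is flagged far more often than average; an
# auditor would then inspect what that group corresponds to in reality.
```

In a real audit, the flagged-up group found this way is handed to human reviewers, which matches the paper's emphasis on human oversight: the unsupervised step only points to where disparities may exist.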