[2303.15585] (Un)fair devices: Moving beyond AI accuracy in personal sensing
Computer Science > Computers and Society

arXiv:2303.15585 (cs)

[Submitted on 27 Mar 2023 (v1), last revised 3 Mar 2026 (this version, v4)]

Title: (Un)fair devices: Moving beyond AI accuracy in personal sensing

Authors: Sofia Yfantidou, Marios Constantinides, Dimitris Spathis, Athena Vakali, Daniele Quercia, Fahim Kawsar

Abstract: Personal devices are omnipresent in our lives, seamlessly monitoring our activities, from smart rings tracking sleep patterns to smartwatches keeping an eye on missed heartbeats. The rich data streams from such devices fuel advanced Artificial Intelligence (AI) applications. Instead of relying solely on direct sensor measurements, these applications increasingly leverage Machine Learning (ML) model estimates to derive insights. But are these estimates biased? This literature review delivers compelling evidence of the impact of hidden biases that creep into ML models deployed on personal devices. We discuss critical bias issues drawn from prior work, such as racial bias in pulse oximeters, weight bias in optical heart rate sensors, and sex bias in audio-based diagnostics. In response to these challenges, we advocate a shift from prioritizing performance-oriented evaluations of personal devices to adopting assessments grounded in a human-centered approach. To facilitate th...