[2604.02488] Causal-Audit: A Framework for Risk Assessment of Assumption Violations in Time-Series Causal Discovery
Computer Science > Machine Learning
arXiv:2604.02488 (cs) [Submitted on 2 Apr 2026]

Title: Causal-Audit: A Framework for Risk Assessment of Assumption Violations in Time-Series Causal Discovery
Authors: Marco Ruiz, Miguel Arana-Catania, David R. Ardila, Rodrigo Ventura

Abstract: Time-series causal discovery methods rely on assumptions such as stationarity, regular sampling, and bounded temporal dependence. When these assumptions are violated, structure learning can produce confident but misleading causal graphs without warning. We introduce Causal-Audit, a framework that formalizes assumption validation as calibrated risk assessment. The framework computes effect-size diagnostics across five assumption families (stationarity, irregularity, persistence, nonlinearity, and confounding proxies), aggregates them into four calibrated risk scores with uncertainty intervals, and applies an abstention-aware decision policy that recommends methods (e.g., PCMCI+, VAR-based Granger causality) only when evidence supports reliable inference. The semi-automatic diagnostic stage can also be used independently for structured assumption auditing in individual studies. Evaluation on a synthetic atlas of 500 data-generating processes (DGPs) spanning 10 violation families demonstrates well-calibrated r...
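The abstract describes a pipeline of calibrated risk scores with uncertainty intervals feeding an abstention-aware decision policy. A minimal sketch of such a policy is given below; the score names, thresholds, and recommendation rules are illustrative assumptions, not the paper's actual implementation.

```python
# Illustrative abstention-aware decision policy over calibrated risk scores.
# All names, thresholds, and rules here are assumptions for illustration.
from dataclasses import dataclass

@dataclass
class RiskScore:
    name: str    # assumption family, e.g. "stationarity"
    point: float # calibrated risk estimate in [0, 1]
    lo: float    # lower bound of the uncertainty interval
    hi: float    # upper bound of the uncertainty interval

def recommend(scores: list[RiskScore], risk_threshold: float = 0.3) -> str:
    """Recommend a discovery method only when every risk is confidently low;
    abstain when any risk is confidently high or its interval is ambiguous."""
    if any(s.lo > risk_threshold for s in scores):
        return "abstain: assumption violation risk is confidently high"
    if any(s.hi > risk_threshold for s in scores):
        return "abstain: risk uncertain (interval crosses threshold)"
    nonlin = next((s for s in scores if s.name == "nonlinearity"), None)
    # Assumed rule: suggest a linear method only when nonlinearity risk is very low.
    if nonlin is not None and nonlin.point < 0.1:
        return "recommend: VAR-based Granger causality"
    return "recommend: PCMCI+"

scores = [
    RiskScore("stationarity", 0.05, 0.02, 0.10),
    RiskScore("irregularity", 0.08, 0.04, 0.15),
    RiskScore("nonlinearity", 0.20, 0.12, 0.28),
    RiskScore("confounding", 0.10, 0.05, 0.18),
]
print(recommend(scores))  # -> "recommend: PCMCI+"
```

The policy abstains rather than guesses when an uncertainty interval straddles the threshold, which mirrors the calibrated, abstention-aware behavior the abstract claims.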