[2603.05048] MCEL: Margin-Based Cross-Entropy Loss for Error-Tolerant Quantized Neural Networks
Computer Science > Machine Learning
arXiv:2603.05048 (cs)
[Submitted on 5 Mar 2026]

Title: MCEL: Margin-Based Cross-Entropy Loss for Error-Tolerant Quantized Neural Networks
Authors: Mikail Yayla, Akash Kumar

Abstract: Robustness to bit errors is a key requirement for the reliable use of neural networks (NNs) on emerging approximate computing platforms and error-prone memory technologies. A common approach to achieve bit error tolerance in NNs is injecting bit flips during training according to a predefined error model. While effective in certain scenarios, training-time bit flip injection introduces substantial computational overhead, often degrades inference accuracy at high error rates, and scales poorly for larger NN architectures. These limitations make error injection an increasingly impractical solution for ensuring robustness on future approximate computing platforms and error-prone memory technologies. In this work, we investigate the mechanisms that enable NNs to tolerate bit errors without relying on error-aware training. We establish a direct connection between bit error tolerance and classification margins at the output layer. Building on this insight, we propose a novel loss function, the Margin Cross-Entropy Loss (MCEL), which explicitly promotes logit-level margin separation while preser...
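The abstract describes training-time bit flip injection according to a predefined error model as the common baseline. The snippet below is a minimal sketch of what such injection might look like for integer-quantized weights under a uniform per-bit error model; the function name `inject_bit_flips` and the `flip_prob` parameter are illustrative assumptions, not the error model used in the paper.

```python
import torch

def inject_bit_flips(q_weights, bit_width=8, flip_prob=1e-3):
    """Randomly flip bits of integer-quantized weights (hypothetical sketch).

    Each of the `bit_width` bits of every weight is flipped independently
    with probability `flip_prob`, mimicking a uniform bit-error model.
    """
    w = q_weights.to(torch.int64)
    for b in range(bit_width):
        # Bernoulli mask deciding which weights have bit `b` flipped.
        flips = torch.rand_like(q_weights, dtype=torch.float32) < flip_prob
        w = torch.where(flips, w ^ (1 << b), w)
    return w.to(q_weights.dtype)

# Example: flip bits of 8-bit weights with a high error rate of 5%.
q_w = torch.randint(0, 256, (2, 3), dtype=torch.uint8)
noisy_w = inject_bit_flips(q_w, bit_width=8, flip_prob=0.05)
```

The abstract is truncated before MCEL is defined, so its exact form is not available here. As a rough illustration of a loss that "explicitly promotes logit-level margin separation," the following sketch subtracts an additive margin from the true-class logit before applying standard cross-entropy, in the spirit of additive-margin softmax; the name `margin_cross_entropy` and the `margin` hyperparameter are assumptions for illustration, not the authors' formulation.

```python
import torch
import torch.nn.functional as F

def margin_cross_entropy(logits, targets, margin=1.0):
    """Cross-entropy with an additive margin on the true-class logit (sketch).

    Subtracting `margin` from the correct-class logit before the softmax
    means the loss only approaches zero when that logit exceeds every
    competing logit by at least `margin`, enlarging the logit-level
    separation between the predicted class and the rest.
    """
    one_hot = F.one_hot(targets, num_classes=logits.size(1)).to(logits.dtype)
    adjusted = logits - margin * one_hot  # penalize only the true-class logit
    return F.cross_entropy(adjusted, targets)

# Example: a batch of 4 samples with 10 classes.
logits = torch.randn(4, 10, requires_grad=True)
targets = torch.tensor([3, 7, 0, 2])
loss = margin_cross_entropy(logits, targets, margin=2.0)
loss.backward()
```

A larger `margin` pushes the correct-class logit further above the others, which is the kind of logit-level separation the abstract connects to bit error tolerance; the trade-off between margin size and clean accuracy is not specified in the visible portion of the abstract.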