[2603.21724] FISformer: Replacing Self-Attention with a Fuzzy Inference System in Transformer Models for Time Series Forecasting
Computer Science > Machine Learning
arXiv:2603.21724 (cs)
[Submitted on 23 Mar 2026]

Title: FISformer: Replacing Self-Attention with a Fuzzy Inference System in Transformer Models for Time Series Forecasting
Authors: Bulent Haznedar, Levent Karacan

Abstract: Transformers have achieved remarkable progress in time series forecasting, yet their reliance on deterministic dot-product attention limits their capacity to model uncertainty and nonlinear dependencies across multivariate temporal dimensions. To address this limitation, we propose FISformer, a Fuzzy Inference System-driven Transformer that replaces conventional attention with a FIS Interaction mechanism. In this framework, each query-key pair undergoes a fuzzy inference process for every feature dimension, where learnable membership functions and rule-based reasoning estimate token-wise relational strengths. These FIS-derived interaction weights capture uncertainty and provide interpretable, continuous mappings between tokens. A softmax operation is applied along the token axis to normalize these weights, which are then combined with the corresponding value features through element-wise multiplication to yield the final context-enhanced token representations. This design fuses the interpretability and uncertai...
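The abstract's FIS Interaction mechanism can be illustrated with a minimal sketch. The paper does not specify the membership functions or rule structure, so the following assumes Gaussian memberships over per-dimension query-key differences and a simple weighted-sum rule layer; the function name `fis_interaction` and all parameter shapes are hypothetical, chosen only to mirror the steps described (per-dimension fuzzy inference, softmax along the token axis, element-wise combination with values):

```python
import numpy as np

def gaussian_membership(x, centers, widths):
    """Fuzzify x with n_mf Gaussian membership functions (assumed form)."""
    # x: (..., d) -> degrees: (..., d, n_mf)
    return np.exp(-((x[..., None] - centers) ** 2) / (2.0 * widths ** 2))

def fis_interaction(Q, K, V, centers, widths, rule_weights):
    """Hypothetical sketch of a FIS-based replacement for dot-product attention.

    Q, K, V        : (T, d) token matrices (T tokens, d feature dimensions)
    centers, widths: (n_mf,) learnable Gaussian membership parameters
    rule_weights   : (n_mf,) learnable rule consequents
    """
    # Per-dimension difference for every query-key pair: (T, T, d)
    diff = Q[:, None, :] - K[None, :, :]
    # Fuzzy inference per feature dimension: membership degrees (T, T, d, n_mf)
    mu = gaussian_membership(diff, centers, widths)
    # Rule aggregation (weighted sum of firing strengths): weights (T, T, d)
    w = mu @ rule_weights
    # Softmax along the token (key) axis, as in the abstract
    w = np.exp(w - w.max(axis=1, keepdims=True))
    w = w / w.sum(axis=1, keepdims=True)
    # Element-wise combination with value features, summed over keys: (T, d)
    return (w * V[None, :, :]).sum(axis=1)
```

Unlike scalar dot-product attention, this produces a separate interaction weight per feature dimension, which is what lets the mechanism model dependencies dimension-wise; the actual learnable rule base in the paper may be considerably richer.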