[2603.05114] UniPAR: A Unified Framework for Pedestrian Attribute Recognition
Computer Science > Computer Vision and Pattern Recognition
arXiv:2603.05114 (cs) [Submitted on 5 Mar 2026]

Title: UniPAR: A Unified Framework for Pedestrian Attribute Recognition
Authors: Minghe Xu, Rouying Wu, Jiarui Xu, Minhao Sun, Zikang Yan, Xiao Wang, ChiaWei Chu, Yu Li

Abstract: Pedestrian Attribute Recognition (PAR) is a foundational computer vision task that supports downstream applications such as person retrieval in video surveillance and intelligent retail analytics. However, existing research is frequently constrained by the "one-model-per-dataset" paradigm and struggles with large cross-domain discrepancies in modalities, attribute definitions, and environmental scenarios. To address these challenges, we propose UniPAR, a unified Transformer-based framework for PAR. By incorporating a unified data scheduling strategy and a dynamic classification head, UniPAR enables a single model to simultaneously process diverse datasets from heterogeneous modalities, including RGB images, video sequences, and event streams. We also introduce a phased fusion encoder that explicitly aligns visual features with textual attribute queries through a late deep fusion strategy. Experimental results on the widely used benchmark datasets, including MSP60K, DukeMTMC, and EventPA...
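The dynamic-classification-head idea described above — a single shared backbone whose features are routed to a dataset-specific attribute classifier — can be sketched in plain Python/NumPy. This is a minimal illustrative sketch, not the paper's implementation: the feature dimension, attribute counts, and class names below are all assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class DynamicClassificationHead:
    """One linear head per dataset, since attribute vocabularies differ
    across datasets (illustrative stand-in for the paper's dynamic head)."""

    def __init__(self, feat_dim, attr_counts, seed=0):
        rng = np.random.default_rng(seed)
        # attr_counts maps dataset name -> number of attributes it defines.
        self.heads = {
            name: (rng.standard_normal((feat_dim, n)) * 0.01, np.zeros(n))
            for name, n in attr_counts.items()
        }

    def __call__(self, features, dataset):
        # Route the shared backbone feature to the head for this dataset.
        W, b = self.heads[dataset]
        # PAR is multi-label, so each attribute gets an independent sigmoid.
        return sigmoid(features @ W + b)

# Usage: one shared feature vector, routed to two different dataset heads.
# (Attribute counts here are placeholders, not the real dataset sizes.)
head = DynamicClassificationHead(
    feat_dim=16, attr_counts={"MSP60K": 57, "DukeMTMC": 23})
feat = np.ones(16)
print(head(feat, "MSP60K").shape)    # (57,)
print(head(feat, "DukeMTMC").shape)  # (23,)
```

Routing on a dataset key rather than padding all heads to one fixed attribute vocabulary keeps each output space aligned with its own dataset's definitions, which is what lets one model train across heterogeneous benchmarks.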