[2603.19918] Learning Like Humans: Analogical Concept Learning for Generalized Category Discovery
Computer Science > Computer Vision and Pattern Recognition
arXiv:2603.19918 (cs)
[Submitted on 20 Mar 2026]

Title: Learning Like Humans: Analogical Concept Learning for Generalized Category Discovery
Authors: Jizhou Han, Chenhao Ding, Yuhang He, Qiang Wang, Shaokun Wang, SongLin Dong, Yihong Gong

Abstract: Generalized Category Discovery (GCD) seeks to uncover novel categories in unlabeled data while preserving recognition of known categories, yet prevailing visual-only pipelines and the loose coupling between supervised learning and discovery often yield brittle boundaries on fine-grained, look-alike categories. We introduce the Analogical Textual Concept Generator (ATCG), a plug-and-play module that analogizes from labeled knowledge to new observations, forming textual concepts for unlabeled samples. Fusing these analogical textual concepts with visual features turns discovery into a visual-textual reasoning process, transferring prior knowledge to novel data and sharpening category separation. ATCG attaches to both parametric and clustering-style GCD pipelines and requires no changes to their overall design. Across six benchmarks, ATCG consistently improves overall, known-class, and novel-class performance, with the largest gains on fine-grained data. Our code is available at: this https URL.
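The abstract states that analogical textual concepts are fused with visual features before discovery, but gives no details of the fusion operator. A minimal sketch of one plausible realization, assuming L2-normalized embeddings and a simple convex combination (the function name `fuse_concepts` and the weight `alpha` are illustrative assumptions, not from the paper):

```python
import numpy as np

def fuse_concepts(visual_feats, concept_feats, alpha=0.5):
    """Fuse visual features with analogical textual concept embeddings.

    A hypothetical fusion: L2-normalize both modalities, take a
    weighted sum, and re-normalize. The paper does not specify the
    actual operator; this is only an illustrative sketch.
    """
    v = visual_feats / np.linalg.norm(visual_feats, axis=1, keepdims=True)
    t = concept_feats / np.linalg.norm(concept_feats, axis=1, keepdims=True)
    fused = alpha * v + (1 - alpha) * t
    return fused / np.linalg.norm(fused, axis=1, keepdims=True)

# Toy example: 4 unlabeled samples with 8-dimensional embeddings.
rng = np.random.default_rng(0)
visual = rng.normal(size=(4, 8))
textual = rng.normal(size=(4, 8))
fused = fuse_concepts(visual, textual)
print(fused.shape)  # (4, 8)
```

The fused embeddings could then feed either a parametric classifier head or a clustering step, matching the claim that ATCG attaches to both pipeline styles without design changes.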