[2604.01474] Prime Once, then Reprogram Locally: An Efficient Alternative to Black-Box Service Model Adaptation
Computer Science > Computer Vision and Pattern Recognition

arXiv:2604.01474 (cs) [Submitted on 1 Apr 2026]

Title: Prime Once, then Reprogram Locally: An Efficient Alternative to Black-Box Service Model Adaptation

Authors: Yunbei Zhang, Chengyi Cai, Feng Liu, Jihun Hamm

Abstract: Adapting closed-box service models (i.e., APIs) to target tasks typically relies on reprogramming via Zeroth-Order Optimization (ZOO). However, this strategy requires extensive, costly API calls and often suffers from slow, unstable optimization. Moreover, we observe that this paradigm faces new challenges with modern APIs (e.g., GPT-4o): these models can be less sensitive to the input perturbations ZOO relies on, which hinders performance gains. To address these limitations, we propose an Alternative efficient Reprogramming approach for Service models (AReS). Instead of direct, continuous closed-box optimization, AReS initiates a single-pass interaction with the service API to prime an amenable local pre-trained encoder. This priming stage trains only a lightweight layer on top of the local encoder, making it highly receptive to the subsequent glass-box (white-box) reprogramming stage performed directly on the local model. Consequently, all subsequent adaptation and inference rely solely...
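The two-stage recipe the abstract describes can be sketched as follows. Everything here is an illustrative stand-in, not the paper's implementation: the tanh-linear "encoder", the hidden "service model", the dimensions, the soft-label distillation loss, and the learning rates are all assumptions made so the sketch runs end to end. What it does preserve is the key structure: one batched API interaction in stage 1, and purely local, exact-gradient optimization afterward.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (illustrative only)
D_IN, D_FEAT, N_CLASSES, N = 8, 16, 3, 64

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def ce(p, q):
    """Mean cross-entropy of predictions p against soft targets q."""
    return float(-(q * np.log(p + 1e-9)).sum(axis=1).mean())

# Frozen local pre-trained encoder (a tanh-linear stand-in)
W_enc = rng.normal(size=(D_IN, D_FEAT))
def local_encoder(x):
    return np.tanh(x @ W_enc)

# Hidden service model behind the API (queried, never inspected)
W_api = rng.normal(size=(D_IN, N_CLASSES))
def service_api(x):
    return softmax(x @ W_api)

X = rng.normal(size=(N, D_IN))

# Stage 1 (priming): a single pass through the API collects soft labels,
# then only a lightweight linear head on the frozen encoder is trained.
soft_labels = service_api(X)          # the only API interaction
feats = local_encoder(X)
W_head = np.zeros((D_FEAT, N_CLASSES))
for _ in range(300):
    p = softmax(feats @ W_head)
    W_head -= 0.1 * feats.T @ (p - soft_labels) / N
ce_primed = ce(softmax(feats @ W_head), soft_labels)

# Stage 2 (glass-box reprogramming): optimize a universal input
# perturbation with exact local gradients -- no further API calls.
delta = np.zeros(D_IN)
for _ in range(200):
    f = np.tanh((X + delta) @ W_enc)
    p = softmax(f @ W_head)
    g_f = (p - soft_labels) @ W_head.T       # grad w.r.t. encoder features
    g_x = ((1.0 - f ** 2) * g_f) @ W_enc.T   # backprop through tanh encoder
    delta -= 0.05 * g_x.mean(axis=0)

def predict(x):
    """All inference is local: perturb, encode, classify with the head."""
    return softmax(np.tanh((x + delta) @ W_enc) @ W_head).argmax(axis=1)
```

The contrast with ZOO is in where the gradients come from: a ZOO baseline would estimate gradients of the perturbation by repeatedly perturbing inputs and re-querying the API, whereas here stage 2 backpropagates exactly through the local encoder, so the API call count is fixed by the one priming pass.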