arXiv:2601.10779v2 Announce Type: replace-cross
Abstract: In multi-source transfer learning, a key challenge lies in how to appropriately differentiate and utilize heterogeneous source tasks. However, existing multi-source methods typically optimize either the source weights or the number of transferred samples, largely neglecting their joint consideration. In this work, we propose a theoretical framework, Unified Optimization of Weights and Quantities (UOWQ), that jointly determines the optimal source weights and transfer quantities for each source task. Specifically, the framework formulates multi-source transfer learning as a parameter estimation problem based on an asymptotic analysis of a Kullback–Leibler divergence–based generalization error measure, leading to two main theoretical findings: 1) using all available source samples is always optimal when the weights are properly adjusted; 2) the optimal source weights are characterized by a principled optimization problem whose structure explicitly incorporates the Fisher information, parameter discrepancy, parameter dimensionality, and transfer quantities. Building on these theoretical results, we further propose a practical algorithm for multi-source transfer learning and extend it to multi-task learning settings, where each task simultaneously serves as both a source and a target. Extensive experiments on real-world benchmarks, including DomainNet and Office-Home, demonstrate that UOWQ consistently outperforms strong baselines. The results validate both the theoretical predictions and the practical effectiveness of our framework.
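The abstract's second finding — that optimal source weights balance Fisher information, parameter discrepancy, dimensionality, and transfer quantities — can be illustrated with a toy sketch. The objective below is a generic bias–variance surrogate, not the paper's actual optimization problem; all names (`surrogate_error`, the discrepancy and sample-count values) are hypothetical stand-ins chosen for illustration:

```python
# Hypothetical illustration (NOT the paper's actual objective): pick simplex
# weights w for combining per-source parameter estimates, trading off
# estimation variance (dimensionality / sample count) against bias from
# source-target parameter discrepancy.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
d = 5                                   # parameter dimensionality
theta_target = rng.normal(size=d)       # target parameter (for evaluation only)

# Three heterogeneous sources: (discrepancy scale, sample count) -- toy values
sources = [(0.05, 2000), (0.30, 2000), (0.05, 50)]
theta_hats, var_terms, bias_norms = [], [], []
for disc, n in sources:
    theta_s = theta_target + disc * rng.normal(size=d)  # shifted source parameter
    noise = rng.normal(size=d) / np.sqrt(n)             # sampling noise ~ 1/sqrt(n)
    theta_hats.append(theta_s + noise)
    var_terms.append(d / n)             # variance proxy: dimension / samples
    bias_norms.append(disc * np.sqrt(d))  # discrepancy proxy

var_terms, bias_norms = np.array(var_terms), np.array(bias_norms)

def surrogate_error(w):
    # squared weighted bias + weighted variance (a stand-in surrogate)
    return (w @ bias_norms) ** 2 + (w ** 2) @ var_terms

res = minimize(surrogate_error, x0=np.full(3, 1 / 3), method="SLSQP",
               bounds=[(0.0, 1.0)] * 3,
               constraints={"type": "eq", "fun": lambda w: w.sum() - 1.0})
w_opt = res.x
theta_combined = np.stack(theta_hats).T @ w_opt
```

Under this surrogate, the low-discrepancy, well-sampled source receives the largest weight, while the high-discrepancy source is down-weighted — qualitatively the behavior the abstract attributes to the weight optimization.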