Gap Minimization for Knowledge Sharing and Transfer
Authors: Boyu Wang, Jorge A. Mendez, Changjian Shui, Fan Zhou, Di Wu, Gezheng Xu, Christian Gagné, Eric Eaton; Published in 2023; Volume 24, Issue 33, Pages 1-57.
Abstract
Over the past few decades, there has been increasing interest in learning from multiple related tasks through knowledge sharing and transfer. To transfer information effectively from one task to another, it is crucial to understand the similarities and differences between the domains. This paper introduces the performance gap, a novel measure of the distance between learning tasks. We demonstrate that, unlike existing measures used to bound the difference in expected risks between tasks (e.g., the $\mathcal{H}$-divergence or discrepancy distance), the performance gap can be viewed as a data- and algorithm-dependent regularizer: it controls the complexity of the model and yields tighter guarantees. It also suggests a new principle for designing strategies for knowledge sharing and transfer: gap minimization. To implement this principle, two algorithms are proposed: 1. gapBoost, a principled boosting algorithm that explicitly minimizes the performance gap between source and target domains for transfer learning; and 2. gapMTNN, a representation learning algorithm that reformulates gap minimization as semantic conditional matching for multitask learning. Extensive evaluations on benchmark datasets for both transfer learning and multitask learning demonstrate that our methods outperform existing baselines.
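To make the gap-minimization principle concrete, the following is a minimal, hypothetical sketch rather than the paper's gapBoost or gapMTNN algorithms: a single linear predictor is fit jointly on a source and a target regression task, and the absolute difference between the two empirical risks (the performance gap) is added to the objective as a regularizer. The function name, the squared loss, the synthetic data, and the weight lam are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def gap_regularized_objective(w, X_src, y_src, X_tgt, y_tgt, lam=1.0):
    """Average empirical risk over source and target, plus a penalty on
    the absolute difference between the two risks (the performance gap).
    Illustrative sketch only; not the paper's gapBoost/gapMTNN."""
    risk = lambda X, y: np.mean((X @ w - y) ** 2)  # squared loss of a linear model
    r_src, r_tgt = risk(X_src, y_src), risk(X_tgt, y_tgt)
    return 0.5 * (r_src + r_tgt) + lam * abs(r_src - r_tgt)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Two related regression tasks with slightly different ground-truth weights.
    w_src_true, w_tgt_true = np.array([1.0, -2.0]), np.array([1.2, -1.8])
    X_src, X_tgt = rng.normal(size=(200, 2)), rng.normal(size=(50, 2))
    y_src = X_src @ w_src_true + 0.1 * rng.normal(size=200)
    y_tgt = X_tgt @ w_tgt_true + 0.1 * rng.normal(size=50)

    # Nelder-Mead avoids gradient issues from the non-smooth |r_src - r_tgt| term.
    res = minimize(gap_regularized_objective, x0=np.zeros(2),
                   args=(X_src, y_src, X_tgt, y_tgt, 1.0), method="Nelder-Mead")
    print("shared weights with gap penalty:", res.x)
```

Setting lam to zero recovers plain joint training on both domains; larger values of lam push the learner toward hypotheses that perform comparably on source and target, which is the behavior the gap-minimization principle advocates.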