Multi-task learning with labeled and unlabeled tasks
Type: Conference Paper


Author(s): Pentina, Anastasia; Lampert, Christoph H.
Title: Multi-task learning with labeled and unlabeled tasks
Series Title: Proceedings of Machine Learning Research (PMLR)
Affiliation: IST Austria
Abstract: In multi-task learning, a learner is given a collection of prediction tasks and needs to solve all of them. In contrast to previous work, which required annotated training data to be available for all tasks, we consider a new setting in which for some tasks, potentially most of them, only unlabeled training data is provided. Consequently, to solve all tasks, information must be transferred between tasks with labels and tasks without labels. Focusing on an instance-based transfer method, we analyze two variants of this setting: when the set of labeled tasks is fixed, and when it can be actively selected by the learner. We state and prove a generalization bound that covers both scenarios and derive from it an algorithm for choosing the labeled tasks (in the active case) and for transferring information between the tasks in a principled way. We also illustrate the effectiveness of the algorithm on synthetic and real data.
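The instance-based transfer idea from the abstract can be sketched concretely. The following Python sketch is illustrative only, not the paper's algorithm: it solves an unlabeled task by training a weighted classifier on data pooled from the labeled tasks, weighting each labeled task by how similar its input distribution is to the target task's. The similarity measure (an RBF-kernel MMD estimate), the exponential weighting scheme, and all names (rbf_mmd2, solve_unlabeled_task, temperature) are assumptions chosen for this example.

import numpy as np
from sklearn.linear_model import LogisticRegression

def rbf_mmd2(X, Y, gamma=1.0):
    # Biased estimate of the squared MMD between samples X and Y (RBF kernel).
    def k(A, B):
        d = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
        return np.exp(-gamma * d)
    return k(X, X).mean() + k(Y, Y).mean() - 2.0 * k(X, Y).mean()

def solve_unlabeled_task(labeled_tasks, X_target, temperature=1.0):
    # labeled_tasks: list of (X_t, y_t) pairs; X_target: inputs of the task
    # that has no labels. Each labeled task gets weight exp(-MMD^2 / temperature),
    # so tasks whose inputs resemble the target's contribute more.
    mmds = np.array([rbf_mmd2(X_t, X_target) for X_t, _ in labeled_tasks])
    alphas = np.exp(-mmds / temperature)
    alphas /= alphas.sum()
    X = np.vstack([X_t for X_t, _ in labeled_tasks])
    y = np.concatenate([y_t for _, y_t in labeled_tasks])
    w = np.concatenate([np.full(len(y_t), a)
                        for (_, y_t), a in zip(labeled_tasks, alphas)])
    return LogisticRegression().fit(X, y, sample_weight=w)

In the paper, the per-task weights follow from minimizing the stated generalization bound; the MMD-based heuristic above merely stands in for that principled choice.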
Conference Title: ICML: International Conference on Machine Learning
Volume: 70
Conference Dates: August 6 - August 11, 2017
Conference Location: Sydney, Australia
Publisher: PMLR
Date Published: 2017-06-08
Start Page: 2807
End Page: 2816
URL:
Notes: We thank Alexander Zimin and Marius Kloft for useful discussions. This work was in part funded by the European Research Council under the European Union's Seventh Framework Programme (FP7/2007-2013) / ERC grant agreement no 308036.
Open access: yes (repository)
IST Austria Authors
  1. Christoph Lampert