Learning from dependent data Dissertation Thesis

Author(s): Zimin, Alexander
Advisor(s): Lampert, Christoph
Title: Learning from dependent data
Affiliation IST Austria
Abstract: The most common assumption made in statistical learning theory is that the data are independent and identically distributed (i.i.d.). While mathematically very convenient, this assumption is often clearly violated in practice. This disparity between machine learning theory and applications underlies a growing demand for algorithms that learn from dependent data and for theory that can provide generalization guarantees similar to those available in the independent setting. This thesis is dedicated to two variants of dependence that can arise in practice. One is dependence at the level of samples within a single learning task. The other arises in the multi-task setting, where the tasks depend on each other even though the data within each task may be i.i.d. In both cases we model the data (samples or tasks) as stochastic processes and introduce new algorithms for both settings that take into account and exploit the resulting dependencies. We prove theoretical guarantees on the performance of the introduced algorithms under different evaluation criteria and, in addition, complement the theoretical study with an empirical one, in which we evaluate some of the algorithms on two real-world datasets to highlight their practical applicability.
Publication Title: IST Dissertation
Degree Granting Institution: IST Austria  
Degree: PhD
Degree Date: 2018-09-01
Start Page: 1
Total Pages: 92
DOI: 10.15479/AT:ISTA:TH1048
Notes: This thesis was partially funded by the European Research Council under the European Union's Seventh Framework Programme (FP7/2007-2013)/ERC grant agreement no. 308036. From 2013 to 2016 the author was an OMV scholar.
Open access: yes (repository)
IST Austria Authors
  1. Alexander Zimin