Many real-world datasets occur in, or can be arranged into, multi-modal structures. With such datasets, the tasks to be learnt can be referenced by multiple indices, but current multitask learning frameworks are not designed to preserve this information. We propose multilinear algebra as a natural way to model such a set of related tasks. We present two learning methods: the first adapts a convex relaxation method used in the context of tensor completion; the second is based on the Tucker decomposition and alternating minimization. Experiments on synthetic and real data indicate that the multilinear approaches provide a significant improvement over other multitask learning methods. Overall, our second approach yields the best performance on all datasets.
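The paper's own alternating-minimization algorithm is not reproduced here, but the Tucker decomposition it builds on can be illustrated with a minimal truncated higher-order SVD (HOSVD) sketch in NumPy. The function names (`unfold`, `hosvd`, `reconstruct`) are our own and not from the paper; this is a generic sketch of the decomposition, not the authors' method.

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding: matricize tensor T along the given mode."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def hosvd(T, ranks):
    """Truncated HOSVD: return a Tucker core and per-mode factor matrices."""
    factors = []
    for mode, r in enumerate(ranks):
        # Leading left singular vectors of each unfolding give the factors.
        U, _, _ = np.linalg.svd(unfold(T, mode), full_matrices=False)
        factors.append(U[:, :r])
    # Core tensor: project T onto the factor subspaces (G = T x_n U_n^T).
    core = T
    for mode, U in enumerate(factors):
        core = np.moveaxis(np.tensordot(core, U, axes=(mode, 0)), -1, mode)
    return core, factors

def reconstruct(core, factors):
    """Multiply the core by each factor along its mode (T = G x_n U_n)."""
    T = core
    for mode, U in enumerate(factors):
        T = np.moveaxis(np.tensordot(T, U, axes=(mode, 1)), -1, mode)
    return T

# Example: a tasks-by-features-by-modes weight tensor of low multilinear rank
# is recovered exactly by HOSVD truncated at that rank.
rng = np.random.default_rng(0)
G = rng.random((2, 2, 2))
Us = [np.linalg.qr(rng.random((d, 2)))[0] for d in (4, 5, 6)]
W = reconstruct(G, Us)
core, factors = hosvd(W, (2, 2, 2))
print(np.allclose(reconstruct(core, factors), W))
```

In a multitask setting, the modes of the weight tensor would index the multiple task dimensions, and truncating the multilinear rank is what couples the related tasks.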
Number of pages: 9
Publication status: Published - 1 Jan 2013
Event: 30th International Conference on Machine Learning (ICML 2013), Atlanta, GA, United States, 16 Jun 2013 - 21 Jun 2013