Bayesian Max-margin Multi-Task Learning with Data Augmentation.
ICML'14: Proceedings of the 31st International Conference on Machine Learning - Volume 32 (2014)
Abstract
Both max-margin and Bayesian methods have been extensively studied in multi-task learning, but have rarely been considered together. We present Bayesian max-margin multi-task learning, which conjoins the two schools of methods, allowing discriminative max-margin methods to enjoy the great flexibility of Bayesian methods in incorporating rich prior information and in performing nonparametric Bayesian feature learning, with the latent dimensionality inferred from the data. We develop Gibbs sampling algorithms that exploit data augmentation to deal with the non-smooth hinge loss. For nonparametric models, our algorithms require neither mean-field assumptions nor truncated approximations. Empirical results demonstrate performance superior to that of competitors in both multi-task classification and regression.
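The data-augmentation idea the abstract refers to can be illustrated in a single-task setting: the hinge-loss pseudo-likelihood admits a location-scale mixture-of-Gaussians representation (Polson & Scott, 2011), so a latent scale per example turns the conditional for the weights into a Gaussian and yields a simple Gibbs sampler. Below is a minimal sketch for a Bayesian linear SVM under that augmentation; all names are illustrative and this is not the paper's multi-task model, which adds shared latent structure across tasks.

```python
import numpy as np

def gibbs_bayes_svm(X, y, n_iter=300, burn=100, prior_var=100.0, seed=0):
    """Gibbs sampler for a single-task Bayesian linear SVM using the
    hinge-loss scale-mixture augmentation (Polson & Scott, 2011).

    X: (n, d) design matrix; y: labels in {-1, +1}.
    Returns the posterior-mean weight vector (a point summary).
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    beta = np.zeros(d)
    prior_prec = np.eye(d) / prior_var  # zero-mean Gaussian prior on beta
    samples = []
    for t in range(n_iter):
        # Latent scales: 1/lambda_i | beta ~ InverseGaussian(1/|1 - y_i x_i'beta|, 1)
        margin_gap = np.abs(1.0 - y * (X @ beta))
        inv_lam = rng.wald(1.0 / np.maximum(margin_gap, 1e-8), 1.0)
        # Weights: beta | lambda ~ N(mu, Sigma) -- the augmentation makes
        # each hinge term Gaussian in beta, so this conditional is exact.
        prec = prior_prec + (X * inv_lam[:, None]).T @ X
        cov = np.linalg.inv(prec)
        mu = cov @ (X.T @ (y * (1.0 + inv_lam)))
        beta = rng.multivariate_normal(mu, cov)
        if t >= burn:
            samples.append(beta)
    return np.mean(samples, axis=0)
```

Because both conditionals are standard distributions, no variational (mean-field) approximation is needed; the non-smoothness of the hinge loss is absorbed entirely by the latent scales.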