ARC: Adversarial Robust Cuts for Semi-Supervised and Multi-Label Classification

IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops (2018)

Abstract
Many structured prediction tasks arising in computer vision and natural language processing tractably reduce to making minimum cost cuts in graphs with edge weights learned using maximum margin methods. Unfortunately, the hinge loss used to construct these methods often provides a particularly loose bound on the loss function of interest (e.g., the Hamming loss). We develop Adversarial Robust Cuts (ARC), an approach that poses the learning task as a minimax game between predictor and "label approximator" based on minimum cost graph cuts. Unlike maximum margin methods, this game-theoretic perspective always provides meaningful bounds on the Hamming loss. We conduct multi-label and semi-supervised binary prediction experiments that demonstrate the benefits of our approach.
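The reduction the abstract refers to (structured prediction as a minimum cost graph cut) can be illustrated with a generic sketch. The following is not the paper's ARC method; it is a minimal, assumed-convention example of binary labeling via an s-t min cut, where each node pays a unary cost for its label and each edge pays a pairwise penalty when its endpoints disagree. The function name `min_cut_labels` and the cost conventions are illustrative choices, not from the paper.

```python
from collections import deque, defaultdict

def min_cut_labels(unary, pairwise):
    """Binary labeling via a minimum s-t cut (Edmonds-Karp max-flow).

    unary[i]        = (cost of label 0, cost of label 1) for node i
    pairwise[(i,j)] = penalty when nodes i and j take different labels
    Returns (total cost of the optimal labeling, list of labels).
    """
    n = len(unary)
    s, t = n, n + 1
    cap = defaultdict(lambda: defaultdict(int))
    for i, (c0, c1) in enumerate(unary):
        cap[s][i] += c0  # this edge is cut when i lands on the sink side (label 0)
        cap[i][t] += c1  # this edge is cut when i lands on the source side (label 1)
    for (i, j), w in pairwise.items():
        cap[i][j] += w   # disagreement edges, cut when i and j end on different sides
        cap[j][i] += w

    flow_value = 0
    while True:
        # BFS for an augmenting path in the residual graph
        parent = {s: None}
        q = deque([s])
        while q and t not in parent:
            u = q.popleft()
            for v, c in cap[u].items():
                if c > 0 and v not in parent:
                    parent[v] = u
                    q.append(v)
        if t not in parent:
            break
        # Recover the path, find its bottleneck capacity, and push flow.
        path = []
        v = t
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        bottleneck = min(cap[u][v] for u, v in path)
        for u, v in path:
            cap[u][v] -= bottleneck
            cap[v][u] += bottleneck
        flow_value += bottleneck

    # Nodes still reachable from s in the residual graph form the source side.
    reach = {s}
    q = deque([s])
    while q:
        u = q.popleft()
        for v, c in cap[u].items():
            if c > 0 and v not in reach:
                reach.add(v)
                q.append(v)
    labels = [1 if i in reach else 0 for i in range(n)]
    return flow_value, labels
```

For example, two nodes with opposing unary preferences and a small disagreement penalty of 1 yield the labeling `[0, 1]` at total cost 1; the learning problem the paper addresses is fitting the edge weights of such a graph, with ARC replacing the hinge-loss objective of maximum margin training by a minimax game.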
Keywords
Hamming loss, semi-supervised binary prediction experiments, ARC, multi-label classification, structured prediction tasks, computer vision, natural language processing, minimum cost cuts, edge weights, maximum margin methods, hinge loss, loss function, learning task, minimum cost graph cuts, adversarial robust cuts