Neural fashion experts: I know how to make the complementary clothing matching.

Neurocomputing (2019)

Cited by 15 | Viewed 86
Abstract
Clothing has gradually become a beauty-enhancing product, and harmonious clothing matching is critical to a suitable outfit. Existing clothing matching techniques mainly rely on visual features but overlook textual metadata, which may be insufficient to comprehensively encode fashion items. Nowadays, fashion experts can share their fashion tips by demonstrating outfit compositions on fashion-oriented online communities. Each outfit usually consists of several complementary fashion items (e.g., a top, a bottom, and a pair of shoes), each involving an image along with textual metadata (e.g., the category and title). Such rich fashion data provide an opportunity for clothing matching, especially complementary fashion item matching. In this work, we propose a multiple-autoencoder neural network based on Bayesian Personalized Ranking, dubbed BPR-MAE. By seamlessly exploring the multiple modalities (i.e., the visual and textual modalities) of fashion items, this framework is able not only to comprehensively model the compatibility between fashion items (e.g., tops and bottoms, bottoms and shoes) but also to fulfill complementary fashion item matching among multiple fashion items. Experimental results on the real-world dataset FashionVC+ demonstrate the effectiveness of BPR-MAE, based on which we provide deep insights that can benefit future research.
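At the core of a Bayesian Personalized Ranking formulation such as the one named in the abstract is a pairwise objective: for a given top, a bottom known to be compatible should score higher than a sampled incompatible one. The following is a minimal sketch of that BPR loss, assuming dot-product compatibility scores over latent vectors; in BPR-MAE these vectors would come from the visual/textual autoencoders, but the vectors and names below are purely illustrative.

```python
import numpy as np

def bpr_loss(anchor, pos, neg):
    """BPR loss: push the compatible (positive) item's score above
    the incompatible (negative) item's score for the same anchor."""
    s_pos = float(anchor @ pos)   # compatibility score with positive item
    s_neg = float(anchor @ neg)   # compatibility score with negative item
    # -log sigmoid(s_pos - s_neg), written in a numerically stable form
    return float(np.log1p(np.exp(-(s_pos - s_neg))))

# Hypothetical latent vectors standing in for autoencoder outputs.
rng = np.random.default_rng(0)
top = rng.normal(size=8)
bottom_pos = top + 0.1 * rng.normal(size=8)   # roughly aligned: compatible
bottom_neg = -top + 0.1 * rng.normal(size=8)  # roughly opposed: incompatible

loss = bpr_loss(top, bottom_pos, bottom_neg)
```

Minimizing this loss over many (anchor, positive, negative) triplets shapes the latent space so that compatible pairs score higher than incompatible ones, which is the ranking behavior the abstract describes.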
Keywords
Multi-modal, Compatibility modeling, Complementary fashion item matching, Fashion analysis