Long-Term Ad Memorability: Understanding and Generating Memorable Ads
arXiv (2023)
Abstract
Marketers spend billions of dollars on advertisements, but to what end? At
purchase time, if customers cannot recognize the brand for which they saw an
ad, the money spent on the ad is essentially wasted. Despite its importance in
marketing, until now, there has been no large-scale study on the memorability
of ads. All previous memorability studies have been conducted on short-term
recall on specific content types like action videos. On the other hand, the
advertising industry only cares about long-term memorability, and ads are
almost always highly multimodal. Therefore, we release the first memorability
dataset, LAMBDA, consisting of 1749 participants and 2205 ads covering 276
brands. Running statistical tests over different participant subpopulations and
ad types, we find many interesting insights into what makes an ad memorable,
e.g., fast-moving ads are more memorable than those with slower scenes, and
people who use ad-blockers remember fewer ads than those who don't. Next,
we present a model, Henry, to predict the memorability of content. Henry
achieves state-of-the-art performance across all prominent memorability
datasets in the literature. It generalizes strongly, with better zero-shot
results on unseen datasets. Finally, with the goal of generating memorable
ads, we present a scalable method to build a high-quality memorable ad
generation model by leveraging automatically annotated data. Our approach, SEED
(Self rEwarding mEmorability Modeling), starts with a language model trained on
LAMBDA as seed data and progressively trains an LLM to generate more memorable
ads. We show that the generated advertisements have 44% higher memorability
scores than the original ads. We release this large-scale ad dataset,
UltraLAMBDA, consisting of 5 million ads. Our code and datasets are available
at https://behavior-in-the-wild.github.io/memorability.