
Acrostic Poem Generation

EMNLP 2020


Abstract

We propose a new task in the area of computational creativity: acrostic poem generation in English. Acrostic poems are poems that contain a hidden message; typically, the first letter of each line spells out a word or short phrase. We define the task as a generation task with multiple constraints: given an input word, 1) the initial letters of all lines should spell out the provided word, 2) the poem's content should be related to that word, and 3) the poem should conform to a rhyming scheme.

Introduction
  • Poetry, derived from the Greek word poiesis ("making"), is the art of combining rhythmic and aesthetic properties of a language to convey a specific message.
  • Acrostics are a special type of poetry in which, typically, the first letter of each line spells out a word or message, as in the example in Figure 1.
  • While this is the only formal characteristic of an acrostic, the authors define the task of acrostic poem generation as generating poems that both rhyme and relate to the topic of their hidden word; e.g., the content of the poem in Figure 1 should be related to the word "poet" (a minimal sketch of the first-letter constraint follows this list).
  • As the authors define it, acrostic poem generation is a challenging constrained generation task with multiple constraints: semantic ones and structural ones.
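The decoding procedure is not detailed on this page, so the following Python sketch (illustrative only) shows just the first-letter constraint: when generating the first word of each line, the vocabulary is masked to words starting with the required letter. The language-model hook lm_next_probs is a hypothetical stand-in; the actual baseline additionally conditions on a topic vector and applies a rhyming model.

    import random
    from typing import Dict, List

    def lm_next_probs(context: List[str]) -> Dict[str, float]:
        """Hypothetical hook: returns P(next word | context) for every vocabulary word."""
        raise NotImplementedError

    def sample(probs: Dict[str, float]) -> str:
        words, weights = zip(*probs.items())
        return random.choices(words, weights=weights, k=1)[0]

    def generate_acrostic(hidden_word: str, words_per_line: int = 6) -> str:
        context: List[str] = []
        lines = []
        for letter in hidden_word.lower():
            line = []
            for position in range(words_per_line):
                probs = lm_next_probs(context)
                if position == 0:
                    # Acrostic constraint: mask the vocabulary down to words
                    # that start with the required letter.
                    probs = {w: p for w, p in probs.items()
                             if w.lower().startswith(letter)}
                word = sample(probs)
                line.append(word)
                context.append(word)
            lines.append(" ".join(line))
            context.append("<eol>")  # assumed line-boundary token
        return "\n".join(lines)

Masking only the line-initial word matches the "ST" component named in Table 3 below; "acrostic forcing" (AC) is listed there as a separate mechanism.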
Highlights
  • Poetry, derived from the Greek word poiesis ("making"), is the art of combining rhythmic and aesthetic properties of a language to convey a specific message
  • Since the development of creative machines is a crucial step towards real artificial intelligence, automatic poem generation is an important task at the intersection of computational creativity and natural language generation, and the earliest attempts date back several decades; see Gonçalo Oliveira (2017) for an overview
  • Acrostics are a special type of poetry in which, typically, the first letter of each line spells out a word or message, as in the example in Figure 1. While this is the only formal characteristic of an acrostic, we here define the task of acrostic poem generation as generating poems that both rhyme and relate to the topic of their hidden word, e.g., the content of the poem in Figure 1 should be related to the word "poet"
  • We describe all models that are either part of our baseline for acrostic poem generation or used for data preprocessing
  • We introduce a new task in the area of computational creativity: acrostic poem generation in English
  • We further present a baseline for the task, based on a neural language model which has been pretrained on Wikipedia and fine-tuned on a combination of poems with gold standard and automatically predicted topics
Methods
  • "+" and "-" indicate whether topics are fed into the model (+) or substituted by zero vectors (-); a sketch of this ablation follows
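This page does not give the exact architecture, so the following PyTorch sketch is only illustrative: it assumes the topic vector (e.g., a pretrained word embedding of the topic word) is concatenated to every word embedding before an LSTM language model, and that the "-" setting replaces it with a zero vector of the same size. The dimensions and the concatenation scheme are assumptions, not the authors' confirmed design.

    import torch
    import torch.nn as nn

    class TopicConditionedLM(nn.Module):
        def __init__(self, vocab_size: int, emb_dim: int = 300,
                     topic_dim: int = 300, hidden_dim: int = 512):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, emb_dim)
            # Assumed: the topic vector is concatenated to each word embedding.
            self.lstm = nn.LSTM(emb_dim + topic_dim, hidden_dim, batch_first=True)
            self.out = nn.Linear(hidden_dim, vocab_size)

        def forward(self, tokens: torch.Tensor, topic: torch.Tensor,
                    use_topic: bool = True) -> torch.Tensor:
            if not use_topic:
                topic = torch.zeros_like(topic)  # the "-" setting: zero vector
            emb = self.embed(tokens)                             # (B, T, E)
            topic_rep = topic.unsqueeze(1).expand(-1, emb.size(1), -1)
            hidden, _ = self.lstm(torch.cat([emb, topic_rep], dim=-1))
            return self.out(hidden)                              # next-word logits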
Results
  • Results on the test split of KnownTopicPoems are shown in Table 2.
  • As shown in Table 6, the annotators agree that the poem generated by NeuralPoet is more closely related to its topic for 21 out of 40 poems.
  • In 15 cases, the two annotators disagree, and only in 4 cases do they find the poem generated by NeuralPoetST-TP, i.e., the model that does not know about the topic, to be more similar to it.
  • The authors' model works well even for topics it has not seen during training
Conclusion
  • The authors introduce a new task in the area of computational creativity: acrostic poem generation in English.
  • The task consists of creating poems with the following constraints: 1) the first letters of all lines should spell out a given word, 2) the poem’s content should be related to that word, and 3) the poem should conform to a rhyming scheme (a simple rhyme check is sketched after this list).
  • The authors further present a baseline for the task, based on a neural language model which has been pretrained on Wikipedia and fine-tuned on a combination of poems with gold standard and automatically predicted topics.
  • The poems generated by the authors' model are closely related in topic to the acrostic word.
  • The authors' neural poet is available at https://nala-cub.github.io/resources as a baseline for future research on the task
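The page gives no detail on the rhyming model itself. As a rough illustration of what a rhyme constraint checks, the sketch below tests whether two words rhyme using the CMU Pronouncing Dictionary through NLTK (which appears in the paper's references); this dictionary lookup is a stand-in, not the authors' neural rhyming model.

    # Requires: pip install nltk; then nltk.download('cmudict')
    from nltk.corpus import cmudict

    PRON = cmudict.dict()

    def rhyme_part(word):
        """Phonemes from the last stressed vowel onward (first listed pronunciation)."""
        prons = PRON.get(word.lower())
        if not prons:
            return None
        pron = prons[0]
        for i in range(len(pron) - 1, -1, -1):
            if pron[i][-1] in "12":  # a stress digit marks a stressed vowel
                return tuple(pron[i:])
        return tuple(pron)

    def rhymes(w1: str, w2: str) -> bool:
        a, b = rhyme_part(w1), rhyme_part(w2)
        return a is not None and a == b

    print(rhymes("line", "spine"))  # True
    print(rhymes("poet", "verse"))  # False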
Tables
  • Table 1: Number of poems in our datasets used for training, listed by the number of lines they contain
  • Table 2: Perplexity on the test set of KnownTopicPoems for all language models; best score in bold
  • Table 3: Human evaluation and ablation study; F = Fluency; M = Meaning; P = Poeticness; A = Overall; ST = selecting first words for each line according to the acrostic; AC = acrostic forcing; RH = rhyming model; TP = feeding of topic vector
  • Table 4: The acrostic words used to generate poems in our experiments, corresponding to known or unknown topics
  • Table 5: Example poems generated by our model for the indicated topics and used in our evaluation. ♠ = unknown topic; ♥ = known topic
  • Table 6: Number of poems correctly or incorrectly identified by human annotators as belonging to the given topic. Disagreement denotes examples where the annotators selected different poems
  • Table 7: Training times and number of parameters for our models. All models have been trained with a batch size of 128 on an NVIDIA Titan V GPU with 12 GB RAM
Related work
  • Automated poetry generation has long received attention from researchers at the intersection of artificial intelligence and computational creativity. Even before the advent of deep learning, researchers used stochastic models and algorithms to generate poems (Queneau, 1961; Oulipo (Association), 1981; Gervás, 2000; Manurung, 2003). With the advances in deep learning, more and more researchers have explored training neural networks to generate poems that mimic human creativity. Lau et al. (2018) trained a model to generate Shakespearean sonnets. They used a hybrid word-character LSTM-based recurrent neural network to generate poems, with separate rhythmic and rhyming models to enforce sonnet structure on the generated poems. All three components were trained in a multi-task fashion. Their crowdsourced and expert evaluations suggested that the generated poems conformed to the sonnet structure but lacked readability and coherent meaning. We make use of explicit representations of topics to address this coherence and readability concern: as our poems are generated based on a topic, we expect them to be more coherent. Wang et al. (2018) generated Chinese poems based on images rather than topic words. They used a combination of a convolutional neural network (CNN) and a gated recurrent unit (GRU) to generate poems related to the target image. They also generated acrostic poems, but used character-level modelling to achieve this, which was simpler than in our case, since they worked with Chinese text, where characters often correspond to entire words. Our preliminary experiments on English showed that character-level models easily learn to generate acrostics by themselves, but do not follow the topic as coherently as word-level models. Zhang and Lapata (2014), Zhang et al. (2017), Yang et al. (2017), Yi et al. (2018a,c), and Yang et al. (2019) are other examples of work on generating Chinese poems, but they did not focus on acrostics.
Funding
  • We use 80%, 10%, and 10% of the data for training, development, and test, respectively
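Such a split can be realized as below; the shuffling and fixed seed are assumptions added for reproducibility, not details given on this page.

    import random

    def split_corpus(poems, seed=42):
        """Shuffle and split a list of poems 80/10/10 into train/dev/test."""
        poems = list(poems)
        random.Random(seed).shuffle(poems)
        n_train = int(0.8 * len(poems))
        n_dev = int(0.1 * len(poems))
        return (poems[:n_train],
                poems[n_train:n_train + n_dev],
                poems[n_train + n_dev:])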
Study subjects and analysis
datasets: 4
Finally, we show that model performance, measured as perplexity on a held-out validation set, can be improved by pretraining on Wikipedia. To train the baseline model for our new task, we make use of 4 datasets, which we describe here before explaining the actual model in the next section; the first of these is KnownTopicPoems. A generic perplexity computation is sketched below.
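Perplexity is the exponential of the average per-token negative log-likelihood on held-out text. This is the standard definition; the authors' exact evaluation script is not shown on this page.

    import math

    def perplexity(total_nll: float, num_tokens: int) -> float:
        """exp of the mean per-token negative log-likelihood (natural log)."""
        return math.exp(total_nll / num_tokens)

    # Example: a summed NLL of 4600.0 nats over 1000 tokens gives PPL ~ 99.5.
    print(perplexity(4600.0, 1000))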

cases: 15
As shown in Table 6, our annotators agree that the poem generated by NeuralPoet is more closely related to its topic for 21 out of 40 poems. In 15 cases, the two annotators disagree, and only in 4 cases do they find the poem generated by NeuralPoetST-TP, i.e., the model that does not know about the topic, to be more similar to it. This indicates that our poems indeed conform to the topic given by the acrostic word.

Reference
  • Pablo Gervás. 2000. WASP: Evaluation of different strategies for the automatic generation of Spanish verse. In AISB Symposium on Creative & Cultural Aspects of AI.
  • Marjan Ghazvininejad, Xing Shi, Yejin Choi, and Kevin Knight. 2016. Generating topical poetry. In Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pages 1183–1191, Austin, Texas. Association for Computational Linguistics.
  • Marjan Ghazvininejad, Xing Shi, Jay Priyadarshi, and Kevin Knight. 2017. Hafez: An interactive poetry generation system. In Proceedings of ACL 2017, System Demonstrations, pages 43–48, Vancouver, Canada. Association for Computational Linguistics.
  • Hugo Gonçalo Oliveira. 2017. A survey on intelligent poetry generation: Languages, features, techniques, reutilisation and evaluation. In Proceedings of the 10th International Conference on Natural Language Generation, pages 11–20, Santiago de Compostela, Spain. Association for Computational Linguistics.
  • Sepp Hochreiter and Jürgen Schmidhuber. 1997. Long short-term memory. Neural Computation, 9(8):1735–1780.
  • Diederik P. Kingma and Jimmy Ba. 2014. Adam: A method for stochastic optimization. arXiv:1412.6980.
  • Jey Han Lau, Trevor Cohn, Timothy Baldwin, Julian Brooke, and Adam Hammond. 2018. Deep-speare: A joint neural model of poetic language, meter and rhyme. arXiv:1807.03491.
  • Bei Liu, Jianlong Fu, Makoto P. Kato, and Masatoshi Yoshikawa. 2018. Beyond narrative description: Generating poetry from images by multi-adversarial training. In ACM MM.
  • Malte Loller-Andersen and Björn Gambäck. 2018. Deep learning-based poetry generation given visual input. In ICCC.
  • Edward Loper and Steven Bird. 2002. NLTK: The Natural Language Toolkit. In Workshop on Effective Tools and Methodologies for Teaching Natural Language Processing and Computational Linguistics.
  • Hisar Manurung. 2003. An evolutionary algorithm approach to poetry generation. Ph.D. thesis, University of Edinburgh, College of Science and Engineering.
  • Oulipo (Association). 1981. Atlas de littérature potentielle. Gallimard.
  • Jeffrey Pennington, Richard Socher, and Christopher D. Manning. 2014. GloVe: Global vectors for word representation. In EMNLP.
  • Raymond Queneau. 1961. Cent mille milliards de poèmes. Gallimard.
  • Nitish Srivastava, Geoffrey Hinton, Alex Krizhevsky, Ilya Sutskever, and Ruslan Salakhutdinov. 2014. Dropout: A simple way to prevent neural networks from overfitting. JMLR, 15(1):1929–1958.
  • Xiaoyu Wang, Xian Zhong, and Lin Li. 2018. Generating Chinese classical poems based on images. In IMECS.
  • Xiaopeng Yang, Xiaowen Lin, Shunda Suo, and Ming Li. 2017. Generating thematic Chinese poetry using conditional variational autoencoders with hybrid decoders. In IJCAI.
  • Zhichao Yang, Pengshan Cai, Yansong Feng, Fei Li, Weijiang Feng, Elena Suet-Ying Chiu, and Hong Yu. 2019. Generating classical Chinese poems from vernacular Chinese. In EMNLP-IJCNLP.
  • Xiaoyuan Yi, Ruoyu Li, and Maosong Sun. 2018a. Chinese poetry generation with a salient-clue mechanism. In CoNLL.
  • Xiaoyuan Yi, Maosong Sun, Ruoyu Li, and Wenhao Li. 2018b. Automatic poetry generation with mutual reinforcement learning. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 3143–3153, Brussels, Belgium. Association for Computational Linguistics.
  • Xiaoyuan Yi, Maosong Sun, Ruoyu Li, and Zonghan Yang. 2018c. Chinese poetry generation with a working memory model. In IJCAI.
  • Jiyuan Zhang, Yang Feng, Dong Wang, Yang Wang, Andrew Abel, Shiyue Zhang, and Andi Zhang. 2017. Flexible and creative Chinese poetry generation using neural memory. In Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 1364–1373, Vancouver, Canada. Association for Computational Linguistics.
  • Xingxing Zhang and Mirella Lapata. 2014. Chinese poetry generation with recurrent neural networks. In Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 670–680, Doha, Qatar. Association for Computational Linguistics.
Author
Rajat Agarwal
Katharina Kann