Acrostic Poem Generation
EMNLP 2020
We propose a new task in the area of computational creativity: acrostic poem generation in English. Acrostic poems are poems that contain a hidden message; typically, the first letter of each line spells out a word or short phrase. We define the task as a generation task with multiple constraints: given an input word, 1) the initial letters of all lines should spell out that word, 2) the poem's content should relate to it, and 3) the poem should conform to a rhyming scheme.
- Poetry, derived from the Greek word poiesis ("making"), is the art of combining rhythmic and aesthetic properties of a language to convey a specific message.
- Acrostics are a special type of poetry in which, typically, the first letter of each line spells out a word or message, as in the example in Figure 1.
- While this is the only formal characteristic of an acrostic, the authors define the task of acrostic poem generation more strictly: generated poems should both rhyme and relate to the topic of their hidden word; e.g., the content of the poem in Figure 1 should be related to the word "poet".
- As the authors define it, this is a challenging constrained generation task with multiple constraints, both semantic and structural.
- Since the development of creative machines is a crucial step towards real artificial intelligence, automatic poem generation is an important task at the intersection of computational creativity and natural language generation, and the earliest attempts date back several decades; see Goncalo Oliveira (2017) for an overview.
- We describe all models that are either part of our baseline for acrostic poem generation or used for data preprocessing
- We introduce a new task in the area of computational creativity: acrostic poem generation in English
- We further present a baseline for the task, based on a neural language model which has been pretrained on Wikipedia and fine-tuned on a combination of poems with gold standard and automatically predicted topics
- ”+” and ”-” indicate if topics are fed into the model (+) or substituted by zero vectors (-)
- Results on the test split of KnownTopicPoems are shown in Table 2.
- As shown in Table 6, the annotators agree that the poem generated by NeuralPoet is more closely related to its topic for 21 out of 40 poems.
- In 15 cases, the two annotators disagree, and only in 4 cases do they find the poem generated by NeuralPoetST-TP, i.e., the model that does not know about the topic, to be more similar to it.
- The authors' model works well even for topics it has not seen during training
- The authors introduce a new task in the area of computational creativity: acrostic poem generation in English.
- The task consists of creating poems with the following constraints: 1) the first letters of all lines should spell out a given word, 2) the poem’s content should be related to that word, and 3) the poem should conform to a rhyming scheme.
- The authors further present a baseline for the task, based on a neural language model which has been pretrained on Wikipedia and fine-tuned on a combination of poems with gold standard and automatically predicted topics.
- The poems generated by the authors' model are closely related in topic to the acrostic word.
- The authors' neural poet is available at https://nala-cub.github.io/resources as a baseline for future research on the task
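The first of the three constraints is purely mechanical and easy to verify automatically. A minimal sketch of such a check (an illustrative helper, not part of the paper's released code):

```python
def spells_acrostic(poem: str, word: str) -> bool:
    """Check the acrostic constraint: the first letters of the poem's
    non-empty lines spell out the hidden word (case-insensitive)."""
    lines = [ln.strip() for ln in poem.strip().splitlines() if ln.strip()]
    if len(lines) != len(word):
        return False
    return all(ln[0].lower() == ch.lower() for ln, ch in zip(lines, word))

# Toy example in the spirit of Figure 1 (hidden word: "poet")
poem = """Painting with words and rhyme,
Over pages the verses climb,
Each line a measured beat,
Turning thought to music sweet."""
print(spells_acrostic(poem, "poet"))  # True
```

The semantic constraint (relatedness to the topic) is the hard part; the paper evaluates it with human annotators rather than an automatic check.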
- Table1: Number of poems in our datasets used for training, listed by the number of lines they contain
- Table2: Perplexity on the test set of KnownTopicPoems for all language models; best score in bold
- Table3: Human evaluation and ablation study; F = Fluency; M = Meaning; P = Poeticness; A = Overall; ST=selecting first words for each line according to the acrostic; AC=acrostic forcing; RH=rhyming model; TP=feeding of topic vector
- Table4: The acrostic words used to generate poems in our experiments, corresponding to known or unknown topics
- Table5: Example poems generated by our model for the indicated topics and used in our evaluation.♠ = unknown topic; ♥ = known topic
- Table6: Number of poems correctly or incorrectly identified by human annotators as belonging to the given topic. Disagreement denotes examples where the annotators selected different poems
- Table7: Training times and number of parameters for our models. All models have been trained with a batch size of 128 on an NVIDIA Titan V GPU with 12 GB RAM
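Table 2 reports perplexity, which is the exponential of the average per-token negative log-likelihood under the model. A minimal sketch of the computation, not tied to any particular model:

```python
import math

def perplexity(token_log_probs):
    """Perplexity = exp of the average negative log-likelihood per token.
    Expects natural-log probabilities, one per token."""
    nll = -sum(token_log_probs) / len(token_log_probs)
    return math.exp(nll)

# A model that assigns probability 0.25 to every token is exactly as
# uncertain as a uniform 4-way choice:
print(perplexity([math.log(0.25)] * 10))  # ≈ 4.0
```

Lower perplexity means the language model assigns higher probability to the held-out poems.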
- Automated poetry generation has long received attention from researchers at the intersection of artificial intelligence and computational creativity. Even before the advent of deep learning, researchers used stochastic models and algorithms to generate poems (Queneau, 1961; Oulipo (Association), 1981; Gervas, 2000; Manurung, 2003). With the advances in deep learning, more and more researchers are exploring the possibility of training neural networks to generate poems which mimic human creativity.

Lau et al. (2018) trained a model to generate Shakespearean sonnets. They used a hybrid word-character LSTM-based recurrent neural network to generate poems, with separate rhythmic and rhyming models to enforce sonnet structure; all three components were trained in a multi-task fashion. Their crowd-worker and expert evaluations suggested that the generated poems conformed to the sonnet structure but lacked readability and coherent meaning. We make use of explicit representations of topics to address this concern: as our poems are generated based on a topic, we expect them to be more coherent.

Wang et al. (2018) generated Chinese poems based on images rather than topic words, using a combination of a convolutional neural network (CNN) and a gated recurrent unit (GRU) to generate poems related to the target image. They also generated acrostic poems, but used character-level modelling to do so, which is simpler than in our case, since they worked with Chinese text, where characters often correspond to entire words. Our preliminary experiments on English showed that character-level models easily learn to generate acrostics by themselves, but do not follow the topic as coherently as word-level models.

Zhang and Lapata (2014); Zhang et al. (2017); Yang et al. (2017); Yi et al. (2018a,c); Yang et al. (2019) are other examples of work on generating Chinese poems, but these did not focus on acrostics.
- We use 80%, 10%, and 10% of the data for training, development, and test, respectively
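An 80/10/10 split like the one above can be produced with a few lines of standard Python; the shuffling and fixed seed here are illustrative assumptions, not details taken from the paper:

```python
import random

def split_dataset(poems, seed=0):
    """Shuffle and split a dataset into 80% train, 10% dev, 10% test.
    The seed makes the split reproducible across runs."""
    poems = list(poems)
    random.Random(seed).shuffle(poems)
    n = len(poems)
    n_train, n_dev = int(0.8 * n), int(0.1 * n)
    train = poems[:n_train]
    dev = poems[n_train:n_train + n_dev]
    test = poems[n_train + n_dev:]
    return train, dev, test

train, dev, test = split_dataset(range(1000))
print(len(train), len(dev), len(test))  # 800 100 100
```

Giving the test split the remainder (rather than another `int(0.1 * n)` slice) guarantees every example lands in exactly one split even when `n` is not divisible by 10.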
Study subjects and analysis
Finally, we show that model performance—in terms of perplexity on a held-out validation set—can be improved by pretraining on Wikipedia. To train the baseline model for our new task, we make use of 4 datasets, which we will describe here, before explaining the actual model in the next section.

KnownTopicPoems
As shown in Table 6, our annotators agree that the poem generated by NeuralPoet is more closely related to its topic for 21 out of 40 poems. In 15 cases, the two annotators disagree, and only in 4 cases do they find the poem generated by NeuralPoetST-TP, i.e., the model that does not in fact know about the topic, to be more similar to it. This indicates that our poems indeed conform to the topic given by the acrostic word.
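The ablation in Table 3 lists "acrostic forcing" (AC). One simple way to realize such forcing at decoding time is to mask out every candidate word that does not start with the required letter before picking the best-scoring survivor; the sketch below illustrates that idea on a toy candidate list and is an assumption about the mechanism, not the paper's actual implementation:

```python
def force_first_letter(candidates, letter):
    """Acrostic forcing as a decoding filter: keep only (word, score)
    candidates whose word starts with the required letter, then return
    the best-scoring survivor. A real system would apply the same mask
    to the language model's output distribution at each line start."""
    allowed = [(w, s) for w, s in candidates
               if w.lower().startswith(letter.lower())]
    if not allowed:
        raise ValueError(f"no candidate starts with {letter!r}")
    return max(allowed, key=lambda ws: ws[1])[0]

# Toy candidates with model scores; the line must start with "p"
cands = [("night", 0.4), ("poem", 0.3), ("paper", 0.2)]
print(force_first_letter(cands, "p"))  # poem
```

Note that the filter can discard the globally highest-scoring word ("night" above), which is exactly the fluency cost the ablation study measures.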
- Pablo Gervas. 2000. WASP: Evaluation of different strategies for the automatic generation of Spanish verse. In AISB Symposium on Creative & Cultural Aspects of AI.
- Marjan Ghazvininejad, Xing Shi, Yejin Choi, and Kevin Knight. 2016. Generating topical poetry. In Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pages 1183–1191, Austin, Texas. Association for Computational Linguistics.
- Marjan Ghazvininejad, Xing Shi, Jay Priyadarshi, and Kevin Knight. 2017. Hafez: an interactive poetry generation system. In Proceedings of ACL 2017, System Demonstrations, pages 43–48, Vancouver, Canada. Association for Computational Linguistics.
- Edward Loper and Steven Bird. 2002. NLTK: The natural language toolkit. In Workshop on Effective Tools and Methodologies for Teaching Natural Language Processing and Computational Linguistics.
- Hisar Manurung. 2003. An evolutionary algorithm approach to poetry generation. Ph.D. thesis, University of Edinburgh, College of Science and Engineering.
- Oulipo (Association). 1981. Atlas de litterature potentielle. Gallimard.
- Jeffrey Pennington, Richard Socher, and Christopher D. Manning. 2014. GloVe: Global vectors for word representation. In EMNLP.
- Raymond Queneau. 1961. Cent mille milliards de poèmes. Gallimard.
- Nitish Srivastava, Geoffrey Hinton, Alex Krizhevsky, Ilya Sutskever, and Ruslan Salakhutdinov. 2014. Dropout: a simple way to prevent neural networks from overfitting. JMLR, 15(1):1929–1958.
- Xiaoyu Wang, Xian Zhong, and Lin Li. 2018. Generating Chinese classical poems based on images. In IMECS.
- Xiaopeng Yang, Xiaowen Lin, Shunda Suo, and Ming Li. 2017. Generating thematic Chinese poetry using conditional variational autoencoders with hybrid decoders. In IJCAI.
- Hugo Goncalo Oliveira. 2017. A survey on intelligent poetry generation: Languages, features, techniques, reutilisation and evaluation. In Proceedings of the 10th International Conference on Natural Language Generation, pages 11–20, Santiago de Compostela, Spain. Association for Computational Linguistics.
- Sepp Hochreiter and Jurgen Schmidhuber. 1997. Long short-term memory. Neural computation, 9(8):1735–1780.
- Diederik P Kingma and Jimmy Ba. 2014. Adam: A method for stochastic optimization. arXiv:1412.6980.
- Jey Han Lau, Trevor Cohn, Timothy Baldwin, Julian Brooke, and Adam Hammond. 2018. Deep-speare: A joint neural model of poetic language, meter and rhyme. arXiv:1807.03491.
- Zhichao Yang, Pengshan Cai, Yansong Feng, Fei Li, Weijiang Feng, Elena Suet-Ying Chiu, and Hong Yu. 2019. Generating classical Chinese poems from vernacular Chinese. In EMNLP-IJCNLP.
- Xiaoyuan Yi, Ruoyu Li, and Maosong Sun. 2018a. Chinese poetry generation with a salient-clue mechanism. In CoNLL.
- Xiaoyuan Yi, Maosong Sun, Ruoyu Li, and Wenhao Li. 2018b. Automatic poetry generation with mutual reinforcement learning. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 3143–3153, Brussels, Belgium. Association for Computational Linguistics.
- Xiaoyuan Yi, Maosong Sun, Ruoyu Li, and Zonghan Yang. 2018c. Chinese poetry generation with a working memory model. In IJCAI.
- Bei Liu, Jianlong Fu, Makoto P Kato, and Masatoshi Yoshikawa. 2018. Beyond narrative description: generating poetry from images by multi-adversarial training. In ACM MM.
- Malte Loller-Andersen and Bjorn Gamback. 2018. Deep learning-based poetry generation given visual input. In ICCC.
- Jiyuan Zhang, Yang Feng, Dong Wang, Yang Wang, Andrew Abel, Shiyue Zhang, and Andi Zhang. 2017. Flexible and creative Chinese poetry generation using neural memory. In Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 1364– 1373, Vancouver, Canada. Association for Computational Linguistics.
- Xingxing Zhang and Mirella Lapata. 2014. Chinese poetry generation with recurrent neural networks. In Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 670–680, Doha, Qatar. Association for Computational Linguistics.