
TESS: Text-to-Text Self-Conditioned Simplex Diffusion

CoRR (2023)

Abstract
Diffusion models have emerged as a powerful paradigm for generation, obtaining strong performance in various continuous domains. However, applying continuous diffusion models to natural language remains challenging due to its discrete nature and the need for a large number of diffusion steps to generate text, making diffusion-based generation expensive. In this work, we propose Text-to-text Self-conditioned Simplex Diffusion (TESS), a text diffusion model that is fully non-autoregressive, employs a new form of self-conditioning, and applies the diffusion process on the logit simplex space rather than the learned embedding space. Through extensive experiments on natural language understanding and generation tasks including summarization, text simplification, paraphrase generation, and question generation, we demonstrate that TESS outperforms state-of-the-art non-autoregressive models, requires fewer diffusion steps with minimal drop in performance, and is competitive with pretrained autoregressive sequence-to-sequence models. We publicly release our codebase at https://github.com/allenai/tess-diffusion.
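For readers skimming the abstract, a minimal sketch of the general idea may help: tokens are lifted to almost-one-hot logits on a k-scaled simplex, Gaussian noise is added in that logit space, and the denoiser sees both the noisy simplex and its own previous prediction (self-conditioning). The `denoiser` module, the noise schedule, the value of K, and the fusion-by-averaging form of self-conditioning below are illustrative assumptions, not the authors' exact implementation; see the linked codebase for the real one.

```python
# Illustrative sketch of simplex diffusion with self-conditioning (not the TESS code).
import torch
import torch.nn.functional as F

K = 5.0          # simplex scale: true token gets +K, all others -K (assumed value)
VOCAB = 32000    # vocabulary size (assumed)
T = 1000         # number of diffusion steps (assumed)

# Simple linear noise schedule over the logit space (illustrative only).
betas = torch.linspace(1e-4, 0.02, T)
alphas_cumprod = torch.cumprod(1.0 - betas, dim=0)

def tokens_to_simplex_logits(token_ids: torch.Tensor) -> torch.Tensor:
    """Map token ids (batch, seq) to almost-one-hot logits (batch, seq, vocab)."""
    one_hot = F.one_hot(token_ids, VOCAB).float()
    return K * (2.0 * one_hot - 1.0)  # +K at the true token, -K elsewhere

def add_noise(clean_logits: torch.Tensor, t: int) -> torch.Tensor:
    """Forward process: corrupt clean logits with Gaussian noise at step t
    (DDPM-style interpolation; the exact scaling in TESS may differ)."""
    a = alphas_cumprod[t]
    noise = torch.randn_like(clean_logits)
    return a.sqrt() * clean_logits + (1.0 - a).sqrt() * noise

def denoise_step(denoiser, noisy_logits, prev_pred_logits, t):
    """One reverse step: the network conditions on the noisy simplex and on
    its own previous prediction, here fused by averaging the two probability
    simplices (assumed fusion form)."""
    probs_noisy = noisy_logits.softmax(dim=-1)
    probs_prev = prev_pred_logits.softmax(dim=-1)
    fused = 0.5 * (probs_noisy + probs_prev)
    return denoiser(fused, t)  # predicts clean token logits for every position
```

In a full non-autoregressive sampler, the predicted logits would be re-noised to step t-1 and fed back as `prev_pred_logits`, with all positions updated in parallel; decoding takes an argmax over the final simplex.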
Key words
Topic Modeling, Language Modeling, Syntax-based Translation Models, Text Simplification, Statistical Language Models