code2seq: Generating Sequences from Structured Representations of Code
ICLR, 2019.
Abstract:
The ability to generate natural language sequences from source code snippets has a variety of applications such as code summarization, documentation, and retrieval. Sequence-to-sequence (seq2seq) models, adopted from neural machine translation (NMT), have achieved state-of-the-art performance on these tasks by treating source code as a …