Self-Attention Networks for Connectionist Temporal Classification in Speech Recognition

Julián Salazar

ICASSP, 2019.

Other links: dblp.uni-trier.de | academic.microsoft.com | arxiv.org

Abstract:

The success of self-attention in NLP has led to recent applications in end-to-end encoder-decoder architectures for speech recognition. Separately, connectionist temporal classification (CTC) has matured as an alignment-free, non-autoregressive approach to sequence transduction, either by itself or in various multitask and decoding frameworks. […]
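As a minimal sketch of the pairing the abstract describes, a self-attention (Transformer) encoder can be trained directly with the CTC loss in PyTorch. This is not the paper's implementation; the layer sizes, feature dimension, and label set below are illustrative assumptions.

import torch
import torch.nn as nn

class SelfAttentionCTC(nn.Module):
    def __init__(self, n_mels=80, d_model=256, n_heads=4, n_layers=6, n_labels=29):
        super().__init__()
        self.proj_in = nn.Linear(n_mels, d_model)          # acoustic features -> model dim
        layer = nn.TransformerEncoderLayer(d_model, n_heads, dim_feedforward=1024)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.proj_out = nn.Linear(d_model, n_labels + 1)   # +1 output for the CTC blank

    def forward(self, feats):                              # feats: (time, batch, n_mels)
        x = self.encoder(self.proj_in(feats))
        return self.proj_out(x).log_softmax(dim=-1)        # (time, batch, n_labels + 1)

model = SelfAttentionCTC()
ctc_loss = nn.CTCLoss(blank=29)                            # blank index = n_labels

feats = torch.randn(200, 8, 80)                            # 200 frames, batch of 8 (dummy data)
targets = torch.randint(0, 29, (8, 30))                    # dummy label sequences, no blank
input_lengths = torch.full((8,), 200, dtype=torch.long)
target_lengths = torch.full((8,), 30, dtype=torch.long)

log_probs = model(feats)
loss = ctc_loss(log_probs, targets, input_lengths, target_lengths)
loss.backward()

Because CTC marginalizes over alignments, no frame-level labels or autoregressive decoder are needed; the encoder emits per-frame label distributions and the loss handles the alignment-free training.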
