Label-Dependency Coding in Simple Recurrent Networks for Spoken Language Understanding

INTERSPEECH, pp. 2491-2495, 2017.


Abstract:

Modelling target label dependencies is important for sequence labelling tasks. This can become crucial in Spoken Language Understanding (SLU) applications, especially for the slot-filling task, where models often have to deal with a large number of target labels. Conditional Random Fields (CRF) were previously considered as the …
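The idea sketched in the abstract, encoding label dependencies directly inside a simple recurrent network for slot filling rather than delegating them to an external CRF layer, can be illustrated with a minimal example. The PyTorch module below is a hedged sketch under assumptions, not the paper's exact architecture: the layer sizes, the greedy feedback of the previously predicted label, and all names (LabelDependentSRN, word_emb, label_emb) are illustrative.

```python
# Minimal sketch (illustrative, not the paper's exact model): an Elman-style
# recurrent network for slot filling that encodes label dependencies by
# feeding an embedding of the previously predicted label back into the
# recurrence at each time step.
import torch
import torch.nn as nn


class LabelDependentSRN(nn.Module):
    def __init__(self, vocab_size, num_labels,
                 word_dim=100, label_dim=30, hidden_dim=200):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, word_dim)
        # Embedding of the previous slot label: this is what carries the
        # label-dependency information into the recurrent state.
        self.label_emb = nn.Embedding(num_labels, label_dim)
        self.rnn_cell = nn.RNNCell(word_dim + label_dim, hidden_dim)
        self.out = nn.Linear(hidden_dim, num_labels)
        self.hidden_dim = hidden_dim

    def forward(self, word_ids):
        # word_ids: (seq_len,) token indices for a single utterance.
        h = torch.zeros(1, self.hidden_dim)
        prev_label = torch.zeros(1, dtype=torch.long)  # index 0 as a "start" label
        logits = []
        for w in word_ids:
            x = torch.cat([self.word_emb(w.view(1)),
                           self.label_emb(prev_label)], dim=-1)
            h = self.rnn_cell(x, h)
            step_logits = self.out(h)
            logits.append(step_logits)
            # Greedy decoding of the current slot label; during training one
            # would typically feed the gold previous label (teacher forcing).
            prev_label = step_logits.argmax(dim=-1)
        return torch.cat(logits, dim=0)  # (seq_len, num_labels)
```

Calling the module on a tensor of word indices, e.g. model(torch.tensor([4, 17, 8])), returns per-token slot logits. Whether to feed the gold or the predicted previous label at training time is one of the design choices that label-dependency models of this kind have to address.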
