Building recurrent networks by unfolding iterative thresholding for sequential sparse recovery

ICASSP, pp. 4346–4350, 2017.


Abstract:

Historically, sparse methods and neural networks, particularly modern deep learning methods, have been relatively disparate areas. Sparse methods are typically used for signal enhancement, compression, and recovery, usually in an unsupervised framework, while neural networks commonly rely on a supervised training set. In this paper, we us… [abstract truncated]
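The title refers to unfolding an iterative thresholding algorithm into the layers of a recurrent network. As a hedged illustration only (the abstract is truncated, and the authors' model handles sequential data, which this does not), a minimal LISTA-style unrolling of ISTA for a single measurement might look like the sketch below; the names `soft_threshold` and `unfolded_ista`, and the fixed matrices `W` and `S`, are assumptions for illustration, not the paper's notation:

```python
import numpy as np

def soft_threshold(x, lam):
    # Elementwise soft-thresholding: the proximal operator of the l1 norm.
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def unfolded_ista(y, A, lam=0.1, n_layers=20):
    """Run n_layers unrolled ISTA steps to estimate sparse x with y ≈ A x.

    Each "layer" computes h <- soft(W y + S h, lam / L), which is one ISTA
    iteration; learned-ISTA networks replace the fixed W and S with
    trainable weights shared (or untied) across layers.
    """
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    W = A.T / L                            # input weight (fixed here)
    S = np.eye(A.shape[1]) - (A.T @ A) / L # recurrent weight (fixed here)
    h = np.zeros(A.shape[1])
    for _ in range(n_layers):
        h = soft_threshold(W @ y + S @ h, lam / L)
    return h
```

With trainable `W`, `S`, and threshold, the same loop defines an RNN whose depth equals the number of ISTA iterations, which is the general unfolding idea the title names.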
