Building recurrent networks by unfolding iterative thresholding for sequential sparse recovery
Proc. IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 4346-4350, 2017.
EI indexed.
Abstract:
Historically, sparse methods and neural networks, particularly modern deep learning methods, have been relatively disparate areas. Sparse methods are typically used for signal enhancement, compression, and recovery, usually in an unsupervised framework, while neural networks commonly rely on a supervised training set. In this paper, we use …
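The title refers to unfolding iterative shrinkage-thresholding (ISTA) iterations into a recurrent network. As a rough, hedged illustration of that general idea only, and not of the paper's specific sequential architecture, the sketch below runs a fixed number of ISTA iterations, treating each iteration as one "layer". All function names, variable names, and parameter values here are assumptions for illustration; in a learned (LISTA-style) network the matrices `W`, `V`, and the threshold would become trainable parameters rather than being derived from the dictionary `A`.

```python
# Minimal sketch of unfolding ISTA into a fixed-depth network.
# Hypothetical illustration only; not the paper's architecture.
import numpy as np

def soft_threshold(x, theta):
    """Elementwise soft-thresholding, the proximal operator of the l1 norm."""
    return np.sign(x) * np.maximum(np.abs(x) - theta, 0.0)

def unfolded_ista(y, A, lam=0.1, n_layers=10):
    """Run a fixed number of ISTA iterations (one 'layer' per iteration)."""
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the gradient
    W = np.eye(A.shape[1]) - (A.T @ A) / L   # recurrence weight
    V = A.T / L                              # input weight
    theta = lam / L                          # soft-threshold level
    x = np.zeros(A.shape[1])
    for _ in range(n_layers):
        x = soft_threshold(W @ x + V @ y, theta)
    return x

if __name__ == "__main__":
    # Toy example: recover a 5-sparse vector from 30 random measurements.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((30, 100))
    x_true = np.zeros(100)
    x_true[rng.choice(100, 5, replace=False)] = rng.standard_normal(5)
    y = A @ x_true
    x_hat = unfolded_ista(y, A, lam=0.05, n_layers=200)
    print("recovered support:", np.sort(np.nonzero(np.abs(x_hat) > 1e-2)[0]))
```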