Conditional Neural Processes

ICML, pp. 1690-1699, 2018.


Abstract:

Deep neural networks excel at function approximation, yet they are typically trained from scratch for each new function. On the other hand, Bayesian methods, such as Gaussian Processes (GPs), exploit prior knowledge to quickly infer the shape of a new function at test time. Yet GPs are computationally expensive, and it can be hard to de…
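The two GP properties the abstract contrasts — fast test-time conditioning on a handful of observations, but at a cubic computational cost — can be seen in a minimal sketch of GP posterior inference. This is an illustration only, not the paper's method; the squared-exponential kernel, length scale, and noise level are assumptions chosen for the example.

```python
import numpy as np

def rbf_kernel(a, b, length_scale=1.0):
    # Squared-exponential kernel k(x, x') = exp(-(x - x')^2 / (2 l^2));
    # the kernel choice encodes the GP's prior over function shapes.
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2)

def gp_posterior_mean(x_train, y_train, x_test, noise=1e-6):
    # Condition a zero-mean GP prior on the observed points:
    #   mu_* = K_*n (K_nn + sigma^2 I)^{-1} y
    # The linear solve is O(n^3) in the number of observations,
    # which is the cost the abstract refers to.
    k_nn = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    k_sn = rbf_kernel(x_test, x_train)
    return k_sn @ np.linalg.solve(k_nn, y_train)

# Infer the shape of sin(x) from only five observations.
x_train = np.linspace(0.0, 3.0, 5)
y_train = np.sin(x_train)
x_test = np.array([1.5])
print(gp_posterior_mean(x_train, y_train, x_test))
```

No gradient-based training is needed: conditioning on new data is a single linear-algebra step, which is exactly the test-time flexibility that CNPs aim to reproduce with a neural network trained once across many functions.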
