TY - GEN
T1 - Conditional neural processes
AU - Garnelo, Marta
AU - Rosenbaum, Dan
AU - Maddison, Chris J.
AU - Ramalho, Tiago
AU - Saxton, David
AU - Shanahan, Murray
AU - Teh, Yee Whye
AU - Rezende, Danilo J.
AU - Eslami, S. M. Ali
N1 - Publisher Copyright: © 2018 35th International Conference on Machine Learning, ICML 2018. All rights reserved.
PY - 2018
Y1 - 2018
N2 - Deep neural networks excel at function approximation, yet they are typically trained from scratch for each new function. On the other hand, Bayesian methods, such as Gaussian Processes (GPs), exploit prior knowledge to quickly infer the shape of a new function at test time. Yet GPs are computationally expensive, and it can be hard to design appropriate priors. In this paper we propose a family of neural models, Conditional Neural Processes (CNPs), that combine the benefits of both. CNPs are inspired by the flexibility of stochastic processes such as GPs, but are structured as neural networks and trained via gradient descent. CNPs make accurate predictions after observing only a handful of training data points, yet scale to complex functions and large datasets. We demonstrate the performance and versatility of the approach on a range of canonical machine learning tasks, including regression, classification and image completion.
AB - Deep neural networks excel at function approximation, yet they are typically trained from scratch for each new function. On the other hand, Bayesian methods, such as Gaussian Processes (GPs), exploit prior knowledge to quickly infer the shape of a new function at test time. Yet GPs are computationally expensive, and it can be hard to design appropriate priors. In this paper we propose a family of neural models, Conditional Neural Processes (CNPs), that combine the benefits of both. CNPs are inspired by the flexibility of stochastic processes such as GPs, but are structured as neural networks and trained via gradient descent. CNPs make accurate predictions after observing only a handful of training data points, yet scale to complex functions and large datasets. We demonstrate the performance and versatility of the approach on a range of canonical machine learning tasks, including regression, classification and image completion.
UR - http://www.scopus.com/inward/record.url?scp=85057320262&partnerID=8YFLogxK
M3 - Conference contribution
T3 - 35th International Conference on Machine Learning, ICML 2018
SP - 2738
EP - 2747
BT - 35th International Conference on Machine Learning, ICML 2018
A2 - Dy, Jennifer
A2 - Krause, Andreas
T2 - 35th International Conference on Machine Learning, ICML 2018
Y2 - 10 July 2018 through 15 July 2018
ER -