
This workshop will be hosted by NIPS, a large annual machine learning and computational neuroscience conference.

Integration is the central numerical operation required for Bayesian machine learning (in the form of marginalization and conditioning). Sampling algorithms still abound in this area, although it has long been known that Monte Carlo methods are fundamentally suboptimal. The challenges in developing better-performing integration methods are mostly algorithmic. Indeed, recent algorithms have begun to outperform MCMC and its siblings, in wall-clock time, on realistic problems from machine learning.
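As a concrete illustration of this rate gap (a sketch of our own, not material from the workshop): plain Monte Carlo on a smooth one-dimensional integrand converges at O(N^-1/2), while even a simple deterministic rule such as the trapezoid rule achieves O(N^-2).

```python
import numpy as np

# Estimate the integral of exp(x) over [0, 1] (truth: e - 1) with N
# function evaluations, comparing plain Monte Carlo (O(N^-1/2) error)
# against the deterministic trapezoid rule (O(N^-2) for smooth f).
rng = np.random.default_rng(0)
truth = np.e - 1.0

for n in (10, 100, 1000):
    mc = np.exp(rng.uniform(size=n)).mean()        # Monte Carlo estimate
    grid = np.linspace(0.0, 1.0, n)
    trap = np.trapz(np.exp(grid), grid)            # trapezoid-rule estimate
    print(f"N={n}: MC error {abs(mc - truth):.2e}, "
          f"trapezoid error {abs(trap - truth):.2e}")
```

Each extra digit of accuracy costs Monte Carlo a 100-fold increase in samples, while the deterministic rule needs only a 10-fold increase, and smarter deterministic designs can do better still.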

The workshop will review the existing, by now quite strong, theoretical case against the use of random numbers for integration, discuss recent algorithmic developments and the relationships between conceptual approaches, and highlight central research challenges going forward.

Among the questions to be addressed by the workshop are:

- How fast can a practical integral estimate on a deterministic function converge (polynomially, super-polynomially, not just “better than sqrt(N)”)?
- How are these rates related, precisely, to prior assumptions about the integrand, and to the design rules of the integrator?
- To what degree can the source code of an integration problem be parsed to choose informative priors?
- Are random numbers necessary and helpful for efficient multivariate integration, or are they a conceptual crutch that causes inefficiencies?
- What are the practical challenges in the design of efficient multivariate integration methods that use such prior information?

The workshop builds upon the growing field of probabilistic numerics, for which probabilistic integration is a core component.
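To make the idea concrete, here is a minimal Bayesian quadrature sketch in plain NumPy (an illustration under simplifying assumptions of our own — an RBF kernel and the uniform measure on [0, 1] — not code from any workshop contribution): the integrand gets a Gaussian-process prior, the GP is conditioned on a few evaluations, and the posterior mean is integrated in closed form.

```python
import math
import numpy as np

def bq_estimate(f, nodes, lengthscale=0.2, jitter=1e-10):
    """Bayesian quadrature for the integral of f over [0, 1] under a
    zero-mean GP prior with an RBF kernel (illustrative sketch)."""
    x = np.asarray(nodes, dtype=float)
    # RBF Gram matrix of the evaluation nodes (jitter for stability).
    K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / lengthscale**2)
    K += jitter * np.eye(len(x))
    # Kernel mean embedding z_i = integral of k(t, x_i) over t in [0, 1],
    # available in closed form via the error function.
    c = lengthscale * math.sqrt(math.pi / 2.0)
    s = math.sqrt(2.0) * lengthscale
    z = np.array([c * (math.erf((1.0 - xi) / s) + math.erf(xi / s))
                  for xi in x])
    weights = np.linalg.solve(K, z)    # quadrature weights w = K^{-1} z
    return weights @ f(x)              # estimate = sum_i w_i f(x_i)

estimate = bq_estimate(np.exp, np.linspace(0.0, 1.0, 10))
print(estimate)   # close to e - 1
```

The same construction also yields a posterior variance, i.e. a calibrated error estimate for the integral, which is the distinguishing feature of the probabilistic-numerics viewpoint.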

The workshop is organized by Mike Osborne and Philipp Hennig.

The workshop will be held on Friday, 11 December in room 512a.

- 09:00-09:10: Opening Remarks
- 09:10-09:40: Christian Robert (slides)
- 09:40-10:00: François-Xavier Briol (slides)
- 10:00-10:30: *Coffee break*
- 10:30-11:00: Arthur Gretton (slides)
- 11:00-11:30: Roman Garnett (slides)
- 11:30-11:45: George Papamakarios & Iain Murray, ‘Distilling Intractable Generative Models’ (slides)
- 11:45-12:00: Jan-Peter Calliess, ‘Bayesian Lipschitz Constant Estimation and Quadrature’ (slides)
- 12:00-14:30: *Lunch break*
- 14:30-15:00: Francis Bach (slides)
- 15:00-15:30: David Duvenaud (slides)
- 15:30-16:00: Max Welling (slides)
- 16:00-16:30: *Coffee break*
- 16:30-17:30: Panel discussion

The accepted contributed papers are:

- Jan-Peter Calliess, ‘Bayesian Lipschitz Constant Estimation and Quadrature’
- George Papamakarios & Iain Murray, ‘Distilling Intractable Generative Models’

- Submission deadline: 18:00 GMT, 16 October 2015 (**deadline extended**)
- Notification of acceptance: 5 November 2015

Topics of interest include:

- Bayesian quadrature
- Quadrature rules
- Kernel herding
- Quasi-Monte Carlo
- Convergence diagnostics for MCMC

We welcome theoretical treatments, empirical studies, and applications of the above. The list is not exhaustive, and we also welcome submissions that draw upon closely related topics.
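As a small illustration of the quasi-Monte Carlo topic (our own sketch, using the classic van der Corput construction, not any submitted method): low-discrepancy point sets cover the unit interval more evenly than pseudorandom draws, which is what buys QMC its better-than-sqrt(N) rates on suitable integrands.

```python
import numpy as np

def van_der_corput(n, base=2):
    """First n terms of the base-`base` van der Corput low-discrepancy
    sequence, obtained by reflecting the digits of i about the radix point."""
    points = []
    for i in range(1, n + 1):
        x, denom = 0.0, 1.0
        while i > 0:
            i, digit = divmod(i, base)
            denom *= base
            x += digit / denom
        points.append(x)
    return np.array(points)

# Compare QMC and plain MC on the integral of exp(x) over [0, 1].
truth = np.e - 1.0
n = 512
qmc = np.exp(van_der_corput(n)).mean()                         # quasi-MC
mc = np.exp(np.random.default_rng(0).uniform(size=n)).mean()   # plain MC
print(f"QMC error {abs(qmc - truth):.2e}, MC error {abs(mc - truth):.2e}")
```

Unlike Bayesian quadrature, the QMC estimate comes with no built-in uncertainty quantification, which is one reason the two communities have much to discuss.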

Submissions should be in the NIPS 2015 format, with a maximum of 4 pages (excluding references). Accepted papers will be made available online at the workshop website, and will be presented in a spotlight talk at the workshop itself, but the workshop proceedings can be considered non-archival. Explicitly: shorter versions of relevant papers submitted or published elsewhere are encouraged. Submissions need not be anonymous.

Please mail pdf submissions to probnum@gmail.com.