FCAI Team’s Award-Winning Research at AISTATS 2019

Markus Heinonen receiving the prize

A paper by an FCAI team, “Deep learning with differential Gaussian process flows”, received a Notable Paper Award at the 2019 AI & Statistics conference (AISTATS), one of only three papers to receive the honour out of more than one thousand submissions. The international conference, held over three days in Okinawa, Japan, brought together several hundred AI researchers from around the globe to discuss their work, and FCAI researchers and students were there presenting talks and posters.

The prize-winning paper was written by Pashupati Hegde, Markus Heinonen, Harri Lähdesmäki, and Samuel Kaski, and came out of a collaboration between the research groups of Professor Lähdesmäki and Professor Kaski.

New methods for deep learning

In deep learning, hundreds of successive computations are combined to learn very complex tasks. This is how computers and phones now recognize faces in images or translate languages. In the new paper by the FCAI team, the stack of discrete computations is replaced with a continuous, transforming flow of the inputs, which is used to perform the learning task in a way that is easier to interpret. The work also establishes a new connection between deep learning and a class of mathematical models called stochastic dynamical systems. Thanks to this connection, the new method can quantify how much uncertainty there is in its predictions, something common neural networks struggle with. This handling of uncertainty means the method excels at learning from smaller amounts of data, which is potentially useful for future applications such as personalized medicine or drug design.
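As a rough illustration of the idea (not the authors’ implementation), the sketch below transforms inputs with a simple stochastic flow, dx_t = f(x_t) dt + g(x_t) dW_t, simulated with the Euler–Maruyama scheme. In the paper the drift f and diffusion g are governed by Gaussian processes; here they are hypothetical placeholder functions. Repeating the simulation gives a spread of transformed inputs, which is one way to see where the uncertainty estimates come from.

    import numpy as np

    # Conceptual sketch only: inputs are pushed through a stochastic flow
    #   dx_t = f(x_t) dt + g(x_t) dW_t
    # In the actual paper, f and g come from Gaussian processes; the
    # functions below are hypothetical placeholders for illustration.

    def drift(x):                       # placeholder drift f(x)
        return -x                       # pull points toward the origin

    def diffusion(x):                   # placeholder diffusion g(x)
        return 0.1 * np.ones_like(x)    # constant noise level

    def flow(x, T=1.0, steps=20, rng=None):
        """Euler-Maruyama simulation of the stochastic flow over time T."""
        rng = np.random.default_rng() if rng is None else rng
        dt = T / steps
        for _ in range(steps):
            dW = rng.normal(scale=np.sqrt(dt), size=x.shape)
            x = x + drift(x) * dt + diffusion(x) * dW
        return x

    # Repeated simulations give a distribution over transformed inputs,
    # whose spread can be read as predictive uncertainty.
    x0 = np.array([[1.0, -2.0]])
    samples = np.stack([flow(x0, rng=np.random.default_rng(s)) for s in range(100)])
    print("mean:", samples.mean(axis=0))
    print("std: ", samples.std(axis=0))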

Researchers from FCAI also presented the following talks and posters:

Talks

  • Deep learning with differential Gaussian process flows

    • Pashupati Hegde, Markus Heinonen, Harri Lähdesmäki, Samuel Kaski

  • Nonlinear ICA Using Auxiliary Variables and Generalized Contrastive Learning

    • Aapo Hyvärinen

Posters

  • Analysis of Network Lasso for Semi-Supervised Regression

    • Alexander Jung, Natalia Vesselinova

  • Variable selection for Gaussian processes via sensitivity analysis of the posterior predictive distribution

    • Topi Paananen, Juho Piironen (Curious AI), Michael Andersen, Aki Vehtari

  • Know Your Boundaries: Constraining Gaussian Processes by Variational Harmonic Features

    • Arno Solin  

  • Harmonizable mixture kernels with variational Fourier features

    • Zheyang Shen, Markus Heinonen, Samuel Kaski  

  • On Structure Priors for Learning Bayesian Networks

    • Jussi Viinikka, Aleksis Vuoksenmaa, Mikko Koivisto

  • Estimation of Non-Normalized Mixture Models

    • Takeru Matsuda, Aapo Hyvärinen