Title: Bayesian deep learning: Insights in the Bayesian paradigm for deep learning
Author: Schipper, Wieger (TU Delft Electrical Engineering, Mathematics and Computer Science; TU Delft Statistics)
Contributors: van der Vaart, A.W. (mentor); Heinlein, A. (graduation committee)
Degree granting institution: Delft University of Technology
Programme: Applied Mathematics | Stochastics
Date: 2023-08-30

Abstract: In this thesis, we study a particle method for Bayesian deep learning. In particular, we look at the estimation of the parameters of an ensemble of Bayesian neural networks by means of this particle method, called Stein variational gradient descent (SVGD). This method iteratively updates a collection of parameters, choosing its update directions so that they optimally decrease the Kullback-Leibler divergence. We also study gradient flows of probability measures and show how gradient flows corresponding to functionals on the space of probability measures can induce particle flows, and we formulate SVGD as a method in this space. In the infinite-particle regime we prove convergence results for SVGD. An existing convergence result for SVGD can be extended by showing that the probability measures governing the collection of SVGD particles are uniformly tight, and we give conditions under which this holds.

Subject: Stein variational gradient descent; Bayesian deep learning; Wasserstein gradient flows
To reference this document use: http://resolver.tudelft.nl/uuid:a1af7799-3e0a-4028-8470-b7fc0dc1c4fd
Part of collection: Student theses
Document type: master thesis
Rights: © 2023 Wieger Schipper
Files: MSc_Thesis_WR_Schipper.pdf (PDF, 8.73 MB)
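The SVGD update described in the abstract can be sketched in a few lines. The following is a minimal illustration, not the thesis's implementation: it uses an RBF kernel with a fixed bandwidth `h` and a standard Gaussian target, both of which are assumptions made here for demonstration only. Each particle moves along phi(x_i) = (1/n) * sum_j [k(x_j, x_i) * grad log p(x_j) + grad_{x_j} k(x_j, x_i)], the direction that optimally decreases the Kullback-Leibler divergence to the target within the kernel's function space.

```python
import numpy as np

def rbf_kernel(x, h):
    """RBF kernel matrix and its gradient for particles x of shape (n, d)."""
    diff = x[:, None, :] - x[None, :, :]       # diff[j, i] = x_j - x_i, shape (n, n, d)
    sq = np.sum(diff ** 2, axis=-1)            # pairwise squared distances, (n, n)
    K = np.exp(-sq / h)                        # kernel matrix k(x_j, x_i)
    grad_K = -2.0 / h * diff * K[:, :, None]   # grad_{x_j} k(x_j, x_i), shape (n, n, d)
    return K, grad_K

def svgd_step(x, grad_logp, step, h=1.0):
    """One SVGD iteration: driving term (kernel-weighted score) plus repulsive term."""
    n = x.shape[0]
    K, grad_K = rbf_kernel(x, h)
    g = grad_logp(x)                           # score grad log p at each particle, (n, d)
    phi = (K @ g + grad_K.sum(axis=0)) / n     # optimal KL-decreasing direction
    return x + step * phi

# Illustrative run: particles initialised far from a standard normal target
# (target choice, step size, and bandwidth are ad hoc assumptions).
rng = np.random.default_rng(0)
x = rng.normal(5.0, 1.0, size=(50, 1))
for _ in range(500):
    x = svgd_step(x, lambda x: -x, step=0.1)   # grad log p(x) = -x for N(0, 1)
```

After the loop the particle cloud approximates the target: the attraction term pulls particles toward high-density regions while the kernel-gradient term keeps them spread apart, which is what distinguishes SVGD from running independent gradient ascent on log p.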