Title: Mixed Integer (Non-) Linear Programming Formulations of Graph Neural Networks
Author: Mc Donald, Tom (TU Delft Electrical Engineering, Mathematics and Computer Science)
Contributor: Schweidtmann, A.M. (mentor); Yorke-Smith, N. (graduation committee)
Degree granting institution: Delft University of Technology
Programme: Applied Mathematics
Date: 2022-11-11
Abstract: Recently, ReLU neural networks have been modelled as constraints in mixed integer linear programming (MILP), enabling surrogate-based optimisation in various domains as well as efficient solution of machine learning verification problems. However, previous work has been limited to multilayer perceptrons (MLPs). The Graph Convolutional Neural Network (GCN) model and the GraphSAGE model can learn efficiently from non-Euclidean data structures. We propose a bilinear formulation for ReLU GCNs and an MILP formulation for ReLU GraphSAGE models. We compare our formulations to a Genetic Algorithm (GA) in terms of solution times and optimality gaps while modelling a dataset of boiling points of different molecules. Our method guarantees that optimisation problems with trained GNNs embedded are solved to global optimality. Of the two formulations, the GraphSAGE model achieves similar model accuracy but faster solving times when embedded as a surrogate model in an MILP problem. Finally, we present a computer-aided molecular design (CAMD) case study in which the formulations of the trained GNNs are used to find molecules with optimal boiling points.
Subject: Graph neural networks; MILP; Optimization
To reference this document use: http://resolver.tudelft.nl/uuid:b1d7ce2f-f773-4593-ac72-6e086c4d2d11
Part of collection: Student theses
Document type: master thesis
Rights: © 2022 Tom Mc Donald
Files: PDF, TomMcDonaldThesis.pdf, 18.21 MB
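The abstract refers to modelling ReLU networks as MILP constraints. As a minimal illustration of that idea (the generic big-M encoding commonly used for trained ReLU networks, not necessarily the exact constraints derived in this thesis), a single ReLU activation y = max(0, x) with known pre-activation bounds L ≤ x ≤ U can be encoded with one binary variable z; the function and bounds here are illustrative assumptions:

```python
# Generic big-M MILP encoding of a ReLU unit y = max(0, x), given
# bounds L <= x <= U with L < 0 < U and a binary indicator z.
# This is a sketch of the standard encoding, not the thesis's formulation.

def relu_bigM_satisfied(x, y, z, L, U, tol=1e-9):
    """Check the big-M constraint set:
       y >= x,  y >= 0,  y <= x - L*(1 - z),  y <= U*z,  z in {0, 1}.
    With z = 1 the constraints force y = x (active unit);
    with z = 0 they force y = 0 (inactive unit)."""
    return (z in (0, 1)
            and y >= x - tol
            and y >= -tol
            and y <= x - L * (1 - z) + tol
            and y <= U * z + tol)

def relu(x):
    return max(0.0, x)

L, U = -5.0, 5.0  # assumed pre-activation bounds for illustration

# y = relu(x) with z = 1 iff x > 0 satisfies the constraints everywhere.
for x in [-4.2, -1.0, 0.0, 0.5, 3.7]:
    z = 1 if x > 0 else 0
    assert relu_bigM_satisfied(x, relu(x), z, L, U)

# A wrong output value violates the constraint set (y >= x fails here).
assert not relu_bigM_satisfied(2.0, 0.0, 0, L, U)
```

Embedding a full trained network then amounts to stacking one such constraint group per neuron, with linear equalities linking each layer's outputs to the next layer's pre-activations; the tightness of the bounds L and U strongly affects MILP solve times.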