Professor Dr Max Welling is a full professor and research chair in machine learning at the University of Amsterdam, where he directs the research group AMLAB. He is also a Distinguished Scientist at MSR. He is a fellow of the Canadian Institute for Advanced Research (CIFAR) and of the European Laboratory for Learning and Intelligent Systems (ELLIS), where he serves on the founding board. His previous appointments include VP at Qualcomm Technologies, professor at UC Irvine, postdoc at the University of Toronto and UCL under the supervision of prof. Geoffrey Hinton, and postdoc at Caltech under the supervision of prof. Pietro Perona. He completed his PhD in theoretical high energy physics under the supervision of Nobel laureate prof. Gerard 't Hooft.
Max Welling served as associate editor in chief of IEEE TPAMI from 2011 to 2015, has served on the advisory board of the NeurIPS Foundation since 2015, and was program chair and general chair of NeurIPS in 2013 and 2014, respectively. He was also program chair of AISTATS in 2009 and of ECCV in 2016, and general chair and co-founder of MIDL 2018. Max Welling is the recipient of the ECCV Koenderink Prize in 2010 and the ICML Test of Time Award in 2021. In addition to directing the Amsterdam Machine Learning Lab (AMLAB), he co-directs the Qualcomm-UvA deep learning lab (QUVA) and the Bosch-UvA Deep Learning lab (DELTA).
Joint work with Johannes Brandstetter and Daniel Worrall.
Deep learning has seen remarkable advances over the past years, completely replacing traditional methods in fields such as speech recognition, natural language processing, and image and video analysis. A particularly versatile deep architecture that has gained much traction lately is the graph neural network (GNN), of which transformers represent a special case. GNNs have the desirable property that they can process graph-structured data while respecting permutation symmetry. Recently, GNNs have found new applications in scientific computation, for instance to predict the properties of molecules or to predict the forces that act on atoms as molecules evolve (e.g. fold). In these applications it is also key that geometric symmetries, such as translation and rotation symmetries, are taken into account. Professor Max Welling will report on yet another exciting application: using GNNs to solve partial differential equations (PDEs). It turns out that GNNs are an excellent tool for developing neural PDE integrators. Moreover, PDEs are full of surprising symmetries that can be leveraged to train neural integrators with less data. Professor Max Welling will discuss this very exciting new chapter in deep learning. He will end with a discussion of whether, conversely, PDEs can also serve as a model for new deep architectures.
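The permutation symmetry mentioned above can be made concrete with a minimal sketch (an illustration only, not the architecture from the talk; the function and weight names are hypothetical): a single message-passing step in which every node aggregates the mean of its neighbours' features through shared weight matrices. Because all nodes are treated identically, relabelling the nodes before the layer gives the same result as relabelling after it.

```python
import numpy as np

def message_passing_step(X, A, W_self, W_neigh):
    """One illustrative GNN layer: h_i = relu(W_self x_i + W_neigh * mean of neighbour features).

    X: (n, d) node features; A: (n, n) symmetric 0/1 adjacency matrix.
    Shared weights mean the layer is equivariant to node permutations.
    """
    deg = A.sum(axis=1, keepdims=True).clip(min=1)  # node degrees, avoiding division by zero
    neigh_mean = (A @ X) / deg                      # mean over each node's neighbours
    return np.maximum(X @ W_self + neigh_mean @ W_neigh, 0.0)

# Permutation-equivariance check on a random graph.
rng = np.random.default_rng(0)
n, d = 5, 3
X = rng.normal(size=(n, d))
A = (rng.random((n, n)) < 0.4).astype(float)
A = np.maximum(A, A.T)                              # make the graph undirected
W_self, W_neigh = rng.normal(size=(d, d)), rng.normal(size=(d, d))

P = np.eye(n)[rng.permutation(n)]                   # random permutation matrix
out = message_passing_step(X, A, W_self, W_neigh)
out_perm = message_passing_step(P @ X, P @ A @ P.T, W_self, W_neigh)
print(np.allclose(out_perm, P @ out))               # True: permuting inputs permutes outputs
```

This is the sense in which GNNs "respect permutation symmetry": the layer commutes with any relabelling of the nodes, so no information about node ordering is learned or needed.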