Supervision: Adám Gosztolai
Semester project (master)
Background: Networks are powerful mathematical objects that are widely used to model systems composed of interacting constituents, such as those appearing in neuroscience, computational biology, and finance. Networks are also used to approximate curved surfaces (e.g., meshes), which are highly relevant to computer vision and physics. The intersection of networks (graphs) and neural networks in machine learning has been particularly transformative in recent years, through the ability to define convolutions on networks and thereby generalise CNNs to handle discrete or non-Euclidean data.
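For concreteness, one common way to define a convolution on a graph is the symmetrically normalised propagation rule popularised by graph convolutional networks (GCNs). The numpy sketch below is purely illustrative and not part of the project specification; the function name and ReLU nonlinearity are my choices:

```python
import numpy as np

def gcn_layer(A, X, W):
    """One illustrative graph-convolution layer.

    A : (n, n) adjacency matrix of the graph
    X : (n, f) node feature matrix
    W : (f, h) learnable weight matrix

    Uses the standard propagation A_hat = D^{-1/2}(A + I)D^{-1/2},
    then returns ReLU(A_hat X W): each node averages features over
    its neighbourhood (plus itself) before the linear map.
    """
    A_hat = A + np.eye(A.shape[0])          # add self-loops
    d = A_hat.sum(axis=1)                   # degrees after self-loops
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))  # D^{-1/2}
    return np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ X @ W, 0.0)
```

The key point for this project is that the neighbourhood in this rule is fixed by the adjacency matrix at a single scale, which is exactly the limitation discussed below.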
The project: One of the difficulties in defining convolutions on graphs is that graphs are often multi-scale. An example arises in neuroscience, where interactions can be defined between neurons on one scale or between brain regions on another. To overcome this, your task will be to define a scale-dependent graph convolution using graph curvature. Essentially, curvature allows you to approximate the graph locally at a given scale, and thereby to bias which nodes should share information.
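To give a flavour of the idea (not the project's prescribed method), one simple discrete curvature is the Forman curvature of an edge, which for an unweighted graph without triangle terms reduces to 4 - deg(u) - deg(v). The sketch below, with hypothetical function names and a sigmoid gate of my own choosing, shows how such a curvature could reweight which neighbours exchange messages, with a temperature tau playing the role of the scale:

```python
import numpy as np

def forman_curvature(A):
    """Forman-Ricci curvature of each edge of an unweighted graph.

    In the simplest form (ignoring triangle contributions) the curvature
    of edge (u, v) is 4 - deg(u) - deg(v): edges between high-degree
    hubs are strongly negatively curved. Returns an (n, n) matrix with
    zeros where there is no edge.
    """
    deg = A.sum(axis=1)
    F = 4.0 - deg[:, None] - deg[None, :]
    return np.where(A > 0, F, 0.0)

def curvature_gated_conv(A, X, W, tau=1.0):
    """Illustrative curvature-biased message passing.

    Each edge is reweighted by a sigmoid of its curvature divided by
    tau, so tau acts as a scale parameter biasing which neighbours
    share information; messages are then degree-normalised.
    """
    F = forman_curvature(A)
    gate = np.where(A > 0, 1.0 / (1.0 + np.exp(-F / tau)), 0.0)
    norm = gate.sum(axis=1, keepdims=True) + 1e-9  # avoid divide-by-zero
    return (gate / norm) @ X @ W
```

Other discrete curvatures (e.g., Ollivier-Ricci) and other ways of injecting the scale are equally natural starting points; choosing and justifying one is part of the project.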
Outcome: Your aim will be to explore the role of multi-scale geometry in graph convolutional networks. You will develop a neural network that uses curvature to parametrise the space in which the network lies and to tune the convolution operators. Time permitting, you will deploy this network to derive new insight from datasets arising in the physical sciences (to be discussed depending on your interest). This project can lead to a publication.
Your background: I am looking for someone with a quantitative background (maths, physics, CS, EE, etc.). Programming experience is desirable, and mathematical aptitude is advantageous. If you enjoy getting lost to find new ideas, then this project is for you. If you just want to get some ML experience with off-the-shelf tools, then maybe look elsewhere.
Supervisor: I am an applied mathematician with extensive experience in computational biology and neuroscience. Click here to see my previous work.