Monthly Archive: January 2016

New paper: Stationary signal processing on graphs

I’m proud to present a new paper. Using the ideas it introduces, we should be able to improve many graph-based models.


Graphs are a central tool in machine learning and information processing as they conveniently capture the structure of complex datasets. In this context, it is of high importance to develop flexible models of signals defined over graphs or networks. In this paper, we generalize the traditional concept of wide-sense stationarity to signals defined over the vertices of arbitrary weighted undirected graphs. We show that stationarity is intimately linked to statistical invariance under a localization operator reminiscent of translation. We prove that stationary graph signals are characterized by a well-defined Power Spectral Density that can be efficiently estimated even for large graphs. We leverage this new concept to derive Wiener-type estimation procedures for noisy and partially observed signals, and illustrate the performance of this new model for denoising and regression.
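The core idea can be sketched in a few lines of NumPy: a graph signal is (wide-sense) stationary when its covariance is diagonalized by the graph Fourier basis, so its Power Spectral Density can be estimated by averaging squared graph Fourier coefficients over realizations. This is only an illustrative brute-force version (the paper's point is that the PSD can be estimated without a full eigendecomposition); the ring graph, the low-pass filter `h`, and the number of realizations are arbitrary choices for the example.

```python
import numpy as np

N = 8
W = np.zeros((N, N))
for i in range(N):                     # ring graph with unit edge weights
    W[i, (i + 1) % N] = W[(i + 1) % N, i] = 1.0

L = np.diag(W.sum(axis=1)) - W         # combinatorial Laplacian L = D - W
lam, U = np.linalg.eigh(L)             # columns of U = graph Fourier basis

h = 1.0 / (1.0 + lam)                  # an illustrative low-pass graph filter
true_psd = h ** 2                      # PSD of white noise filtered by h

rng = np.random.default_rng(0)
n_real = 5000
psd_est = np.zeros(N)
for _ in range(n_real):
    w = rng.standard_normal(N)         # white noise on the vertices
    x = U @ (h * (U.T @ w))            # a stationary signal: filtered noise
    psd_est += (U.T @ x) ** 2          # squared graph Fourier coefficients
psd_est /= n_real                      # converges to true_psd as n_real grows
```

The estimate `psd_est` matches `h**2` up to Monte-Carlo error, which is the statistical signature of stationarity that the Wiener-type estimators in the paper build on.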


A starter kit for Deep Learning


  1. A MOOC from Geoffrey Hinton, one of the fathers of deep learning


Blog posts

Selected software
The three main tools are:

  1. The classic Python library.
    Tutorials found in
  2. The other main contender (the one I started with).
    Torch has the advantage of being interfaced with Lua, which offers a simple way to create neural nets. Recently, PyTorch has gained a lot of attention.
  3. TensorFlow, the newcomer from Google.

As a recommendation, I would advise PyTorch or TensorFlow.
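Since the list names tools without showing what they do, here is a minimal sketch of the computation these frameworks automate: the forward pass and, above all, the backward pass (gradients) of a small network. Plain NumPy stands in for a framework; the XOR task, layer sizes, learning rate, and step count are arbitrary illustrative choices.

```python
import numpy as np

# Toy dataset: XOR, the classic non-linearly-separable problem.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(42)
W1 = rng.standard_normal((2, 8)) * 0.5   # hidden layer: 8 units (arbitrary)
b1 = np.zeros(8)
W2 = rng.standard_normal((8, 1)) * 0.5
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for step in range(5000):
    # Forward pass.
    h = np.tanh(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)
    # Backward pass, written by hand: exactly the bookkeeping that
    # Torch/PyTorch/TensorFlow autodiff does for you.
    dp = (p - y) / len(X)                # gradient of cross-entropy wrt logits
    dW2 = h.T @ dp
    db2 = dp.sum(axis=0)
    dh = dp @ W2.T * (1 - h ** 2)        # tanh derivative
    dW1 = X.T @ dh
    db1 = dh.sum(axis=0)
    # Gradient-descent update.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2
```

After training, thresholding `p` at 0.5 recovers the XOR labels. In PyTorch or TensorFlow, everything between the forward pass and the update is replaced by a single automatic-differentiation call.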

MATLAB is not a very appropriate language for deep learning. However, it can be interesting to use for learning purposes.


Publications (To be done)