EURASIP Seminar on Sparsely Connected Deep Networks by Prof R. Gribonval
As part of the SimulaMet ML Seminar Series, Professor Rémi Gribonval from INRIA and the University of Rennes, France, a 2018 EURASIP Fellow, will deliver a EURASIP seminar on “Approximation with sparsely connected deep networks”. The talk will take place between 2pm and 3pm at Pilestredet 48 (room P168) on Monday, October 29, 2018. Everyone interested is invited to attend.

Approximation with sparsely connected deep networks


Many of the data analysis and processing pipelines that have been carefully engineered by generations of mathematicians and practitioners can in fact be implemented as deep networks. Allowing the parameters of these networks to be automatically trained (or even randomized) makes it possible to revisit certain classical constructions.

The talk first describes an empirical approach to approximating a given matrix by a fast linear transform through numerical optimization. The main idea is to write fast linear transforms as products of a few sparse factors, and to iteratively optimize over the factors. This corresponds to training a sparsely connected, linear, deep neural network. Learning algorithms exploiting iterative hard-thresholding have been shown to perform well in practice, a striking example being their ability to somehow “reverse engineer” the fast Hadamard transform. Yet, developing a solid understanding of their conditions of success remains an open challenge.
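The flavor of this idea can be conveyed by a minimal sketch: alternating projected gradient descent on two sparse factors, where each projection is a hard-thresholding step. This is a deliberate simplification of the actual multi-factor, hierarchical algorithm of Le Magoarou and Gribonval (the function names, the two-factor restriction, and the random initialization are assumptions made here for illustration, not the method presented in the talk):

```python
import numpy as np

def hard_threshold(M, k):
    """Project M onto matrices with at most k nonzeros:
    keep the k largest-magnitude entries, zero out the rest."""
    k = min(k, M.size)
    out = np.zeros_like(M)
    idx = np.argpartition(np.abs(M), -k, axis=None)[-k:]
    out.ravel()[idx] = M.ravel()[idx]
    return out

def sparse_factorize(A, k1, k2, n_iter=500, seed=0):
    """Approximate A ~ S1 @ S2, with at most k1 nonzeros in S1 and
    k2 in S2, by alternating projected gradient descent: each
    half-step is a gradient step on ||S1 @ S2 - A||_F^2 / 2
    followed by hard thresholding."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    r = min(m, n)  # inner dimension of the factorization
    S1 = hard_threshold(rng.standard_normal((m, r)) / np.sqrt(r), k1)
    S2 = hard_threshold(rng.standard_normal((r, n)) / np.sqrt(r), k2)
    for _ in range(n_iter):
        # gradient step in S1 with step size 1/L, then project;
        # L = ||S2||_2^2 bounds the Lipschitz constant of the gradient
        step = 1.0 / (np.linalg.norm(S2, 2) ** 2 + 1e-12)
        S1 = hard_threshold(S1 - step * (S1 @ S2 - A) @ S2.T, k1)
        # symmetric half-step in S2
        step = 1.0 / (np.linalg.norm(S1, 2) ** 2 + 1e-12)
        S2 = hard_threshold(S2 - step * S1.T @ (S1 @ S2 - A), k2)
    return S1, S2
```

With the conservative step size 1/L, each half-step is a proximal-gradient step and the Frobenius error is non-increasing; as the abstract notes, however, when and why such schemes succeed in recovering a genuinely fast transform is an open question.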

In a second part, we study the expressivity of sparsely connected deep networks. Measuring a network's complexity by its number of connections, we consider the class of functions whose error of best approximation with networks of a given complexity decays at a certain rate. Using classical approximation theory, we show that this class can be endowed with a norm that makes it a nice function space, called an approximation space. We establish that the presence of certain “skip connections” has no impact on the approximation space, and discuss the role of the network's nonlinearity (also known as activation function) on the resulting spaces, as well as the benefits of depth. For the popular ReLU nonlinearity (as well as its powers), we relate the newly identified spaces to classical Besov spaces, which have a long history as image models associated to sparse wavelet decompositions. The sharp embeddings that we establish highlight how depth enables sparsely connected networks to approximate functions of increased “roughness” (decreased Besov smoothness) compared to shallow networks and wavelets. Joint work with Luc Le Magoarou (Inria), Gitta Kutyniok (TU Berlin), Morten Nielsen (Aalborg University) and Felix Voigtlaender (KU Eichstätt).
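For readers unfamiliar with approximation spaces, the construction alluded to above can be sketched as follows (the normalization is an assumption here, following the classical approximation-space definition from approximation theory rather than the talk itself). Writing $\Sigma_n$ for the set of functions realized by networks with at most $n$ connections, one sets

```latex
E_n(f)_X := \inf_{g \in \Sigma_n} \| f - g \|_X ,
\qquad
\| f \|_{A^\alpha_q(X)} :=
\Big( \sum_{n \ge 1} \tfrac{1}{n}\,
      \big[ n^{\alpha} E_{n-1}(f)_X \big]^q \Big)^{1/q} ,
```

with the usual supremum modification when $q = \infty$. The approximation space $A^\alpha_q(X)$ then collects the $f \in X$ for which this quantity is finite, i.e. those whose best approximation error decays roughly like $n^{-\alpha}$ as the connection budget $n$ grows.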

Short bio:

Rémi Gribonval is a Research Director (Directeur de Recherche) with INRIA in Rennes, France, and the scientific leader of the PANAMA research group on sparse audio processing. In 2011, he was awarded the Blaise Pascal Award of the GAMNI-SMAI by the French Academy of Sciences, as well as a Starting Grant from the European Research Council. He is an IEEE Fellow and a EURASIP Fellow. He founded the series of international workshops SPARS on Signal Processing with Adaptive/Sparse Representations. Since 2002, he has been the coordinator of several national, bilateral and European research projects. He has been a member of the IEEE SPTM Technical Committee and of the SPARS steering committee. Rémi Gribonval was a student at École Normale Supérieure, Paris, from 1993 to 1997. He received the Ph.D. degree in applied mathematics from the University of Paris-IX Dauphine, Paris, France, in 1999, and his Habilitation à Diriger des Recherches in applied mathematics from the University of Rennes I, Rennes, France, in 2007.