Publications & Talks

Publications

Rethinking Parameter Counting in Deep Models: Effective Dimensionality Revisited

Pre-print, 2020

We show that many of the properties of generalization become understandable when viewed through the lens of effective dimensionality, which measures the dimensionality of the parameter space determined by the data. We relate effective dimensionality to posterior contraction in Bayesian deep learning, model selection, width-depth tradeoffs, double descent, and functional diversity in loss surfaces, leading to a richer understanding of the interplay between parameters and functions in deep models.

Download here
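The notion of effective dimensionality summarized above can be made concrete with a short sketch. Assuming the common definition used in this line of work, N_eff(H, z) = Σ_i λ_i / (λ_i + z), where λ_i are eigenvalues of the Hessian of the loss and z > 0 is a regularization constant, the function name and example eigenvalues below are illustrative, not from the paper:

```python
import numpy as np

def effective_dimensionality(eigenvalues, z=1.0):
    """Effective dimensionality N_eff(H, z) = sum_i lam_i / (lam_i + z).

    Eigenvalues much larger than z contribute ~1 (directions the data
    determines); eigenvalues much smaller than z contribute ~0
    (directions left undetermined by the data).
    """
    lam = np.asarray(eigenvalues, dtype=float)
    return float(np.sum(lam / (lam + z)))

# Hypothetical Hessian spectrum: two large eigenvalues dominate,
# so the effective dimensionality is close to 2 even though the
# raw parameter count here is 5.
lam = np.array([100.0, 50.0, 0.01, 0.001, 0.0])
print(effective_dimensionality(lam, z=1.0))
```

This illustrates the paper's central distinction: the number of parameters determined by the data (here roughly 2) can be far smaller than the raw parameter count.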

Talks