The Local Learning Coefficient: A Singularity-Aware Complexity Measure
Authors
Edmund Lau
University of Melbourne
Zach Furman
Timaeus
George Wang
Timaeus
Daniel Murfet
University of Melbourne
Susan Wei
University of Melbourne
Publication Details
Published:
August 23, 2023
Venue:
AISTATS 2025
Abstract
Deep neural networks (DNNs) are singular statistical models which exhibit complex degeneracies. In this work, we illustrate how a quantity known as the learning coefficient, introduced in singular learning theory, precisely quantifies the degree of degeneracy in deep neural networks. Importantly, we demonstrate that degeneracy in DNNs cannot be accounted for by simply counting the number of 'flat' directions.
Research Details
Main contributions:
- The local learning coefficient (LLC) is theoretically well-defined. Watanabe earlier introduced the global learning coefficient; this paper introduces a local variant and shows that it is well-defined.
- The LLC can be estimated. The paper shows that the LLC can be estimated using SGLD-based posterior sampling combined with a Gaussian localization term.
- The estimated LLC respects ordinality. Given ground-truth knowledge that one model is more complex than another, the estimated LLCs respect this ordering.
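The SGLD-based estimation in the second bullet can be sketched for a toy one-parameter model. This is a minimal illustration, not the paper's implementation: the step size, localization strength, and number of steps are assumed illustrative values, and for simplicity it uses full-batch gradients (plain Langevin dynamics) rather than minibatch SGLD.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: fit a single location parameter w to data y_i ~ N(0, 1).
# Empirical loss L_n(w) = (1/n) * sum_i (y_i - w)^2. This model is regular
# (one non-degenerate parameter), so the true learning coefficient is d/2 = 0.5.
n = 1000
y = rng.normal(0.0, 1.0, size=n)

def L_n(w):
    return np.mean((y - w) ** 2)

def grad_L_n(w):
    return 2.0 * (w - y.mean())

w_star = y.mean()          # local minimizer of L_n
beta = 1.0 / np.log(n)     # inverse temperature 1/log n
gamma = 1.0                # Gaussian localization strength (illustrative)
eps = 1e-4                 # Langevin step size (illustrative)
steps, burn_in = 6000, 1000

# Langevin dynamics targeting the localized tempered posterior
#   p(w) ∝ exp(-n*beta*L_n(w) - (gamma/2)*(w - w_star)^2).
w = w_star
losses = []
for t in range(steps):
    drift = n * beta * grad_L_n(w) + gamma * (w - w_star)
    w = w - 0.5 * eps * drift + np.sqrt(eps) * rng.normal()
    if t >= burn_in:
        losses.append(L_n(w))

# LLC estimator: lambda_hat = n*beta*(posterior mean of L_n - L_n(w_star)).
lambda_hat = n * beta * (np.mean(losses) - L_n(w_star))
print(f"estimated LLC: {lambda_hat:.3f}  (true value 0.5 for this regular model)")
```

Because the loss here is exactly quadratic, the localized posterior is Gaussian and the estimator should land near the regular-model value of 0.5; degenerate models would yield a smaller LLC than naive parameter counting suggests.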
See the accompanying distillation.