Colloquium: Prof. Greg Ongie
A Function Space View of Neural Networks
Prof. Greg Ongie
Assistant Professor
Marquette University
Many mathematical analyses of deep learning focus on how neural network (NN) parameters evolve during training. A complementary perspective is to view NN training as fitting a function belonging to a function space implicitly defined by the architecture and training procedure. In particular, when parameter norms are explicitly or implicitly constrained, NNs exhibit a bias toward functions with low “representation cost,” defined as the minimal parameter norm required to realize the function with a given NN architecture. This talk surveys recent results that characterize the representation cost of shallow NN architectures in terms of Banach space norms, and of deeper NN architectures through non-linear notions of function rank. Finally, we discuss how the bias toward low representation cost functions helps to explain generalization in various applications.
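For readers unfamiliar with the term, the representation cost referred to in the abstract is commonly formalized as a constrained minimization over network parameters (a standard formulation in this literature; the specific norm and architecture class are assumptions, not details taken from the talk):

```latex
% Representation cost of a function f with respect to an
% architecture class realizing functions f_\theta:
%   minimize the (squared Euclidean) parameter norm over all
%   parameter settings \theta that exactly realize f.
\[
  R(f) \;=\; \min_{\theta}\; \bigl\{\, \|\theta\|^{2} \;:\; f_{\theta} = f \,\bigr\},
\]
% where f_\theta denotes the function computed by the network
% with parameters \theta, and the minimum is taken over all
% admissible widths/depths of the given architecture family.
```

Under this formulation, a training procedure that constrains or regularizes \(\|\theta\|\) implicitly prefers functions \(f\) with small \(R(f)\), which is the "bias toward low representation cost" the abstract describes.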
