Probabilistic Generative Neural Priors For Enhanced Generalization and Regularization

Doctoral Candidate Name: 
Akarsh Pokkunuru
Program: 
Computing and Information Systems
Abstract: 

Learning continuous functions parameterized by neural networks has emerged as a powerful paradigm for representing complex, high-dimensional data, offering benefits such as shift invariance and resolution-independent representations. However, these models struggle when the data are discontinuous or noisy and the underlying problems are non-linear and ill-posed, largely because they cannot capture diverse data characteristics in a unified manner. To overcome these challenges, we introduce Probabilistic Generative Neural Priors, a Bayesian-inspired regularization framework that integrates probabilistic generative models, such as Energy-based Models (EBMs), Score-based Diffusion Models (SBMs), and Variational Autoencoders (VAEs), with task-specific neural networks such as Neural Fields (NFs) and classification models. Our framework leverages generative models as probabilistic priors that supply essential information during training of the inference network, enabling faster and more accurate predictions by directly utilizing the prior's outputs. We validate the approach through extensive experiments on a diverse set of applications, including non-linear physics-based partial differential equation (PDE) inverse problems, linear image inverse problems, physics-based topology optimization, and time-series classification. Across all of these applications, our results show significant improvements in accuracy, convergence speed, generalization, and regularization over existing methods.
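As a schematic illustration of the kind of objective a Bayesian-inspired generative-prior regularization suggests (the notation below, including the inference network f_θ, the weight λ, and the learned prior density p_φ, is our own and does not appear in the abstract), training can be viewed as a MAP-style optimization:

\mathcal{L}(\theta) \;=\; \mathcal{L}_{\mathrm{task}}\!\left(f_\theta;\, \mathcal{D}\right) \;-\; \lambda \,\log p_\phi\!\left(f_\theta\right)

Here the first term measures task fit (for instance, a PDE residual or a data-consistency loss in an image inverse problem), while the second term scores the network's output under a pretrained generative prior (e.g., an EBM's negative energy, a score/diffusion model, or a VAE density bound), pulling solutions toward the learned data distribution.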

Defense Date and Time: 
Friday, August 30, 2024 - 2:00 PM
Defense Location: 
Woodward 335
Committee Chair's Name: 
Razvan Bunescu
Committee Members: 
Amirmohammad Rooshenas, Gabriel Terejanu, Minwoo Lee, Alireza Tabarraei