Continuous Generative Neural Networks: A Wavelet-Based Architecture in Function Spaces

Numer Funct Anal Optim. 2024 Nov 19;46(1):1-44. doi: 10.1080/01630563.2024.2422064. eCollection 2025.

Abstract

In this work, we present and study Continuous Generative Neural Networks (CGNNs), namely, generative models in the continuous setting: the output of a CGNN belongs to an infinite-dimensional function space. The architecture is inspired by DCGAN, with one fully connected layer, several convolutional layers, and nonlinear activation functions. In the continuous L² setting, the dimensions of the spaces of each layer are replaced by the scales of a multiresolution analysis of a compactly supported wavelet. We present conditions on the convolutional filters and on the nonlinearity that guarantee that a CGNN is injective. This theory finds applications to inverse problems, and allows us to derive Lipschitz stability estimates for (possibly nonlinear) infinite-dimensional inverse problems with unknowns belonging to the manifold generated by a CGNN. Several numerical simulations, including signal deblurring, illustrate and validate this approach.
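For intuition, the following is a minimal Python/NumPy sketch of a discretized, DCGAN-style generator of the kind described above, for 1-D signals: a fully connected layer maps the latent code to coefficients at a coarse scale, and each convolutional layer doubles the resolution, mimicking a step up the scales of the multiresolution analysis. All names, sizes, and filters are illustrative assumptions, not the paper's implementation; in particular, the random filters are not required to satisfy the injectivity conditions established in the paper.

    import numpy as np

    rng = np.random.default_rng(0)

    def leaky_relu(x, slope=0.2):
        # Pointwise nonlinearity; strictly monotone, hence injective.
        return np.where(x >= 0, x, slope * x)

    def upsample_conv(x, kernel):
        # Upsample by a factor of 2 (zero insertion), then convolve:
        # a discrete analogue of moving up one scale of the MRA.
        up = np.zeros(2 * x.size)
        up[::2] = x
        return np.convolve(up, kernel, mode="same")

    def cgnn_generator(z, w_fc, kernels):
        # Fully connected layer: latent code -> coarsest-scale coefficients.
        x = leaky_relu(w_fc @ z)
        # Convolutional layers: each one refines the signal by one scale.
        for k in kernels:
            x = leaky_relu(upsample_conv(x, k))
        return x

    latent_dim, coarse_dim = 8, 16
    w_fc = rng.standard_normal((coarse_dim, latent_dim))
    kernels = [rng.standard_normal(5) for _ in range(3)]  # hypothetical filters

    z = rng.standard_normal(latent_dim)
    signal = cgnn_generator(z, w_fc, kernels)  # length 16 * 2**3 = 128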
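Schematically, and under hypotheses detailed in the paper (not reproduced here), the stability estimates take the following general shape: writing G for the CGNN generator and F for the (possibly nonlinear) forward map of the inverse problem, for unknowns on the generated manifold,

    \| G(z_1) - G(z_2) \|_{L^2} \le C \, \| F(G(z_1)) - F(G(z_2)) \|,

so that an unknown lying in the range of G is Lipschitz-stably determined by the measurements, with a constant C depending on G and F.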

Keywords: Generative models; injective networks; inverse problems; multi-resolution analysis; neural networks; variational autoencoders; wavelets.

Grants and funding

This material is based upon work supported by the Air Force Office of Scientific Research under award number FA8655-20-1-7027. Co-funded by the European Union (ERC, SAMPDE, 101041040).