InvertibleNetworks.jl - Memory efficient deep learning in Julia

Title: InvertibleNetworks.jl - Memory efficient deep learning in Julia
Publication Type: Conference
Year of Publication: 2021
Authors: Philipp A. Witte, Mathias Louboutin, Ali Siahkoohi, Gabrio Rizzuti, Bas Peters, Felix J. Herrmann
Conference Name: JuliaCon
Keywords: deep learning, invertible networks, Julia, normalizing flows, segmentation

We present InvertibleNetworks.jl, an open-source package for invertible neural networks and normalizing flows with memory-efficient backpropagation. InvertibleNetworks.jl uses manually implemented gradients that exploit the invertibility of its building blocks, which allows it to scale to large problem sizes. We present the architecture and features of the library and demonstrate its application to a variety of problems, ranging from loop unrolling to uncertainty quantification.
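The memory saving rests on a simple property: if each building block is invertible, its input can be recomputed from its output during the backward pass instead of being stored. The sketch below is not the InvertibleNetworks.jl API; it is a minimal, self-contained illustration of an additive coupling layer (as in NICE/RealNVP-style flows), with a hypothetical stand-in for the learned residual function.

```julia
# Minimal sketch of an invertible additive coupling layer:
#   y1 = x1
#   y2 = x2 + t(x1)
# The inverse is exact, so intermediate states need not be stored
# for backpropagation; they can be recomputed from the outputs.

t(x) = 2 .* x .+ 1          # hypothetical stand-in for a learned network

function forward(x1, x2)
    y1 = x1
    y2 = x2 .+ t(x1)
    return y1, y2
end

function inverse(y1, y2)
    x1 = y1
    x2 = y2 .- t(y1)
    return x1, x2
end

x1, x2 = rand(4), rand(4)
y1, y2 = forward(x1, x2)
x1r, x2r = inverse(y1, y2)
# recomputed inputs match the originals up to floating-point error
@assert isapprox(x1r, x1) && isapprox(x2r, x2)
```

Because the inverse is closed-form, a chain of such layers can run the backward pass layer by layer, regenerating each layer's input on the fly; memory use stays roughly constant in network depth rather than growing linearly.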


(JuliaCon, virtual)

Citation Key: witte2021JULIACONmedlj