InvertibleNetworks.jl - Memory efficient deep learning in Julia

Title: InvertibleNetworks.jl - Memory efficient deep learning in Julia
Publication Type: Conference
Year of Publication: 2021
Authors: Philipp A. Witte, Mathias Louboutin, Ali Siahkoohi, Gabrio Rizzuti, Bas Peters, Felix J. Herrmann
Conference Name: JuliaCon
Month: 07
Keywords: deep learning, invertible network, Julia, normalizing flows, segmentation
Abstract

We present InvertibleNetworks.jl, an open-source package for invertible neural networks and normalizing flows with memory-efficient backpropagation. InvertibleNetworks.jl uses manually implemented gradients that exploit the invertibility of its building blocks, which allows it to scale to large problem sizes. We present the architecture and features of the library and demonstrate its application to a variety of problems, ranging from loop unrolling to uncertainty quantification.
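The memory saving described in the abstract comes from the fact that an invertible layer's input can be recomputed exactly from its output, so intermediate activations need not be stored during the forward pass. A minimal conceptual sketch of this idea, using an additive coupling layer (a standard invertible building block; this is illustrative Python with NumPy, not the package's Julia API):

```python
import numpy as np

def t(x):
    # Arbitrary, possibly non-invertible transform applied to one half
    # of the input; invertibility of the layer does not depend on it.
    return np.tanh(x)

def forward(x1, x2):
    # Additive coupling: y1 passes through, y2 is shifted by t(x1).
    y1 = x1
    y2 = x2 + t(x1)
    return y1, y2

def inverse(y1, y2):
    # The input is recovered exactly from the output, so (x1, x2)
    # need not be stored for backpropagation.
    x1 = y1
    x2 = y2 - t(y1)
    return x1, x2

x1, x2 = np.random.randn(4), np.random.randn(4)
y1, y2 = forward(x1, x2)
r1, r2 = inverse(y1, y2)
assert np.allclose(r1, x1) and np.allclose(r2, x2)
```

Chaining many such layers keeps memory use constant in network depth, since each layer's input is recomputed on the fly during the backward pass.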

Notes

(JuliaCon, virtual)

URL: https://slim.gatech.edu/Publications/Public/Conferences/JuliaCon/2021/witte2021JULIACONmedlj/witte2021JULIACONmedlj.pdf
Citation Key: witte2021JULIACONmedlj