ML@scale using randomized linear algebra

Title: ML@scale using randomized linear algebra
Publication Type: Conference
Year of Publication: 2021
Authors: Felix J. Herrmann, Mathias Louboutin, Ali Siahkoohi
Conference Name: Microsoft
Keywords: deep learning, randomized linear algebra, uncertainty quantification

Deep learning for large-scale applications such as video encoding or seismic segmentation is challenged by the excessive amount of memory required to train networks via backpropagation. In this talk, I will discuss how techniques from randomized linear algebra can be used to address these bottlenecks and bring down the memory footprint of training CNNs by up to a factor of O(N) (where N is the number of pixels) without increasing the computational cost. Additionally, I will illustrate how the seemingly disparate technologies of deep learning and large-scale PDE-constrained optimization share important similarities that can be taken advantage of in the development of next-generation deep learning technologies, with possible applications in scientific computing and sustainability.
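The abstract describes using randomized linear algebra to cut the memory footprint of backpropagation. As a minimal, hypothetical sketch of the general idea (the matrix names, sizes, and the Gaussian sketching matrix below are illustrative assumptions, not details from the talk): a gradient-like product of stored activations and back-propagated errors can be estimated from low-dimensional random sketches instead of the full matrices, trading a controllable amount of accuracy for memory.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions, not from the talk): n rows of stored
# state, sketch dimension k << n controls the memory/accuracy trade-off.
n, p, q, k = 4096, 16, 16, 512

A = rng.standard_normal((n, p))       # stand-in for stored forward activations
B = A @ rng.standard_normal((p, q))   # correlated stand-in for backprop errors

# Gaussian sketching matrix S with E[S.T @ S] = I_n
S = rng.standard_normal((k, n)) / np.sqrt(k)

# Keep only the k-row sketches: memory shrinks by roughly k/n
SA, SB = S @ A, S @ B

exact = A.T @ B      # what full-memory backpropagation would form
approx = SA.T @ SB   # unbiased randomized estimate, E[approx] = exact

rel_err = np.linalg.norm(approx - exact) / np.linalg.norm(exact)
print(f"relative error: {rel_err:.3f}")  # shrinks like 1/sqrt(k)
```

Here only the sketches `SA` and `SB` need to be kept in memory, at roughly k/n the storage of the originals, while the estimate remains unbiased because the Gaussian sketch satisfies E[SᵀS] = I.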


Talk at Microsoft

Citation Key: herrmann2021Microsoftrla