ML@scale using randomized linear algebra
Title | ML@scale using randomized linear algebra |
Publication Type | Conference |
Year of Publication | 2021 |
Authors | Felix J. Herrmann, Mathias Louboutin, Ali Siahkoohi |
Conference Name | Microsoft |
Month | 03 |
Keywords | deep learning, randomized linear algebra, uncertainty quantification |
Abstract | Deep learning for large-scale applications such as video encoding or seismic segmentation is challenged by the excessive amount of memory required to train networks via backpropagation. In this talk, I will discuss how techniques from randomized linear algebra can be used to address these bottlenecks and bring down the memory footprint of training CNNs by up to a factor of O(N) (where N is the number of pixels) without increasing computational cost. Additionally, I will illustrate how the seemingly disparate technologies of deep learning and large-scale PDE-constrained optimization share important similarities that can be taken advantage of in the development of next-generation deep learning technologies, with possible applications in scientific computing and sustainability. |
Notes | Talk at Microsoft |
Citation Key | herrmann2021Microsoftrla |
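The abstract does not spell out the mechanism, but a common way randomized linear algebra can cut backpropagation memory is to store a low-dimensional random sketch of a layer's activations instead of the activations themselves, and to form an unbiased gradient estimate from that sketch. The NumPy snippet below is a minimal sketch of that idea for a single linear layer; the dimensions, the Gaussian sketching matrix `P`, and all variable names are illustrative assumptions and not the specific method presented in the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

N, M, k = 4096, 64, 64                         # input dim, output dim, sketch size (k << N)
W = 0.01 * rng.standard_normal((M, N))         # layer weights
x = rng.standard_normal(N)                     # layer input, normally stored for backprop
P = rng.standard_normal((k, N)) / np.sqrt(k)   # random sketching matrix with E[P.T @ P] = I

# Forward pass: keep only the k-dimensional sketch of x instead of x itself,
# shrinking the per-layer state from O(N) to O(k) numbers.
y = W @ x
x_sketch = P @ x

# Backward pass: given the upstream gradient dy, build an unbiased estimate of
# dL/dW = outer(dy, x) from the sketch, since E[outer(dy, P.T @ x_sketch)] = outer(dy, x).
dy = rng.standard_normal(M)                    # stand-in for the upstream gradient
g_true = np.outer(dy, x)
g_hat = np.outer(dy, P.T @ x_sketch)

rel_err = np.linalg.norm(g_hat - g_true) / np.linalg.norm(g_true)
print(f"relative error of sketched gradient estimate: {rel_err:.3f}")
```

In this toy setup only the sketch (and the seed used to regenerate `P`) would need to be kept between the forward and backward passes, which is where the memory saving comes from; the price is extra variance in the gradient estimate, controlled by the sketch size k.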