CNI Seminar Series

Mathematical ML and ML for Math: Alternating GD and Minimization (AltGDmin) for Secure Federated Low Rank Matrix Learning for Real-time MRI and ML-enabled K-12 Math Support

Prof. Namrata Vaswani, Endowed Anderlik Professor, Iowa State University

#286
Abstract

This talk will consist of two parts. The first will describe my research on the AltGDmin algorithm for Byzantine-resilient distributed structured matrix learning. The second part will be on “ML for better K-12 Math”. Here I will talk about how we (STEM professionals) need to start thinking about fixing the early math skills of school students, particularly those without such support at home, in order to improve the likelihood of their choosing, and succeeding in, Com/SP/IT or any STEM field. I will briefly discuss the CyMath program that I direct at ISU, and how ML-enabled math learning apps such as ALEKS or Khan Academy can make this task easier for non-teachers to accomplish.

Modern distributed and federated learning systems are vulnerable to various kinds of adversarial attacks. Byzantine attacks are among the most difficult to deal with: they are model-update poisoning attacks (they poison the algorithm iterates of the attacked nodes), and the adversarial nodes are omniscient and can collude. We introduce provably Byzantine-resilient and communication-efficient algorithms for solving multiple federated low-rank (LR) matrix learning problems – LR Column-wise Sensing, LR Phase Retrieval, (Robust) LR Matrix Completion, and Robust PCA – all of which involve solving a partly-decoupled optimization problem, and all of which involve dealing with data heterogeneity across nodes. These problems find important applications in parameter-efficient fine-tuning of LLMs, recommender system design, multi-task representation learning for few-shot learning, federated sketching, accelerated dynamic MRI, and Fourier ptychography.

Alternating GD and minimization (AltGDmin), introduced in our recent work, is a novel, faster, and more communication-efficient alternative to Alternating Minimization (AltMin) for partly-decoupled optimization problems. These are problems in which the set of optimization variables can be split into two or more subsets such that the optimization with respect to at least one subset, keeping the others fixed, is decoupled. We also describe Byz-AltGDmin, a provably Byzantine-resilient modification. Finally, if time permits, we will show real-data experimental results on the advantage (speed and generality) of AltGDmin-based methods over the existing state of the art in dynamic MRI.
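To make the "partly-decoupled" structure concrete, below is a minimal single-node NumPy sketch of the AltGDmin idea on a toy LR column-wise sensing instance (one of the problems named in the abstract). It factors the unknown matrix as X = UB; with U fixed, the cost decouples over the columns of B (small independent least-squares solves), while U is updated by one gradient step plus QR re-orthonormalization. The problem sizes, step-size choice, and spectral-style initialization here are illustrative assumptions, and this sketch omits the federated and Byzantine-resilient aspects of the talk's algorithms.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy LR column-wise sensing: recover a rank-r matrix X* (n x q) from
# per-column sketches y_k = A_k @ x*_k. Sizes are illustrative.
n, q, r, m = 50, 30, 3, 100

Ustar = np.linalg.qr(rng.standard_normal((n, r)))[0]
Bstar = rng.standard_normal((r, q))
Xstar = Ustar @ Bstar                      # rank-r ground truth
A = rng.standard_normal((q, m, n))         # A[k] is the k-th sensing matrix
Y = np.einsum('kmn,nk->mk', A, Xstar)      # Y[:, k] = A[k] @ Xstar[:, k]

# Spectral-style initialization from the back-projected columns A_k^T y_k / m.
X0 = np.stack([A[k].T @ Y[:, k] for k in range(q)], axis=1) / m
U0, s, _ = np.linalg.svd(X0, full_matrices=False)
U = U0[:, :r]
eta = 0.5 / (m * s[0] ** 2)                # step size scaled by (m * sigma_1^2)

for _ in range(500):
    # "min" step: with U fixed, the cost decouples over the columns of B,
    # so each b_k is a small independent least-squares solve.
    B = np.stack(
        [np.linalg.lstsq(A[k] @ U, Y[:, k], rcond=None)[0] for k in range(q)],
        axis=1,
    )
    # "GD" step: one gradient step on U for sum_k ||A_k U b_k - y_k||^2,
    # followed by QR re-orthonormalization of U.
    grad = sum(
        np.outer(A[k].T @ (A[k] @ U @ B[:, k] - Y[:, k]), B[:, k])
        for k in range(q)
    )
    U = np.linalg.qr(U - eta * grad)[0]

rel_err = np.linalg.norm(U @ B - Xstar) / np.linalg.norm(Xstar)
```

The decoupled "min" step is what makes a federated version communication-efficient: each node can solve the least-squares problems for its own columns locally and only the small n x r gradient for U needs to be aggregated.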


Bio

Namrata Vaswani is a Professor of Electrical and Computer Engineering, and the Anderlik Professor of Engineering, at Iowa State University. She also holds a courtesy professorship in the Department of Mathematics. She received a Ph.D. in 2004 from the University of Maryland, College Park, and a B.Tech. from the Indian Institute of Technology, Delhi (IIT Delhi) in 1999. Her research is in statistical ML and signal processing, and in imaging (MRI and video analytics). Vaswani is also the director of the CyMath K-12 math tutoring and support program at Iowa State. She is a recipient of the IEEE Signal Processing Society (SPS) Best Paper Award (2014), the University of Maryland ECE Distinguished Alumni Award (2019), and the Iowa State Mid-Career Achievement in Research Award (2019). Vaswani is an AAAS Fellow (class of 2023) and an IEEE Fellow (class of 2019).