Theory Seminar

New techniques for convex optimization and sparsification

Arun Jambulapati, University of Michigan
WHERE:
3725 Beyster Building
Abstract: The computational challenges posed by recent massive datasets motivate the design of more efficient algorithms. My work takes problems motivated by modern machine learning and develops theoretical tools to answer two central questions: 1) how can we learn from data faster, and 2) how can we select representative data (to potentially speed up downstream processing)?
I will first present several results on faster convex optimization leveraging a new theoretical primitive called a “ball optimization oracle”. I will give near-optimal algorithms for minimizing convex functions assuming access to this oracle, and show how this framework can be applied to yield faster algorithms for important problems in computer science such as regression, differentially private optimization, and robust optimization.
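To give a rough feel for the primitive (this is a minimal sketch, not the talk's actual near-optimal algorithms, which rely on careful acceleration), the code below assumes a hypothetical ball_oracle that approximately minimizes a convex function over a small Euclidean ball around a query point, and an outer loop that simply re-centers the ball at each oracle output. All names, step sizes, and iteration counts here are illustrative assumptions.

```python
import numpy as np

def ball_oracle(grad, x, r, steps=200):
    # Hypothetical ball optimization oracle: approximately minimize f over
    # the Euclidean ball of radius r centered at x, here via projected
    # gradient descent. (The talk's methods treat this oracle far more
    # cleverly; this only illustrates the interface.)
    y = x.copy()
    lr = r / steps
    for _ in range(steps):
        y = y - lr * grad(y)
        d = y - x
        n = np.linalg.norm(d)
        if n > r:                     # project back onto the ball around x
            y = x + (r / n) * d
    return y

def minimize_with_ball_oracle(grad, x0, r=0.5, iters=50):
    # Naive outer loop: re-center the ball at the oracle's output each round.
    x = x0
    for _ in range(iters):
        x = ball_oracle(grad, x, r)
    return x

# Toy usage: minimize the convex least-squares objective f(x) = ||Ax - b||^2.
rng = np.random.default_rng(0)
A, b = rng.normal(size=(20, 5)), rng.normal(size=20)
grad = lambda x: 2 * A.T @ (A @ x - b)
x_hat = minimize_with_ball_oracle(grad, np.zeros(5))
```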
I will follow with results on problems motivated by dataset compression. Leveraging techniques from high-dimensional geometry, I give near-optimal constructions of sparsifiers for sums of norms and generalized linear models. These constructions directly imply new sparsifiers for hypergraphs and for sums of symmetric submodular functions, and yield faster algorithms for linear regression.
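To give a flavor of sparsification by importance sampling (a simpler classical relative of the constructions above, not the talk's method), the sketch below subsamples and reweights rows of a least-squares problem proportionally to their leverage scores, so that solving the much smaller problem approximates the original. The helper names and sample size are illustrative assumptions.

```python
import numpy as np

def leverage_scores(A):
    # Row leverage scores via a thin QR factorization: tau_i = ||Q_i||^2.
    Q, _ = np.linalg.qr(A)
    return np.sum(Q * Q, axis=1)

def sparsify_regression(A, b, m, rng):
    # Sample m rows with probability proportional to leverage, reweighting
    # each kept row by 1/sqrt(m * p_i) so ||Ax - b||^2 is preserved in
    # expectation.
    tau = leverage_scores(A)
    p = tau / tau.sum()
    idx = rng.choice(len(b), size=m, p=p)
    w = 1.0 / np.sqrt(m * p[idx])
    return w[:, None] * A[idx], w * b[idx]

# Toy usage: solve a downsampled least-squares problem in place of the original.
rng = np.random.default_rng(0)
A, b = rng.normal(size=(5000, 10)), rng.normal(size=5000)
A_s, b_s = sparsify_regression(A, b, m=200, rng=rng)
x_sketch, *_ = np.linalg.lstsq(A_s, b_s, rcond=None)
```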

Organizers

Greg Bodwin
Euiwoong Lee