#### Faculty Candidate Seminar

# Solving Linear Equations


Solving systems of linear equations is one of the most fundamental algorithmic problems. In the past fifteen years, theoretical computer scientists have made tremendous progress in developing provably correct, asymptotically fast algorithms for solving structured linear equations.

Linear equations of a type known as Laplacians are extremely useful for analyzing graphs and networks, and they have become a central algorithmic building block of modern fast graph algorithms. Laplacian linear equation solvers also have many applications in scientific computing and engineering. I will present my 'approximate Gaussian Elimination' algorithm, a very simple procedure for solving Laplacian linear equations. This algorithm may finally give us a practical, fast, and robust solver for these systems, making theoretical developments in the area relevant in practice. We will see some experimental evidence for this.
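To make the setting concrete, here is a minimal sketch (not the speaker's algorithm, just a plain dense solve with numpy) of what a Laplacian linear system looks like: the Laplacian of a small graph is built as L = D - A, and since a Laplacian is singular with the all-ones vector in its null space, Lx = b is solvable only when the entries of b sum to zero.

```python
import numpy as np

# Graph Laplacian L = D - A for a 4-cycle (0-1-2-3-0):
# D is the diagonal degree matrix, A the adjacency matrix.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
n = 4
L = np.zeros((n, n))
for u, v in edges:
    L[u, u] += 1
    L[v, v] += 1
    L[u, v] -= 1
    L[v, u] -= 1

# b must be orthogonal to the all-ones null space (entries sum to zero),
# e.g. a unit of flow injected at node 0 and extracted at node 2.
b = np.array([1.0, 0.0, -1.0, 0.0])

# Minimum-norm solution via the pseudoinverse; fast Laplacian solvers
# compute an approximation to this in near-linear time.
x = np.linalg.pinv(L) @ b

assert np.allclose(L @ x, b)
```

Fast Laplacian solvers replace the cubic-time dense solve above with near-linear-time procedures, which is what makes them usable as a building block inside graph algorithms.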

I will survey my research developing the asymptotically fastest known algorithms for solving linear equations in matrix families known as Laplacians, Connection Laplacians, Directed Laplacians, and 3D truss stiffness matrices. These linear equations are used in algorithms for numerous problems across optimization, engineering, statistics, and data science.

Next, I will explain limits on fast algorithms for solving structured linear equations. My work on complexity lower bounds for solving structured linear equations shows that several classes of linear equations that seem only slightly more general than Laplacians are in fact as hard to solve as arbitrary linear equations.

Finally, I will discuss the future of linear equation solvers, and how ideas from the area can be generalized beyond this setting to obtain new state-of-the-art algorithms for broad classes of problems in convex optimization.

Rasmus Kyng is currently a Postdoctoral Fellow in the Harvard Theory of Computation Group, mentored by Jelani Nelson. Before that, he was a Postdoctoral Research Fellow at the Simons Institute for the Theory of Computing at UC Berkeley. Rasmus obtained his PhD in Computer Science from Yale University under the supervision of Daniel Spielman. He has worked on algorithms for solving structured systems of linear equations, new approaches to convex optimization, learning and regression problems on graphs, random matrix theory, and fine-grained complexity lower bounds. His work won the FOCS 2017 Machtey Award for best student paper.