# Core sets and streaming algorithms for MAX CUT

Suresh Venkatasubramanian, University of Utah

Streaming algorithms for graph problems typically fall into the "semi-streaming" model, where the memory requirements are sublinear in the number of edges but not necessarily in the number of vertices. Many of these results follow via edge sampling and sparsification. For example, we can get an approximation of the cuts in a graph by sampling roughly n log n edges (n being the number of vertices). This immediately yields a semi-streaming algorithm for MAX CUT, and a series of lower bounds suggests that we can't do much better in general.
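To make the edge-sampling idea concrete, here is a minimal sketch (my illustration, not the talk's construction): keep each edge with probability p and reweight survivors by 1/p, so every cut's value is preserved in expectation. Real cut sparsifiers sample non-uniformly (e.g., by edge connectivity) to get concentration with only about n log n edges; uniform sampling is the simplest version of the idea.

```python
import random

def sample_edges(edges, p, seed=0):
    """Keep each edge independently with probability p, reweighting
    survivors by 1/p so cut values are preserved in expectation.
    (Uniform sampling is a simplification; actual cut sparsifiers
    sample edges non-uniformly, e.g. by connectivity.)"""
    rng = random.Random(seed)
    return [(u, v, 1.0 / p) for (u, v) in edges if rng.random() < p]

def cut_value(weighted_edges, side):
    """Total weight of edges crossing the cut defined by vertex set `side`."""
    return sum(w for (u, v, w) in weighted_edges
               if (u in side) != (v in side))
```

With p = 1 every edge survives with weight 1, so `cut_value` recovers exact cut sizes; smaller p trades accuracy for space.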

In the offline setting though, dense instances of MAX CUT can be solved using vertex sampling to find small "core sets" on which we can compute the max cut via brute force, using the resulting value to estimate the size of the max cut in the full graph. Can such ideas be used to get truly sublinear space algorithms for MAX CUT?
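The core-set idea can be sketched as follows (a hypothetical illustration, not the talk's construction): sample a small vertex set, solve MAX CUT exactly on the induced subgraph, and rescale. The uniform sampling and the (k/n)^2 rescaling shown here are the classic dense-graph heuristic; the talk's procedure uses a weighted sampling with a sharper analysis.

```python
import random

def brute_force_max_cut(vertices, edges):
    """Exact MAX CUT by enumerating all 2^|V| bipartitions;
    feasible only when the vertex set is small (a core set)."""
    verts = list(vertices)
    best = 0
    for mask in range(1 << len(verts)):
        side = {v for i, v in enumerate(verts) if (mask >> i) & 1}
        best = max(best, sum(1 for (u, v) in edges
                             if (u in side) != (v in side)))
    return best

def core_set_estimate(n, edges, k, seed=0):
    """Sample k of the n vertices uniformly, brute-force the max cut of
    the induced subgraph, and rescale: each edge survives the sampling
    with probability about (k/n)^2.  (Hypothetical uniform-sampling
    version; the talk uses weighted vertex sampling.)"""
    rng = random.Random(seed)
    S = set(rng.sample(range(n), k))
    induced = [(u, v) for (u, v) in edges if u in S and v in S]
    return brute_force_max_cut(S, induced) * (n / k) ** 2
```

On dense graphs the induced subgraph retains enough edges for the rescaled estimate to concentrate; on sparse graphs most sampled pairs have no edge, which is exactly the obstacle the average-degree condition below addresses.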

In this talk, I'll describe a procedure that, given a graph of average degree $\Delta = n^\delta, \delta > 0$ (i.e., slightly non-sparse), produces a core set of size $O(n^{1-\delta})$ such that the max cut of this core set is a close approximation of the optimal max cut. I'll then show how to use this core set as the basis for a 2-pass streaming algorithm that yields a $(1-\epsilon)$-approximation for MAX CUT in sublinear space. Our main idea is to do a particular weighted vertex sampling and then analyze it by carefully adapting offline tools based on a particular primal-dual LP framework.
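A schematic of the two-pass pattern (my sketch of the generic structure, not the talk's exact algorithm): pass 1 over the edge stream gathers degree information to define sampling weights, a vertex sample is drawn proportional to degree, and pass 2 retains only the edges induced on the sample. Note that storing all n degree counts, as done here, already uses linear space; the actual algorithm is more careful in order to stay sublinear.

```python
import random
from collections import Counter

def two_pass_core_set(edge_stream_pass1, edge_stream_pass2, k, seed=0):
    """Schematic two-pass streaming pattern (illustrative only).
    Pass 1: count degrees to define sampling weights.
    Between passes: draw k vertices with probability proportional to degree.
    Pass 2: keep only edges with both endpoints in the sample."""
    rng = random.Random(seed)
    deg = Counter()
    for (u, v) in edge_stream_pass1:      # pass 1: degree counts
        deg[u] += 1
        deg[v] += 1
    verts = list(deg)
    weights = [deg[v] for v in verts]
    S = set(rng.choices(verts, weights=weights, k=k))
    induced = [(u, v) for (u, v) in edge_stream_pass2
               if u in S and v in S]      # pass 2: induced edges
    return S, induced
```

The returned induced subgraph has O(k^2) edges, so brute-forcing (or otherwise approximating) its max cut and rescaling fits in small space once k is sublinear.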