Theory Seminar

Probabilistic Transforms in the Analysis of Algorithms

Kevin Compton

Some algorithms, such as quicksort and the simplex algorithm, have bad worst-case behavior but perform very well "on the average." To understand the behavior of these algorithms, we resort to an average-case analysis. This approach, however, is constrained by two difficulties: hard-to-evaluate mathematical expressions and unrealistic probabilistic models. Researchers have made some headway on the first problem with a method called Poissonization. The idea is that in certain cases, dependent random variables on finite probability spaces can be approximated by independent, Poisson-distributed random variables, and the closeness of the approximation can be gauged to obtain very good average-case analyses. We will show that the idea of Poissonization can be extended to other probability distributions. This allows us to address the second problem by showing that some algorithms exhibit very good average-case behavior for a wide variety of probability distributions, not just those that are Poisson-approximable. This explains why these algorithms are stable across many different applications.
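As an illustration of the Poissonization idea (a standard example, not taken from the talk), consider the classical balls-in-bins model: throwing n balls uniformly into m bins gives dependent multinomial counts, since the counts must sum to n. The Poissonized model replaces them with independent Poisson(n/m) counts, which are much easier to analyze yet closely approximate the original per-bin behavior:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, trials = 1000, 10, 20000

# Exact model: n balls into m equally likely bins.
# The m counts are dependent (they always sum to n).
multinomial_counts = rng.multinomial(n, [1.0 / m] * m, size=trials)

# Poissonized model: each bin count is an independent Poisson(n/m) variable.
poisson_counts = rng.poisson(n / m, size=(trials, m))

# Per-bin distributions nearly coincide: the mean count in any one bin
# is n/m = 100 in both models.
print(multinomial_counts[:, 0].mean())
print(poisson_counts[:, 0].mean())
```

In the Poissonized model the bin counts can be analyzed separately, and (as the abstract notes) the error introduced by dropping the dependence can be bounded, giving sharp average-case estimates for the original model.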

Sponsored by