Mathematical Optimization Society

MOS Prizes


2012 Fulkerson Prize Citation


Sanjeev Arora, Satish Rao, and Umesh Vazirani,
"Expander flows, geometric embeddings and graph partitioning", J. ACM, 56 (2009), 1-37.

In an n-vertex graph, how does one find an edge cut of approximately minimum size such that the ratio of the numbers of vertices on the two sides is bounded? The Leighton-Rao algorithm from 1999 finds a cut within a factor O(log n) of optimal; the prize paper improves this to O(sqrt(log n)).
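To fix notation, here is a standard way of stating the underlying optimization problem (the normalization below is one common convention; the paper works with closely related balanced-separator and expansion variants). For a graph G = (V, E) with |V| = n, the sparsity of a cut S is

\[
  \phi(S) \;=\; \frac{|E(S,\, V \setminus S)|}{|S| \cdot |V \setminus S|},
  \qquad
  \phi(G) \;=\; \min_{\emptyset \neq S \subsetneq V} \phi(S).
\]

Leighton-Rao produce a cut S with \phi(S) = O(\log n) \cdot \phi(G); Arora, Rao, and Vazirani improve this to \phi(S) = O(\sqrt{\log n}) \cdot \phi(G).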

The proof techniques build on a series of major developments in approximation algorithms, melding two different approaches to graph partitioning: a spectral method based on eigenvalues, and an approach based on linear programming and metric embeddings in high-dimensional spaces. The key idea in both is to spread the vertices in some space while not stretching edges much; partitioning this space then gives a partition of the graph.
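As a sketch of the semidefinite program in question (this is the standard relaxation for the c-balanced separator problem in the style of the paper; the constant 4c(1-c) and the exact normalization are illustrative):

\[
\begin{aligned}
  \min\; & \sum_{\{i,j\} \in E} \|v_i - v_j\|^2 \\
  \text{s.t.}\; & \|v_i\|^2 = 1 && \text{for all } i \in V, \\
  & \|v_i - v_j\|^2 + \|v_j - v_k\|^2 \ge \|v_i - v_k\|^2 && \text{for all } i, j, k \in V, \\
  & \textstyle\sum_{i < j} \|v_i - v_j\|^2 \ge 4c(1-c)\, n^2.
\end{aligned}
\]

The triangle-inequality constraints force the squared distances to form a metric of negative type; the last constraint forces the vertex vectors to be spread out, mirroring the balance requirement on the cut.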

The performance guarantee for their simple and elegant algorithm rests on the following geometric fact about the configuration of points produced by the semidefinite program: there must exist two well-separated sets, each containing a constant fraction of the points. This fact is established by a delicate and sophisticated global analysis that makes essential use of measure concentration, a significant departure from the local analyses of SDPs in approximation algorithms up to that point.
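In symbols, the geometric fact (the paper's structure theorem, paraphrased here with unspecified constants) reads: if unit vectors v_1, ..., v_n satisfy the triangle-inequality and spreading constraints above, then

\[
  \exists\, S, T \subseteq \{1, \dots, n\}, \quad |S|, |T| \ge \Omega(n), \quad
  \|v_i - v_j\|^2 \ge \Omega\!\left(\frac{1}{\sqrt{\log n}}\right)
  \;\; \text{for all } i \in S,\; j \in T.
\]

Separating the graph along level sets of the distance to S then loses only an O(\sqrt{\log n}) factor.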


Anders Johansson, Jeff Kahn, and Van Vu,
"Factors in random graphs", Random Structures and Algorithms 33 (2008), 1-28.

Consider a random k-uniform hypergraph H with n vertices and edge-probability p. Assume that k divides n. What value of p guarantees that, with high probability, H contains a perfect matching?

This was solved for k = 2 by Erdős and Rényi in 1966, but the general question remained a central open problem, known as "Shamir's problem". The answer, as anticipated, is that the threshold occurs when the expected number of hyperedges is of order n log n (for fixed k and large n); nothing even close had been proved before this paper. In addition, the result identifies the threshold for admitting a partition into vertex-disjoint copies of any given "balanced" hypergraph.
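In symbols (a standard statement of the matching case; the paper's theorem covers general balanced hypergraphs): writing H^{(k)}(n, p) for the random k-uniform hypergraph with k dividing n, there is a constant C = C(k) such that

\[
  p \;\ge\; C\, \frac{\log n}{n^{k-1}}
  \quad \Longrightarrow \quad
  \Pr\big[\, H^{(k)}(n, p) \text{ contains a perfect matching} \,\big] \;\to\; 1,
\]

and this is sharp up to the constant, since for substantially smaller p the hypergraph typically has isolated vertices. At this threshold the expected number of edges, \binom{n}{k} p, is of order n \log n for fixed k.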

The method of proof is novel. The authors consider a hypergraph process in which edges are removed one by one from the complete hypergraph, and they estimate the effect of each removal on the number of factors of the hypergraph. Standard concentration methods fail in this situation, so the paper develops new techniques that rely heavily on inequalities derived from entropy.
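As one illustration of the kind of entropy inequality involved, here is Shearer's lemma, stated as a representative example rather than as the paper's specific tool: if \mathcal{F} is a family of subsets of \{1, \dots, n\} in which every index appears at least t times, then for any random vector X = (X_1, \dots, X_n),

\[
  t \cdot H(X_1, \dots, X_n) \;\le\; \sum_{F \in \mathcal{F}} H\big( (X_i)_{i \in F} \big),
\]

where H denotes Shannon entropy. Inequalities of this type turn counting problems, such as counting the factors that survive each edge removal, into entropy estimates.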


László Lovász and Balázs Szegedy,
"Limits of dense graph sequences", Journal of Combinatorial Theory Series B 96 (2006), 933-957.

The paper shows that, for a natural metric, graph sequences have a limit object in the space of symmetric measurable functions from the unit square to the unit interval. This yields a natural framework for studying extremal and probabilistic properties of graphs and graph sequences. For instance, the compactness of the latter space (modulo the group of measure-preserving bijections) implies Szemerédi's regularity lemma, a fundamental new insight.
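Concretely, in the now-standard notation of the graph-limit literature: a limit object (a "graphon") is a symmetric measurable function W : [0,1]^2 \to [0,1], and for a finite simple graph F its homomorphism density in W is

\[
  t(F, W) \;=\; \int_{[0,1]^{V(F)}} \prod_{\{i,j\} \in E(F)} W(x_i, x_j) \prod_{i \in V(F)} dx_i .
\]

A graph sequence (G_n) converges when t(F, G_n) converges for every F, and the theorem supplies a graphon W with t(F, G_n) \to t(F, W); the natural metric referred to above is the cut distance \delta_\Box on graphons.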

The paper develops highly original and nontrivial new methods connecting extremal combinatorics and measure theory, and provides important topological and measure-theoretic tools for studying large networks. It has led to a new field in probabilistic and extremal combinatorics, yielding results such as the long-sought hypergraph regularity lemma and tools for uncovering the relations between the numbers of copies of different subgraphs in a graph. It has also triggered new developments in mathematical physics (energy minimization, reflection positivity) and in algebraic combinatorics (representation theory, invariant theory, and algebraic geometry).