Monday, March 28th, 2016
There has been great progress in classifying the approximation complexity of anti-ferromagnetic 2-spin systems. We will first review these results and then discuss some new results on ferromagnetic 2-spin systems. In particular, we conjecture a complexity transition threshold that is almost tight for ferromagnetic systems. We confirm the conjecture for some range of the parameters and provide evidence for the rest. In addition, we will present some hardness results when going beyond non-negative weights to complex weights.
This talk includes joint work with Pinyan Lu and Leslie Goldberg.
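As a concrete reference point (an illustration, not part of the talk), a 2-spin system assigns one of two spins to each vertex; an edge contributes weight beta when both endpoints have spin 0, gamma when both have spin 1, and 1 otherwise, while an external field lam weights each spin-1 vertex. The system is ferromagnetic when beta * gamma > 1. The function and parameter names below are hypothetical; this is a minimal brute-force sketch of the partition function:

```python
from itertools import product

def partition_function(n, edges, beta, gamma, lam):
    """Brute-force partition function of a 2-spin system on n vertices.

    An edge contributes beta if both endpoints are spin 0, gamma if both
    are spin 1, and 1 otherwise; each spin-1 vertex contributes lam.
    Ferromagnetic means beta * gamma > 1.
    """
    total = 0.0
    for sigma in product([0, 1], repeat=n):
        w = lam ** sum(sigma)
        for u, v in edges:
            if sigma[u] == 0 and sigma[v] == 0:
                w *= beta
            elif sigma[u] == 1 and sigma[v] == 1:
                w *= gamma
        total += w
    return total

# Ferromagnetic Ising (beta = gamma = 2, no field) on a triangle
edges = [(0, 1), (1, 2), (0, 2)]
print(partition_function(3, edges, beta=2.0, gamma=2.0, lam=1.0))  # → 28.0
```

The brute force takes 2^n time, of course; the talk concerns when this sum admits an efficient approximation.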
In this light-hearted talk, we will adopt the unconventional viewpoint that a "polynomial time" vs. "probably not polynomial time" dichotomy for a counting problem may not be a stopping point for further research on the same problem. We will survey different ways to classify the problem further by providing faster exponential-time algorithms, or by proving hardness results under hypotheses such as the exponential time hypothesis.
Computability theory, as developed most notably by Turing, is a successful mathematical theory of the limits of what can be computed. Perhaps remarkably, it does not significantly involve quantities. Theories similarly light on quantitative analysis have also been provided for important issues in the complexity of computation, such as NP-completeness. However, there remain wide areas of computation, such as the evaluation of probabilities, multivariate integration, and the counting of discrete structures, where the objects to be manipulated are inherently quantitative. In this talk we shall focus on this area, in which computational complexity becomes inseparable from quantitative concerns. We highlight counting problems, in which one seeks to compute the number of discrete objects of a certain kind. The number may be exponentially large compared to the size of the objects allowed, but can be written down in a polynomial number of digits. We shall review recent progress that makes it possible to classify a wide range of such counting problems, both exact and approximate, according to their computational difficulty. This classification is sometimes offered in the strong sense of a dichotomy theorem, which classifies according to difficulty not just a single problem but a whole class. Proofs of such theorems exploit a variety of techniques, including holographic transformations.
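To illustrate the remark about sizes (an aside, not part of the abstract): the number of perfect matchings of the complete graph K_{2n} is (2n-1)!! = (2n)! / (2^n n!), which grows exponentially in n yet takes only O(n log n) digits to write down. A quick sketch (the function name is for illustration only):

```python
import math

def perfect_matchings_complete_graph(n2):
    """Number of perfect matchings of the complete graph K_{2n},
    i.e. (2n-1)!! = (2n)! / (2^n * n!)."""
    assert n2 % 2 == 0
    n = n2 // 2
    return math.factorial(2 * n) // (2 ** n * math.factorial(n))

for n2 in (10, 100, 1000):
    m = perfect_matchings_complete_graph(n2)
    # the count is astronomically large, but its digit count grows slowly
    print(n2, len(str(m)))
```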
Light refreshments will be served before the lecture at 3:30 p.m.
Tuesday, March 29th, 2016
The theory of holographic algorithms introduced by Valiant represents a novel approach to achieving polynomial-time algorithms for seemingly intractable counting problems via a reduction to counting planar perfect matchings and a linear change of basis. Two fundamental parameters in holographic algorithms are the domain size and the basis size. Roughly, the domain size is the range of colors involved in the counting problem at hand (e.g. counting graph k-colorings is a problem over domain size k), while the basis size $\ell$ captures the dimensionality of the representation of those colors. A major open problem has been: for a given k, what is the smallest $\ell$ for which any holographic algorithm for a problem over domain size k "collapses to" (can be simulated by) a holographic algorithm with basis size $\ell$? Cai and Lu showed in 2008 that over domain size 2, basis size 1 suffices, opening the door to an extensive line of work on the structural theory of holographic algorithms over the Boolean domain. Cai and Fu later showed for signatures of full rank that over domain sizes 3 and 4, basis sizes 1 and 2, respectively, suffice, and they conjectured that over domain size k there is a collapse to basis size $\lfloor\log_2 k\rfloor$. In this talk, we give an algebraic proof resolving this conjecture in the affirmative for signatures of full rank for all k.
We consider the importance of the choice of metric in path coupling, and its relationship to the technique of path coupling with stopping times. We provide strong evidence that the stopping time method is no more powerful than standard path coupling using a carefully chosen metric. However, the stopping time approach does give insight into the design of good metrics for specific problems. We give illustrative applications to sampling independent sets in hypergraphs and to sampling colourings of hypergraphs and bipartite graphs.
Joint work with Magnus Bordewich and Marek Karpinski.
No abstract available.
Wednesday, March 30th, 2016
H-colourings of a graph G (a.k.a. homomorphisms from G to H) generalise familiar proper vertex colourings of G. More than 15 years ago, Dyer and Greenhill presented a dichotomy for exactly counting H-colourings; for approximate counting, even now, only a partial complexity classification is known. A familiar theme in the study of CSPs is that the “conservative” case is often easier to resolve. This is one reason to look at list H-colourings. The decision problem for list H-colourings was studied by Feder, Hell and Huang, who proved a dichotomy result. In this talk, I’ll present a classification for the problem of approximately counting list H-colourings of a graph, and briefly consider weighted variants.
This is joint work with Andreas Galanis and Leslie Ann Goldberg (Oxford), and others.
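For readers new to the terminology (an illustration, not taken from the talk): an H-colouring of G is exactly a graph homomorphism from G to H, and when H is the complete graph K_k it specialises to a proper k-colouring. A minimal brute-force counter, with hypothetical names, makes the definition concrete:

```python
from itertools import product

def count_h_colourings(g_vertices, g_edges, h_adj):
    """Count homomorphisms from G to H by brute force: assignments
    f : V(G) -> V(H) that map every edge of G to an edge of H.
    h_adj is the adjacency matrix of H (loops allowed in general)."""
    h_vertices = range(len(h_adj))
    count = 0
    for f in product(h_vertices, repeat=len(g_vertices)):
        if all(h_adj[f[u]][f[v]] for u, v in g_edges):
            count += 1
    return count

# When H = K_3 (triangle, no loops), H-colourings are proper 3-colourings.
k3 = [[0, 1, 1], [1, 0, 1], [1, 1, 0]]
path = [(0, 1), (1, 2)]  # path on 3 vertices
print(count_h_colourings(range(3), path, k3))  # → 12
```

The list variant discussed in the talk additionally restricts each vertex of G to a prescribed subset of V(H); the brute force adapts by filtering the assignments, but of course runs in exponential time.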
Although a general dichotomy for exact counting CSPs has been proved, the complexity of approximation remains wide open. I will focus on the program of classifying the approximability of bounded-degree Boolean #CSP. Dyer, Goldberg, Jalsenius and Richerby gave a beautiful full classification when the maximum degree is six or above, in which no non-trivial approximation result appears. However, for smaller degrees (maximum degree five or below), interesting tractable (in terms of approximation) cases do appear, and a complete classification is far from known. In this talk, I will summarize the known results in this range and propose a number of interesting open questions.
Thursday, March 31st, 2016
The complexity landscape of approximately computing partition functions appears to be quite complicated, which makes classifying such problems according to their approximation complexity very difficult. We try to separate counting problems with respect to gadget reductions; this is equivalent to the study of clones related to the problem, that is, sets of functions or relations closed under gadgets. We survey the main approaches to such a classification, along with some new and existing results.
Evaluation of the chromatic polynomial is easy at finitely many points and #P-hard everywhere else. We call this the difficult point property ($DPP$). Let $F$ be a graph property and $k$ a positive integer. A function $f: V(G) \rightarrow [k]$ is an $F$-coloring if for every $i \in [k]$ the set $f^{-1}(i)$ induces a graph in $F$. The author and Boris Zilber showed in 2006 that the number of $F$-colorings with $k$ colors is a polynomial $P_F(G;k)$ in $k$. We show infinitely many examples of properties $F$ for which $DPP$ holds for $P_F(G;k)$, and formulate several conjectures, some concerning also multivariate graph polynomials.
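As background (an aside, not from the abstract): the chromatic polynomial $P(G;k)$ counts proper $k$-colorings and satisfies the deletion-contraction recurrence $P(G;k) = P(G-e;k) - P(G/e;k)$. A minimal sketch evaluating it at a point, with names chosen for illustration (and exponential running time, consistent with the #P-hardness at general points):

```python
def chromatic_polynomial_value(vertices, edges, k):
    """Evaluate the chromatic polynomial P(G; k) by deletion-contraction:
    P(G; k) = P(G - e; k) - P(G / e; k)."""
    if not edges:
        return k ** len(vertices)  # every map is proper on an edgeless graph
    u, v = edges[0]
    rest = edges[1:]
    deleted = chromatic_polynomial_value(vertices, rest, k)
    # Contract e: merge v into u, dropping loops and parallel edges
    merged = set()
    for a, b in rest:
        a = u if a == v else a
        b = u if b == v else b
        if a != b:
            merged.add((min(a, b), max(a, b)))
    contracted_vertices = [w for w in vertices if w != v]
    return deleted - chromatic_polynomial_value(contracted_vertices, list(merged), k)

# Triangle: P(K_3; k) = k(k-1)(k-2)
triangle = [(0, 1), (1, 2), (0, 2)]
print([chromatic_polynomial_value([0, 1, 2], triangle, k) for k in range(5)])
# → [0, 0, 0, 6, 24]
```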
Many computational complexity problems, including constraint satisfaction and weighted counting constraint satisfaction problems, graph isomorphism, and others, can be expressed in a unified way as word problems in certain kinds of monoidal categories over finite tensor signatures. Using categories axiomitizes notions such as planarity and avoids long combinatorial descriptions, as well as characterizing the difference between problems. It allows the tools of category theory to be brought to bear on complexity problems and allows reductions to be expressed as functors. This approach also generates many new open problems as we vary the kind of monoidal category, the representation functor, and the tensor signature.
Systems of polynomial equations can be used to compactly and elegantly model combinatorial problems in such a way that if a given combinatorial property is not satisfied, then the system of polynomial equations has no solution. If a system of polynomial equations has no solution, there exists a very specific algebraic proof of infeasibility via the famous Hilbert's Nullstellensatz. In this talk we compare and contrast the complexity of Hilbert's Nullstellensatz certificates when the underlying combinatorial problem is NP-complete (such as Partition), as compared to problems known to be in P (such as 2-colorability or Matching).
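For a toy example of such a certificate (an illustration, not from the talk): encode 2-colorability of the triangle by $x_v^2 - 1 = 0$ for each vertex (the two colors are $\pm 1$) and $x_u + x_v = 0$ for each edge (endpoints differ). The triangle is not 2-colorable, and a Nullstellensatz certificate writes 1 as a polynomial combination of the constraints, so no common zero can exist. The sketch below states one such certificate and checks the polynomial identity probabilistically at random points (Schwartz-Zippel style); names are hypothetical:

```python
import random

def certificate_lhs(x0, x1, x2):
    """A Nullstellensatz certificate that the triangle is not 2-colorable,
    for the encoding {x_v^2 - 1 = 0} and {x_u + x_v = 0 on edges}:

        1 = (x0/2)(x0 + x1) - (x0/2)(x1 + x2) + (x0/2)(x0 + x2) - (x0^2 - 1)

    The right-hand side is a combination of the constraint polynomials,
    so the constraints can have no common zero."""
    return (x0 / 2) * (x0 + x1) \
         - (x0 / 2) * (x1 + x2) \
         + (x0 / 2) * (x0 + x2) \
         - (x0 ** 2 - 1)

# The certificate should evaluate to 1 identically; spot-check it.
for _ in range(100):
    pt = [random.uniform(-10, 10) for _ in range(3)]
    assert abs(certificate_lhs(*pt) - 1.0) < 1e-6
print("certificate verified: identically 1")
```

The talk's theme, roughly, is how large such certificates must be (e.g. the degrees of the multipliers) depending on whether the underlying problem is NP-complete or in P.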
Friday, April 1st, 2016
We survey recent results on the power of LP relaxations for optimisation problems known as valued constraint satisfaction problems (VCSPs), which might be of interest to the counting community.
We consider counting CSPs and Holant problems defined by general asymmetric constraints of unbounded arity. We define a notion of regularity for the signatures (constraint functions) and give a fixed-parameter tractable (FPT) algorithm for Holant problems in terms of two parameters: the treewidth of the underlying graph and the regularity of the signatures. We show that the Holant problem defined by r-regular signatures on a graph of treewidth k can be computed in time r^O(k) poly(n). This covers the known FPT algorithms in terms of treewidth for spin systems, counting graph homomorphisms, counting CSPs with bounded-arity constraints, Holant problems and tensor networks on bounded-degree graphs, and counting matchings and perfect matchings on general graphs.
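To convey the flavor of treewidth-based counting (this is a textbook illustration, not the algorithm of the talk): on a path, a graph of treewidth 1, independent sets can be counted by dynamic programming whose state records only whether the current vertex is included, giving O(n) time instead of 2^n. Function names are for illustration:

```python
from itertools import product

def count_independent_sets_path(n):
    """Count independent sets in the path on n vertices by DP along the
    path (treewidth 1): state = whether the current vertex is included."""
    out_count, in_count = 1, 1  # one vertex: excluded / included
    for _ in range(n - 1):
        # next vertex excluded: previous vertex may be anything;
        # next vertex included: previous vertex must be excluded
        out_count, in_count = out_count + in_count, out_count
    return out_count + in_count

def count_independent_sets_brute(n):
    """2^n-time check against the DP, for small n."""
    return sum(
        1 for s in product([0, 1], repeat=n)
        if not any(s[i] and s[i + 1] for i in range(n - 1))
    )

for n in range(1, 10):
    assert count_independent_sets_path(n) == count_independent_sets_brute(n)
print(count_independent_sets_path(50))  # → 32951280099 (the Fibonacci number F(52))
```

The algorithm of the talk generalizes this idea: a tree decomposition of width k yields a DP whose tables, for r-regular signatures, have size r^O(k).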