Archive for September, 2010

Efficiency in the news

Rasmus Resen Amossen’s PhD project is featured in an article on the front page of the engineers’ magazine Version2. The (slightly exaggerated) headline reads: “Danish PhD student improves performance of database searches by a factor of 10”.

Report from ESA 2010

I’m at the European Symposium on Algorithms (ESA 2010), which together with its affiliated workshops has drawn around 220 participants. More than 25% are from outside Europe, so this is truly an international conference. The number of participants is somewhat higher than last year, mostly because of the WABI workshop.

Highlights of the business meeting:

  • 245 submissions to ESA, 27% accepted.
  • Acceptance probability was strongly correlated with the number of words in the title: 2 and 16 words fared best (100%), and 8–9 words fared worst (<10%). I am proud to say that our paper was accepted, even though we made the mistake of having an 8-word title!
  • Italy had the highest acceptance rate, 54%.
  • 12 papers were rejected for not adhering to the submission guidelines! The steering committee (of which I am a member) decided that the strict enforcement should continue, but that perhaps more should be done to inform authors about this in advance.
  • ESA 2011 will be in Saarbrücken (September 5–9), and ESA 2012 will be in Ljubljana, Slovenia. The event looks to be growing, with more workshops joining.
  • Mark de Berg replaces Lars Arge as chair of the ESA steering committee.

Besides attending my own talk, I have seen a number of nice results presented. For example, Justin Thaler talked about ways of certifying computations so that they can be checked by a weak (streaming-algorithm) verifier, with possible applications in cloud computing. Hannah Bast talked about her work that now allows Google to present routes involving public transportation in milliseconds. Eran Halperin gave a nice invited talk about finding genetic causes of human diseases. Paolo Ferragina is giving an interesting invited talk tomorrow that should involve a model for the energy consumption of memory access! The best student paper award goes (tomorrow) to a paper authored by Christian Wulff-Nilsen from the University of Copenhagen. Congratulations! After that, I will be heading back home.

RANDOM + APPROX 2010

It is raining today, as usual. I happen to be in Barcelona, Catalunya, because I am attending RANDOM + APPROX 2010 (the 13th Intl. Workshop on Approximation Algorithms for Combinatorial Optimization Problems, APPROX 2010, and the 14th Intl. Workshop on Randomization and Computation, RANDOM 2010, for the record).
Yesterday the first invited lecture was given by Leslie A. Goldberg, titled “Approximating the Tutte polynomial of a graph”.
The talk was very nice and covered many approximation results concerning the famous polynomial. Something quite interesting: apart from some special cases in which the Tutte polynomial admits an FPRAS, in many cases, such as binary matroids, the hardness of the problem is established in a way that is essentially the same as that used in game theory. In particular, just as finding a Nash equilibrium lies in PPAD and is moreover complete for that class, evaluating the Tutte polynomial is (in some cases) hard for #RHΠ₁ (unless NP = RP) with respect to approximation-preserving (AP) polynomial-time reductions. #RHΠ₁ contains the problems that can be expressed as counting the models of a logical formula from a certain syntactically restricted class, also known as “restricted Krom SNP”. Like PPAD, this class descends from a semantic class of problems (a class whose defining property must hold over all possible inputs). Such semantic classes are unlikely to admit complete problems, so syntactic restrictions are applied in order to obtain hardness results.
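
For readers who have not met it before (this definition was not part of the talk summary above, but it is the standard one): the Tutte polynomial of a graph G = (V, E) can be written as a sum over edge subsets, using the rank function r(A) = |V| − c(A), where c(A) is the number of connected components of the subgraph (V, A):

```latex
T_G(x, y) \;=\; \sum_{A \subseteq E} (x-1)^{\,r(E) - r(A)} \,(y-1)^{\,|A| - r(A)}
```

Its evaluations at special points encode quantities such as the number of spanning trees, forests, and proper colorings, which is part of why approximating it at general points is so hard.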

Today Oded Goldreich gave a very vivid talk about pseudorandom generators and the problem of deciding whether or not P = BPP.
The talk was really, really nice: Oded argued strongly for where a proof of the equality (or inequality) is most likely to be found. Essentially, he thinks that the problem will be tackled not via lower bounds but via some sort of diagonalization. He also presented a recent result of his stating that if BPP = P, then a suitable pseudorandom generator must exist.
There are a bunch of interesting consequences associated with this result, but I need to read the paper before writing anything about it!

It was smart of the organizers to schedule the invited lectures in the second half of the morning, so that even the lazier among us stand a chance of getting something out of them without falling tremendously asleep (if we attend the talks at all…).
Tomorrow is the last day; then I will be flying over the Pyrenees, passing by the Alps, and away from the mare nostrum again.