Archive for August, 2010

PageRank for Movies

Over the past two weeks I played around with using the PageRank algorithm (which Google uses to rank web pages) to rank movies. It’s pure fun, and primarily meant to illustrate how PageRank works, but I quite like the results.
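For the curious, here is a minimal sketch in Python of the power iteration at the heart of PageRank. The toy movie graph and its "links" are made up purely for illustration; the post does not say how its movie graph is built.

```python
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each node to the list of nodes it links to."""
    nodes = list(links)
    n = len(nodes)
    rank = {v: 1.0 / n for v in nodes}
    for _ in range(iterations):
        new = {v: (1.0 - damping) / n for v in nodes}
        for v, outs in links.items():
            if outs:  # spread v's rank evenly over its out-links
                share = damping * rank[v] / len(outs)
                for w in outs:
                    new[w] += share
            else:     # dangling node: spread its rank over everyone
                for w in nodes:
                    new[w] += damping * rank[v] / n
        rank = new
    return rank

# Hypothetical links between movies, e.g. via shared cast or "see also" lists.
movies = {
    "Alien": ["Blade Runner"],
    "Blade Runner": ["Alien", "Brazil"],
    "Brazil": ["Alien"],
}
for title, score in sorted(pagerank(movies).items(), key=lambda kv: -kv[1]):
    print(f"{score:.3f}  {title}")
```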

New Q&A site for theoretical computer science

A new Q&A site for theoretical computer science has opened its doors to the public. I encourage all theoretically minded (or curious) readers of this blog to support it. (And yes, the link will perhaps give it a little more Page Rank.)

IPEC accepted papers

The list of accepted papers for IPEC 2010 is out. IPEC is the specialist conference for parameterized and exact computation. This is the field that most of my current work is in; it is young but very active, and IPEC 2010 is only the 5th such conference. The meeting has now shed the “workshop” moniker, having been known as IWPEC as recently as last year, when we hosted it at ITU.

Of particular interest to my own agenda are two papers that show hardness of computing certain graph parameters under the counting exponential time hypothesis (#ETH), introduced in joint work with Dell and Wahlén at the recent ICALP. The first, co-authored with Nina, is about graph reliability and also continues the exploration of the complexity of the Tutte polynomial.

  • Thore Husfeldt and Nina Taslaman. The exponential time complexity of computing the probability that a graph is connected.
  • Christian Hoffmann. Exponential Time Complexity of Weighted Counting of Independent Sets. arXiv:1007.1146.

MADALGO summer school on geometric data structures (2)

It is raining today. As usual.
Yesterday Sariel Har-Peled started his series of talks about data structures for geometric approximation. He covered various topics, such as grids, quadtrees, and a technique for approximately representing the distances between the points of a set (well-separated pair decompositions).
Speaking of grids, it is interesting how this data structure can be used to find the closest pair in a set P of points in the plane. Given P, the data structure can determine in linear time whether the closest-pair distance CP(P) is equal to, smaller than, or larger than a real number \alpha. A grid with cells of size \alpha \times \alpha is built on top of P, and the data structure maintains, for each cell, a linked list of the points falling into that cell. This allows constant-time insertions (using hashing), and each time a new point is added, it can be checked in constant time whether the smallest distance between points drops below \alpha. Since n insertions are needed, the result follows.
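Here is a sketch of the decision version in Python, assuming (as in the lecture's model) constant-time hashing and floor; the function name and representation are mine, and the sketch only decides whether CP(P) < \alpha (the other comparisons are handled analogously). If no pair is closer than \alpha, each \alpha \times \alpha cell can hold at most four points, so every insertion inspects O(1) points and the whole scan runs in O(n) time.

```python
import math

def closest_pair_less_than(points, alpha):
    """Decide whether some pair of `points` is at distance < alpha."""
    grid = {}  # (cell_x, cell_y) -> list of points in that cell
    for (x, y) in points:
        cx, cy = math.floor(x / alpha), math.floor(y / alpha)
        # Any point at distance < alpha lies in the 3x3 block of cells
        # around the new point's cell.
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                for (px, py) in grid.get((cx + dx, cy + dy), ()):
                    if math.hypot(x - px, y - py) < alpha:
                        return True
        grid.setdefault((cx, cy), []).append((x, y))
    return False
```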
Now, using this data structure inside a randomized algorithm, the closest pair of a set can be found in expected linear time, by bounding the expected number of times the closest pair changes as the points are inserted in random order (note that the model of computation assumes constant-time hashing, true randomness, and a constant-time floor function).
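A sketch of that randomized algorithm, again my own reconstruction under the same model assumptions (and assuming distinct points): insert the points in random order, keep a grid whose cell size is the current closest distance delta, and rebuild the grid whenever delta shrinks. By a backwards analysis the i-th insertion shrinks delta with probability O(1/i), a rebuild costs O(i), so the expected total work is O(n).

```python
import math, random

def closest_pair_distance(points):
    """Expected-linear-time closest pair (sketch); points must be distinct."""
    pts = list(points)
    random.shuffle(pts)

    def build(prefix, delta):
        grid = {}
        for (x, y) in prefix:
            cell = (math.floor(x / delta), math.floor(y / delta))
            grid.setdefault(cell, []).append((x, y))
        return grid

    delta = math.hypot(pts[0][0] - pts[1][0], pts[0][1] - pts[1][1])
    grid = build(pts[:2], delta)
    for i in range(2, len(pts)):
        x, y = pts[i]
        cx, cy = math.floor(x / delta), math.floor(y / delta)
        # Any inserted point closer than delta lies in the 3x3 block
        # of cells around the new point.
        best = min(
            (math.hypot(x - px, y - py)
             for dx in (-1, 0, 1) for dy in (-1, 0, 1)
             for (px, py) in grid.get((cx + dx, cy + dy), ())),
            default=delta)
        if best < delta:
            delta = best                      # closest pair changed:
            grid = build(pts[:i + 1], delta)  # rebuild with the new cell size
        else:
            grid.setdefault((cx, cy), []).append((x, y))
    return delta
```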

Today also brought the first lectures by John Iacono, on the point location problem: given a space and a partition of it into polygons, find the polygon containing a query point given as input.

One technique for attacking this problem is Kirkpatrick’s method. Let us look at the method in \mathbb{R}^2, where we have a triangulation of the space. The technique relies on repeatedly removing an independent set of low-degree vertices. This operation punches holes in the triangulation, so the holes have to be retriangulated (and this can be done in constant time per hole in this particular case). We then proceed in the same way on the resulting instance, until we end up with a single, big triangle. To locate a point, we start from that outermost triangle and rebuild the triangulation level by level, keeping track of the triangle the point lies in at each step (this is easy, since the data structure maintains pointers between triangles at consecutive levels). It turns out that the number of steps needed to carry out the search is O(\log n), where n is the number of vertices, while building the structure (as pointed out before, there are pointers that have to be set up) takes O(n) time. Moreover, the algorithm is completely deterministic.
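Here is a sketch in Python of the query side of Kirkpatrick’s structure (the O(n) preprocessing, i.e. the repeated removal of independent sets and the retriangulation of the holes, is omitted; the Node and locate names are mine). Each level removes a constant fraction of the vertices and each triangle overlaps O(1) triangles of the next level, so the walk from the outer triangle down to a leaf visits O(\log n) nodes.

```python
def orient(o, a, b):
    # Twice the signed area of triangle (o, a, b):
    # positive if counterclockwise, negative if clockwise.
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def contains(tri, p):
    # p lies in tri iff it is not strictly outside any of the three edges.
    a, b, c = tri
    signs = [orient(a, b, p), orient(b, c, p), orient(c, a, p)]
    return not (any(s < 0 for s in signs) and any(s > 0 for s in signs))

class Node:
    # One triangle in the hierarchy; `children` are the triangles of the
    # next (finer) level overlapping it. Leaves are triangles of the
    # original triangulation.
    def __init__(self, tri, children=()):
        self.tri = tri
        self.children = list(children)

def locate(root, p):
    # Descend from the single outer triangle to the leaf containing p.
    if not contains(root.tri, p):
        return None  # p is outside the outermost triangle
    node = root
    while node.children:
        node = next(c for c in node.children if contains(c.tri, p))
    return node
```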

Tomorrow is the last day of the summer school; then we head back to København.

MADALGO summer school on geometric data structures

Nina and I are not in the news (well, I am sure only about myself), but we are at MADALGO in Aarhus for the summer school on geometric data structures.
This was the first day of lectures. Speakers: Mihai Patrascu (I cannot write his name with the right letters, I fear) and Timothy Chan.
In the morning, Patrascu introduced some general concepts about range queries, leaving to Chan the task of analyzing some of the techniques in depth. In the afternoon the dimension of the queries increased, while their orthogonality decreased, making the techniques more and more challenging. Moreover, Mihai presented some lower bounds for the predecessor problem (in the morning he had also presented some efficient techniques for the same problem), ending with a nice speech on the beauty of lower bounds and their undeserved bad reputation.

Now, I will make personal use of a public medium: in all of this I managed to break my laptop, so my screen now looks like there has been a phosphorus explosion on its right side…

More details and facts will follow.

Efficient computation in the news

Thore Husfeldt was interviewed on Saturday on Danish national radio about the newly claimed P ≠ NP proof. Sudoku was used as an example of an NP-hard problem. Of course, he could not resist squeezing in an explanation of how efficient computation (in P) affects all of our lives. This will definitely do more good than my own appearance in the news!