Joris Kinable

Reputation: 2411

Runtime analysis of Clarkson's 2-approximation algorithm for Weighted Vertex Cover

A well-known 2-approximation algorithm for the Minimum Weighted Vertex Cover problem is the one proposed by Clarkson:

Clarkson, Kenneth L. "A modification of the greedy algorithm for vertex cover." Information Processing Letters 16.1 (1983): 23-25.

Easy-to-read pseudocode of the algorithm can be found here (see Section 32.1.2). According to the paper, the algorithm has a runtime complexity of O(|E|*log|V|), where E is the set of edges and V the set of vertices. I'm not entirely sure how they arrive at this result.

Let d(v) be the degree of vertex v in the graph, and let w(v) be its weight. Excluding some technicalities, the algorithm looks like this:

while( |E| != 0){ //While there are still edges in the graph
    Pick a vertex v \in V for which w(v)/d(v) is minimized;
    for( u : (u,v) \in E){
        update w(u);
        ...
    }
    delete v and all edges incident to it from the graph.
}

The outer loop accounts for the |E| term in the runtime complexity. That would mean that picking, out of all remaining vertices, the vertex which minimizes the ratio w(v)/d(v) can be done in O(log |V|) time. As far as I can tell, finding the minimum of a list of n values takes n-1 comparisons, not log n. Finally, the inner for loop runs once for every neighbor of v, so it costs O(d(v)), which is dominated by n-1. Hence I would conclude that the algorithm has a runtime complexity of O(|E|*|V|). What am I missing here?
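To make my reasoning concrete, here is a rough Java sketch of the straightforward implementation I have in mind. The class name and the adjacency-map representation are my own, and I've written out the weight update as w(u) -= w(v)/d(v), which is how I read Clarkson's paper; the pseudocode above leaves those details out.

import java.util.*;

class NaiveWeightedVertexCover {
    // adjacency: mutable undirected adjacency sets; weight: w(v) per vertex (both are modified)
    static Set<Integer> cover(Map<Integer, Set<Integer>> adjacency, Map<Integer, Double> weight) {
        Set<Integer> cover = new HashSet<>();
        // While there are still edges in the graph
        while (adjacency.values().stream().anyMatch(s -> !s.isEmpty())) {
            // Linear scan over all remaining vertices: n - 1 comparisons, i.e. O(|V|) per iteration
            int best = -1;
            double bestRatio = Double.POSITIVE_INFINITY;
            for (Map.Entry<Integer, Set<Integer>> e : adjacency.entrySet()) {
                int d = e.getValue().size();
                if (d == 0) continue;
                double ratio = weight.get(e.getKey()) / d;
                if (ratio < bestRatio) { bestRatio = ratio; best = e.getKey(); }
            }
            cover.add(best);
            // Inner loop: d(v) iterations, one per edge incident to the chosen vertex
            for (int u : adjacency.get(best)) {
                weight.put(u, weight.get(u) - bestRatio); // "update w(u)": w(u) -= w(v)/d(v)
                adjacency.get(u).remove(best);            // delete edge (u, v) on u's side
            }
            adjacency.remove(best);                       // delete v and its incident edges
        }
        return cover;
    }
}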

Upvotes: 0

Views: 505

Answers (1)

David Eisenstat

Reputation: 65488

Keep the vertices in a balanced binary search tree ordered by w(v)/d(v). Finding the minimum is O(log |V|). Each time we delete an edge uv, we have to update u's key (by removing it and reinserting it into the tree with the new key), which takes O(log |V|) time. Each of these steps is done at most |E| times, which gives the O(|E| log |V|) bound.
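For illustration, here is a minimal Java sketch of this approach, assuming the same adjacency-map representation as in the question. The Node record and all names are mine, not from the answer or the paper; java.util.TreeSet is a red-black tree, so pollFirst(), add() and remove() each take O(log |V|).

import java.util.*;

class BstWeightedVertexCover {
    // A node's key is frozen while it sits in the tree; updates go remove -> recompute -> reinsert.
    private record Node(int vertex, double key) {}

    static Set<Integer> cover(Map<Integer, Set<Integer>> adjacency, Map<Integer, Double> weight) {
        // Balanced BST ordered by key = w(v)/d(v), ties broken by vertex id
        TreeSet<Node> tree = new TreeSet<>((a, b) -> {
            int c = Double.compare(a.key(), b.key());
            return c != 0 ? c : Integer.compare(a.vertex(), b.vertex());
        });
        Map<Integer, Node> inTree = new HashMap<>();
        for (Map.Entry<Integer, Set<Integer>> e : adjacency.entrySet()) {
            if (!e.getValue().isEmpty()) {
                Node n = new Node(e.getKey(), weight.get(e.getKey()) / e.getValue().size());
                tree.add(n);
                inTree.put(e.getKey(), n);
            }
        }

        Set<Integer> cover = new HashSet<>();
        while (!tree.isEmpty()) {                         // tree is empty exactly when |E| == 0
            Node min = tree.pollFirst();                  // O(log |V|): vertex minimizing w(v)/d(v)
            int v = min.vertex();
            double ratio = min.key();
            inTree.remove(v);
            cover.add(v);
            for (int u : adjacency.get(v)) {              // delete every edge (u, v)
                tree.remove(inTree.get(u));               // O(log |V|): remove u under its old key
                weight.put(u, weight.get(u) - ratio);     // weight update (w(u) -= w(v)/d(v))
                adjacency.get(u).remove(v);
                int d = adjacency.get(u).size();
                if (d > 0) {                              // reinsert u with its new key: O(log |V|)
                    Node nu = new Node(u, weight.get(u) / d);
                    tree.add(nu);
                    inTree.put(u, nu);
                } else {
                    inTree.remove(u);                     // u has become isolated
                }
            }
            adjacency.remove(v);
        }
        return cover;
    }
}

A PriorityQueue with lazy deletion of stale entries would also work, but the remove-and-reinsert pattern above matches the balanced-BST description most directly.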

Upvotes: 2
