Reputation: 11
Since I think many of us don't have the same edition of "Introduction to Algorithms" by Prof. Cormen et al., I'm going to write out the lemma (and my question) below.
Edmonds-Karp algorithm
Lemma 26.7 (in the 3rd edition; in the 2nd it may be Lemma 26.8): If the Edmonds-Karp algorithm is run on a flow network G = (V, E) with source s and sink t, then for all vertices v in V \ {s,t}, the shortest-path distance df(s,v) in the residual network Gf increases monotonically with each flow augmentation.
Proof: First, suppose that for some vertex v in V \ {s,t}, there is a flow augmentation that causes the shortest-path distance from s to v to decrease; we will then derive a contradiction. Let f be the flow just before the first augmentation that decreases some shortest-path distance, and let f' be the flow just afterward. Let v be the vertex with the minimum df'(s,v) whose distance was decreased by the augmentation, so that df'(s,v) < df(s,v). Let p = s ~~> u -> v be a shortest path from s to v in Gf', so that (u,v) in Ef' and
df'(s,u) = df'(s,v) - 1. (26.12)
Because of how we chose v, we know that the distance of vertex u from source s did not decrease, i.e.
df'(s,u) >= df(s,u). (26.13)
...
My question is: I don't really understand the phrase
"Because of how we chose v, we know that the distance of vertex u from soruce s did not decrease, i.e. df'(s,u) >= df(s,u). (26.13)"
How does the way we chose v affect the property that "the distance of vertex u from s did not decrease"? How can I derive inequality (26.13)?
We know that u is a vertex on the shortest path from s to v, and (u,v) is an edge on that path. Why can the distance from s to u not decrease as well?
Thank you all for your help.
Upvotes: 1
Views: 425
Reputation: 11
My answer may be drawn out, but hopefully it helps for an all-around understanding.
For some history, note that the Ford-Fulkerson algorithm came first. Ford-Fulkerson simply selects any path from the source to the sink, pushes as much flow along it as its residual capacities allow, then updates the residual graph accordingly. Since the path that is selected could hypothetically be anything, there are scenarios where this approach takes 'forever' (figuratively, and literally if the edge capacities are allowed to be irrational) to actually terminate.
Edmonds-Karp does the same thing as Ford-Fulkerson, only it chooses the 'shortest' path, which can be found via a breadth-first search (BFS).
BFS guarantees a certain (partial) ordering among the traversed vertices. For example, consider the graph A -> B -> C: BFS guarantees that B will be traversed before C. (You should be able to generalize this argument to more sophisticated graphs, an exercise I leave to you.) For the remainder of this post, let "n" denote the number of levels it takes BFS to reach the target node. So if we were searching for node C in the example above, n = 2.
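As a concrete illustration of that level ordering, here is a small Python sketch (the graph representation and names are just for this example, not anything from the book) that records the BFS level, i.e. the number of edges from the source, at which each node is first reached:

```python
from collections import deque

def bfs_levels(graph, source):
    """Return the BFS level (edge count from the source) of every reachable node."""
    level = {source: 0}
    queue = deque([source])
    while queue:
        node = queue.popleft()
        for neighbor in graph.get(node, []):
            if neighbor not in level:          # first time we reach this node
                level[neighbor] = level[node] + 1
                queue.append(neighbor)
    return level

# The A -> B -> C example from above: B is reached at level 1, C at level 2.
graph = {'A': ['B'], 'B': ['C'], 'C': []}
print(bfs_levels(graph, 'A'))   # {'A': 0, 'B': 1, 'C': 2}
```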
Edmonds-Karp behaves like Ford-Fulkerson, except it guarantees that the shortest paths are chosen first. When Edmonds-Karp updates the residual graph, we know that only nodes at a level equal to or smaller than n have actually been traversed. Similarly, only edges between nodes within the first n levels could possibly have been updated in the residual graph.
I'm pretty sure that the 'how we chose v' reflects the ordering that BFS guarantees, since the added residual edges necessarily flow in the opposite direction of any selected path. If the residual edges were to create a shorter path, then it would have been possible to find a shorter path than n in the first place, because the residual edges are only created when a path to the target node has been found and BFS guarantees that the shortest such path has already been found.
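To make the "Ford-Fulkerson, but always along a shortest path" idea concrete, here is a minimal Edmonds-Karp sketch in Python. It assumes an adjacency-matrix capacity representation (my own choice for brevity, not the book's): BFS finds a fewest-edge augmenting path, and each augmentation adds the reverse residual edges discussed above:

```python
from collections import deque

def edmonds_karp(capacity, source, sink):
    """Max flow via shortest (fewest-edge) augmenting paths found by BFS.
    `capacity` is an n x n matrix of residual capacities; it is modified in place."""
    n = len(capacity)
    max_flow = 0
    while True:
        # BFS on the residual graph to find a shortest s -> t path.
        parent = [-1] * n
        parent[source] = source
        queue = deque([source])
        while queue and parent[sink] == -1:
            u = queue.popleft()
            for v in range(n):
                if capacity[u][v] > 0 and parent[v] == -1:
                    parent[v] = u
                    queue.append(v)
        if parent[sink] == -1:              # no augmenting path left: done
            return max_flow
        # Find the bottleneck capacity along the path.
        bottleneck = float('inf')
        v = sink
        while v != source:
            u = parent[v]
            bottleneck = min(bottleneck, capacity[u][v])
            v = u
        # Augment: decrease forward capacities, increase reverse (residual) edges.
        v = sink
        while v != source:
            u = parent[v]
            capacity[u][v] -= bottleneck
            capacity[v][u] += bottleneck
            v = u
        max_flow += bottleneck

# Tiny example: source = 0, sink = 3; the maximum flow here is 4.
caps = [[0, 3, 2, 0],
        [0, 0, 1, 2],
        [0, 0, 0, 2],
        [0, 0, 0, 0]]
print(edmonds_karp(caps, 0, 3))   # 4
```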
Hope this helps and at least gives some insight.
Upvotes: 1
Reputation: 1
I don't quite understand it fully either, but I think "how we chose v" here means that v is chosen as the vertex with the minimum new distance df'(s,v) among all vertices whose distance from s was decreased by the augmentation. Since u is strictly closer to s than v in Gf' (by (26.12)), u cannot be one of those vertices, so u's distance from s did not decrease.
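One way to spell that step out, using the same df / df' notation as above (my paraphrase, not the book's exact wording):

```latex
% v is chosen with the minimum d_{f'}(s,v) among all vertices whose
% shortest-path distance from s decreased, i.e. among all w with
% d_{f'}(s,w) < d_f(s,w).
\[
  d_{f'}(s,u) \;=\; d_{f'}(s,v) - 1 \;<\; d_{f'}(s,v) \qquad \text{by (26.12)}
\]
% So if u's distance had also decreased, u would be a "decreased" vertex with a
% strictly smaller d_{f'} value than v, contradicting the choice of v.
% Therefore d_{f'}(s,u) >= d_f(s,u), which is exactly (26.13).
```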
Upvotes: 0