Reputation: 39
How would I prove that the constraints in line (35.19) are redundant in this linear program, in the sense that if we remove them from the program in lines (35.17) – (35.20), any optimal solution to the linear program must still satisfy x(v) ≤ 1 for each v ∈ V?
I think I need to use the relaxed version of the linear program, but I am not sure. Beyond that, I am not sure how I would prove this.
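For reference, here is the program as I understand it (CLRS Section 35.4, the LP relaxation of minimum-weight vertex cover; writing it out in case my reading of (35.17)–(35.20) is off):

```latex
% My reading of (35.17)-(35.20); please correct me if this is not the program meant.
\begin{align*}
  \text{minimize}\quad   & \sum_{v \in V} w(v)\,x(v)                              && \text{(35.17)}\\
  \text{subject to}\quad & x(u) + x(v) \ge 1 \quad \text{for each } (u,v) \in E   && \text{(35.18)}\\
                         & x(v) \le 1 \quad \text{for each } v \in V              && \text{(35.19)}\\
                         & x(v) \ge 0 \quad \text{for each } v \in V              && \text{(35.20)}
\end{align*}
```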
Upvotes: 0
Views: 430
Reputation: 5388
Your claim is (almost) true assuming that the weights w are non-negative (otherwise, if some w_v is negative, then removing x_v <= 1 will allow x_v to go to infinity, and the problem will be unbounded).
Now, assume that in an optimal solution some variable has x_v = 1 + ε, with ε > 0. If we change this solution to one in which x_v = 1, it remains feasible: every edge constraint x_u + x_v >= 1 involving v is still satisfied, because x_v = 1 alone already makes the left-hand side at least 1 (recall x_u >= 0), and all other constraints are untouched. The objective value decreases by w_v · ε >= 0, so it is no worse than before, and the modified solution is also optimal.
This proves that there exist optimal solutions in which x_v <= 1 (if bounded optimal solutions exist at all).
Thus, while it is not true that every optimal solution has x_v <= 1, it is true that every finite optimal solution can be converted to one with x_v <= 1 without loss of generality, and in this sense the constraints are redundant.
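In symbols (a sketch, assuming the objective is to minimize the weighted sum of the x_v as in (35.17)):

```latex
% Sketch of the exchange argument, assuming the objective is \min \sum_v w_v x_v.
Let $x^*$ be optimal with $x^*_v = 1 + \varepsilon$ for some $\varepsilon > 0$,
and define $x'$ by $x'_v = 1$ and $x'_u = x^*_u$ for all $u \ne v$.
For every edge $(u, v) \in E$,
\[
  x'_u + x'_v = x^*_u + 1 \ge 1,
\]
so $x'$ is feasible, and
\[
  \sum_{u} w_u x'_u = \sum_{u} w_u x^*_u - w_v \varepsilon \le \sum_{u} w_u x^*_u,
\]
since $w_v \ge 0$; hence $x'$ is also optimal.
```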
For reasons I do not consider here, regular solvers will most likely return a solution with x_v <= 1 anyway, because of the way they work (they return basic solutions).
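As a quick illustrative check (a sketch only; the example graph, the weights, and the use of scipy.optimize.linprog are my own assumptions, not part of your question), solving the relaxation with and without the x_v <= 1 bounds on a small instance gives the same optimum:

```python
# Sketch: compare the vertex-cover LP relaxation with and without the
# x_v <= 1 upper bounds on a small example graph (illustrative only).
from scipy.optimize import linprog

# Example graph: vertices 0..3, edges as pairs, non-negative weights (assumed).
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
w = [2.0, 1.0, 3.0, 1.0]
n = len(w)

# Edge constraints x_u + x_v >= 1, rewritten as -x_u - x_v <= -1 for linprog.
A_ub = []
for u, v in edges:
    row = [0.0] * n
    row[u] = row[v] = -1.0
    A_ub.append(row)
b_ub = [-1.0] * len(edges)

# Solve once with upper bounds (0 <= x_v <= 1) and once without them (0 <= x_v).
with_ub = linprog(w, A_ub=A_ub, b_ub=b_ub, bounds=[(0, 1)] * n, method="highs")
without_ub = linprog(w, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * n, method="highs")

print(with_ub.fun, with_ub.x)        # optimal value and a solution with x_v <= 1
print(without_ub.fun, without_ub.x)  # same optimal value, x_v <= 1 anyway
```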
I hope this helps.
Upvotes: 0
Reputation: 61289
You're already looking at a relaxed version of the program, otherwise your variables would be constrained to be members of a discrete set (such as {0, 1}).
Likely you need to consider the dual of this program, which will convert >= constraints into <= constraints.
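For example, the dual of the relaxation without the upper bounds is the fractional matching LP (a sketch; the dual variables y_e, one per edge, are my own naming):

```latex
% Sketch of the dual, assuming the primal is
%   minimize \sum_v w(v) x(v)  subject to  x(u) + x(v) >= 1 for each (u,v) in E,  x(v) >= 0.
% One dual variable y_e per edge; the naming is illustrative.
\begin{align*}
  \text{maximize}\quad   & \sum_{e \in E} y_e \\
  \text{subject to}\quad & \sum_{e:\, v \in e} y_e \le w(v) && \text{for each } v \in V\\
                         & y_e \ge 0                        && \text{for each } e \in E
\end{align*}
```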
Upvotes: 0