Reputation: 33
Given N integers, each of which can be increased or decreased once by at most a given positive integer L. After these operations, any numbers that become equal are counted as one number. The problem is to compute the minimum possible number of distinct integers.
Constraints: N <= 100, L <= 3200, integers are in the range [-32000, 32000]
Example: N = 3, L = 10, numbers: 11 21 27
1) increase 11 by 10 => 21 21 27
2) decrease 27 by 6 => 21 21 21
The answer is 1.
Algorithm in C++:
#include <algorithm>
#include <cstdio>
#include <vector>
using namespace std;

int main() {
    int N, L;
    scanf("%d %d", &N, &L);
    vector<int> v(N);
    for (int i = 0; i < N; ++i) scanf("%d", &v[i]);
    sort(v.begin(), v.end());
    // Greedy: cover the sorted values with intervals of length 2 * L.
    int ans = 0;
    int first = 0;                     // leftmost value of the current interval
    for (int i = 1; i < N; ++i) {
        if (v[i] - v[first] > 2 * L) { // v[i] does not fit the current interval,
            ans++;                     // so close it and
            first = i;                 // start a new one at v[i]
        }
    }
    ans++;                             // count the last, still-open interval
    printf("%d\n", ans);
}
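With the example input (3 and 10 on the first line, then 11 21 27), this prints 1.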
I'm trying to understand why this greedy algorithm is optimal. Any help is appreciated.
Upvotes: 1
Views: 151
Reputation: 65427
Reframed, we're trying to cover the set of numbers that appear in the input with as few intervals of cardinality 2*L + 1 as possible. You can imagine that, for an interval [C - L, C + L], all numbers in it are adjusted to C. In the example above, the single interval [11, 31] with C = 21 covers all three inputs.
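To make the reframing concrete, here is a tiny sketch of my own (fitsOneInterval is an illustrative name, not from the question): a sorted group of numbers can be merged into one value exactly when some center C satisfies max - L <= C <= min + L, i.e. when the group's span is at most 2 * L.

#include <vector>

// Illustrative helper (hypothetical, not from the question's code):
// a sorted group fits one interval [C - L, C + L] exactly when its
// span is at most 2 * L; any C in [max - L, min + L] then works.
bool fitsOneInterval(const std::vector<int>& group, int L) {
    if (group.empty()) return true;
    return group.back() - group.front() <= 2 * L;  // sorted: back() is the max
}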
Given any list of intervals in sorted order, we can show by induction on k that, considering only the first k intervals of each solution, greedy covers at least as much of the input. The base case k = 0 is trivial. For the inductive step, greedy's next interval covers greedy's next uncovered input element and reaches as far right as possible beyond it; the interval in the arbitrary solution that covers that solution's next uncovered input element can start no later than greedy's interval does (by the inductive hypothesis, that element comes no later than greedy's next uncovered element), so it also ends no later, and the arbitrary solution again has no more coverage.
Upvotes: 2