Reputation: 13
I'm writing my own intensity histogram for greyscale images, where the number of bins is passed into the function. This is what I have so far:
std::vector<unsigned int> Image::histogram(const int bins)
{
    std::vector<unsigned int> histogram(bins, 0);
    for (unsigned int i(0); i < bins; i++)
    {
        for (unsigned int j(0); j < m_height * m_width; ++j)
        {
            if (i == m_p_image[j])
            {
                histogram[i]++;
            }
        }
    }
    return histogram;
}
This works perfectly for 256 bins, as every intensity gets its own bin, but for 128 bins it misses the second half of the intensity range (values 128 to 255 are never counted). I know I need to implement a way of grouping intensities together when the number of bins is less than 256, but I'm unsure how to do this.
Upvotes: 1
Views: 462
Reputation: 490108
Your code strikes me as unnecessarily clumsy. There's no real need for the outer loop: as written, you scan the entire image once per bin, when a single pass over the pixels is enough.
To answer the question you asked, however, the usual way to do this would be to use linear interpolation: that is, find the proportional position of a value in the input range, then increment the bin at the same proportional position in the output range.
for (unsigned int j = 0; j < height * width; ++j) {
    // Position of this pixel's intensity as a proportion of [0, 1).
    // Dividing by 256.0 (not 255.0) keeps the result strictly below 1,
    // so the computed bin index never runs past the end of the array.
    double input_pos = image[j] / 256.0;
    // The same proportional position in the output range of bins.
    int output_pos = int(input_pos * bin_count);
    ++histogram[output_pos];
}
Given that these are colors, you could (if you chose to) apply a gamma curve instead of doing a purely linear mapping. The reason to do that would be to model how we actually perceive brightness, instead of just basing the histogram on the raw input numbers. Human vision is roughly logarithmic rather than linear, so a linear histogram (especially if you're using relatively few bins compared to the number of possible input values) doesn't represent what we see very accurately.
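As a rough sketch of that idea, you could remap each intensity through a gamma curve before binning. The gammaHistogram name, the free-function signature, and the 2.2 exponent here are illustrative assumptions, not anything from the code above:

#include <cmath>
#include <cstddef>
#include <vector>

// Sketch: bin by perceptual brightness via a gamma curve rather than
// by raw linear intensity. gamma = 2.2 is a common display value.
std::vector<unsigned int> gammaHistogram(const unsigned char* image,
                                         std::size_t pixel_count,
                                         int bins, double gamma = 2.2)
{
    std::vector<unsigned int> histogram(bins, 0);
    for (std::size_t j = 0; j < pixel_count; ++j)
    {
        double linear = image[j] / 256.0;                  // normalize to [0, 1)
        double perceptual = std::pow(linear, 1.0 / gamma); // still in [0, 1)
        ++histogram[int(perceptual * bins)];
    }
    return histogram;
}

Because the exponent 1/gamma is less than 1, the curve stretches the dark end of the range, so the dark intensities (where the eye distinguishes levels most finely) get spread across proportionally more bins than the bright ones.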
Upvotes: 1