Reputation: 3266
I'm developing a routine for automatic enhancement of scanned 35 mm slides. I'm looking for a good algorithm for increasing contrast and removing color cast. The algorithm will have to be completely automatic, since there will be thousands of images to process. These are a couple of sample images straight from the scanner, only cropped and downsized for web:
I'm using the AForge.NET library and have tried both the HistogramEqualization and ContrastStretch filters. HistogramEqualization is good for maximizing local contrast but does not produce pleasing results overall. ContrastStretch is way better, but since it stretches the histogram of each color band individually, it sometimes produces a strong color cast:
To reduce the color shift, I created a UniformContrastStretch filter myself using the ImageStatistics and LevelsLinear classes. This uses the same range for all color bands, preserving the colors at the expense of less contrast.
ImageStatistics stats = new ImageStatistics(image);
int min = Math.Min(Math.Min(stats.Red.Min, stats.Green.Min), stats.Blue.Min);
int max = Math.Max(Math.Max(stats.Red.Max, stats.Green.Max), stats.Blue.Max);
LevelsLinear levelsLinear = new LevelsLinear();
levelsLinear.Input = new IntRange(min, max);
Bitmap stretched = levelsLinear.Apply(image);
The image is still quite blue though, so I created a ColorCorrection filter that first calculates the mean luminance of the image. A gamma correction value is then calculated for each color channel, so that the mean value of each color channel will equal the mean luminance. The uniform contrast stretched image has mean values R=70 G=64 B=93, the mean luminance being (70 + 64 + 93) / 3 = 76. The gamma values work out to R=1.09 G=1.18 B=0.80, and the resulting, very neutral, image has mean values of R=76 G=76 B=76 as expected:
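For illustration, here is a minimal NumPy sketch of this kind of gray-world gamma balancing (my own approximation, not the AForge.NET filter; applying a power curve shifts the mean nonlinearly, so hitting the target mean exactly needs an iterative solve, and AForge's LevelsLinear gamma convention appears to be the reciprocal of the plain exponent used here):

```python
import numpy as np

def gray_world_gamma(image):
    """Estimate per-channel gammas that pull each channel's mean
    toward the mean luminance, then apply them.

    image: uint8 array of shape (H, W, 3).
    """
    img = image.astype(np.float64) / 255.0
    means = img.reshape(-1, 3).mean(axis=0)   # per-channel means
    target = means.mean()                     # mean luminance
    # First-order estimate: solve mean**gamma == target per channel.
    gammas = np.log(target) / np.log(means)
    corrected = np.power(img, gammas)         # broadcasts over channels
    return (corrected * 255.0 + 0.5).astype(np.uint8), gammas
```

Note that here gamma < 1 brightens a channel, since the exponent is applied directly.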
Now, getting to the real problem... I suppose correcting the mean color of the image to grey is a bit too drastic and will make some images quite dull in appearance, like the second sample (first image is uniform stretched, next is the same image color corrected):
One way to perform color correction manually in a photo editing program is to sample the color of a known neutral color (white/grey/black) and adjust the rest of the image to that. But since this routine has to be completely automatic, that is not an option.
I guess I could add a strength setting to my ColorCorrection filter, so that a strength of 0.5 will move the mean values half the distance to the mean luminance. But on the other hand, some images might do best without any color correction at all.
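A hedged sketch of such a strength setting (my own guess at an implementation, not an AForge.NET filter): move each channel's target only part of the way toward the mean luminance, then estimate per-channel gammas from the channel means.

```python
import numpy as np

def partial_color_correction(image, strength=0.5):
    """Pull each channel's mean a fraction `strength` of the way
    toward the mean luminance (0 = no change, 1 = full gray-world).

    image: uint8 array of shape (H, W, 3).
    """
    img = image.astype(np.float64) / 255.0
    means = img.reshape(-1, 3).mean(axis=0)
    lum = means.mean()
    targets = means + strength * (lum - means)   # partial targets
    gammas = np.log(targets) / np.log(means)     # first-order estimate
    return (np.power(img, gammas) * 255.0 + 0.5).astype(np.uint8)
```

With strength 0 the image passes through unchanged, so the "no correction needed" case falls out of the same code path.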
Any ideas for a better algorithm? Or some method to detect whether an image has a color cast, or simply contains a lot of one color, like the second sample?
Upvotes: 31
Views: 4112
Reputation: 4151
You can try auto brightness and contrast from this link: http://answers.opencv.org/question/75510/how-to-make-auto-adjustmentsbrightness-and-contrast-for-image-android-opencv-image-correction/
void Utils::BrightnessAndContrastAuto(const cv::Mat &src, cv::Mat &dst, float clipHistPercent)
{
    CV_Assert(clipHistPercent >= 0);
    CV_Assert((src.type() == CV_8UC1) || (src.type() == CV_8UC3) || (src.type() == CV_8UC4));

    int histSize = 256;
    float alpha, beta;
    double minGray = 0, maxGray = 0;

    // to calculate the grayscale histogram
    cv::Mat gray;
    if (src.type() == CV_8UC1) gray = src;
    else if (src.type() == CV_8UC3) cvtColor(src, gray, CV_BGR2GRAY);
    else if (src.type() == CV_8UC4) cvtColor(src, gray, CV_BGRA2GRAY);

    if (clipHistPercent == 0)
    {
        // keep full available range
        cv::minMaxLoc(gray, &minGray, &maxGray);
    }
    else
    {
        cv::Mat hist; // the grayscale histogram

        float range[] = { 0, 256 };
        const float* histRange = { range };
        bool uniform = true;
        bool accumulate = false;
        calcHist(&gray, 1, 0, cv::Mat(), hist, 1, &histSize, &histRange, uniform, accumulate);

        // calculate cumulative distribution from the histogram
        std::vector<float> accumulator(histSize);
        accumulator[0] = hist.at<float>(0);
        for (int i = 1; i < histSize; i++)
        {
            accumulator[i] = accumulator[i - 1] + hist.at<float>(i);
        }

        // locate points that cut at the required value
        float max = accumulator.back();
        clipHistPercent *= (max / 100.0); // convert percent to an absolute pixel count
        clipHistPercent /= 2.0;           // split between left and right wings

        // locate left cut
        minGray = 0;
        while (accumulator[minGray] < clipHistPercent)
            minGray++;

        // locate right cut
        maxGray = histSize - 1;
        while (accumulator[maxGray] >= (max - clipHistPercent))
            maxGray--;
    }

    // current range
    float inputRange = maxGray - minGray;

    alpha = (histSize - 1) / inputRange; // alpha expands current range to histSize range
    beta = -minGray * alpha;             // beta shifts current range so that minGray will go to 0

    // Apply brightness and contrast normalization
    // convertTo operates with saturate_cast
    src.convertTo(dst, -1, alpha, beta);

    // restore alpha channel from source
    if (dst.type() == CV_8UC4)
    {
        int from_to[] = { 3, 3 };
        cv::mixChannels(&src, 4, &dst, 1, from_to, 1);
    }
    return;
}
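For reference, the clipped-percentile stretch above can be prototyped in a few lines of NumPy (a hedged sketch mirroring the C++ logic, with an added guard against a zero input range; not a drop-in replacement):

```python
import numpy as np

def brightness_contrast_auto(gray, clip_percent=1.0):
    """Linear stretch of an 8-bit grayscale image: clip `clip_percent`
    percent of the pixels, split between both histogram tails, and map
    the remaining range to 0..255 (out = alpha * in + beta)."""
    if clip_percent == 0:
        min_gray, max_gray = int(gray.min()), int(gray.max())
    else:
        cdf = np.bincount(gray.ravel(), minlength=256).cumsum()
        clip = cdf[-1] * clip_percent / 100.0 / 2.0           # pixels per tail
        min_gray = int(np.searchsorted(cdf, clip))            # left cut
        max_gray = int(np.searchsorted(cdf, cdf[-1] - clip)) - 1  # right cut
    alpha = 255.0 / max(max_gray - min_gray, 1)               # guard zero range
    beta = -min_gray * alpha
    out = np.clip(alpha * gray.astype(np.float64) + beta, 0, 255)
    return (out + 0.5).astype(np.uint8)
```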
Or apply Auto Color Balance from this link: http://www.morethantechnical.com/2015/01/14/simplest-color-balance-with-opencv-wcode/
void Utils::SimplestCB(Mat& in, Mat& out, float percent) {
    assert(in.channels() == 3);
    assert(percent > 0 && percent < 100);
    float half_percent = percent / 200.0f;

    vector<Mat> tmpsplit; split(in, tmpsplit);
    for (int i = 0; i < 3; i++) {
        // find the low and high percentile values (based on the input percentile)
        Mat flat; tmpsplit[i].reshape(1, 1).copyTo(flat);
        cv::sort(flat, flat, CV_SORT_EVERY_ROW + CV_SORT_ASCENDING);
        int lowval = flat.at<uchar>(cvFloor(((float)flat.cols) * half_percent));
        int highval = flat.at<uchar>(cvCeil(((float)flat.cols) * (1.0 - half_percent)));
        cout << lowval << " " << highval << endl;

        // saturate below the low percentile and above the high percentile
        tmpsplit[i].setTo(lowval, tmpsplit[i] < lowval);
        tmpsplit[i].setTo(highval, tmpsplit[i] > highval);

        // scale the channel
        normalize(tmpsplit[i], tmpsplit[i], 0, 255, NORM_MINMAX);
    }
    merge(tmpsplit, out);
}
Or apply CLAHE (Contrast Limited Adaptive Histogram Equalization) to the BGR image.
Upvotes: 2
Reputation: 606
I needed to do the same thing over a big library of video thumbnails. I wanted a solution that would be conservative, so that I didn't have to spot check for thumbnails getting completely trashed. Here's the messy, hacked-together solution I used.
I used the following class to calculate the distribution of colors in an image. I first wrote one in HSV colorspace, but found a grayscale-based one was way faster and almost as good:
class GrayHistogram
  def initialize(filename)
    @hist = hist(filename)
    @percentile = {}
  end

  def percentile(x)
    return @percentile[x] if @percentile[x]
    bin = @hist.find{ |h| h[:count] > x }
    c = bin[:color]
    return @percentile[x] ||= c/256.0
  end

  def midpoint
    (percentile(0.25) + percentile(0.75)) / 2.0
  end

  def spread
    percentile(0.75) - percentile(0.25)
  end

  private

  def hist(imgFilename)
    histFilename = "/tmp/gray_hist.txt"
    safesystem("convert #{imgFilename} -depth 8 -resize 50% -colorspace GRAY /tmp/out.png")
    safesystem("convert /tmp/out.png -define histogram:unique-colors=true " +
               " -format \"%c\" histogram:info:- > #{histFilename}")
    f = File.open(histFilename)
    lines = f.readlines[0..-2] # the last line is always blank
    hist = lines.map { |line| { :count => /([0-9]*):/.match(line)[1].to_i,
                                :color => /,([0-9]*),/.match(line)[1].to_i } }
    f.close

    tot = 0
    cumhist = hist.map do |h|
      tot += h[:count]
      {:count=>tot, :color=>h[:color]}
    end
    tot = tot.to_f
    cumhist.each { |h| h[:count] = h[:count] / tot }

    safesystem("rm /tmp/out.png #{histFilename}")
    return cumhist
  end
end
I then created this class to use the histogram to figure out how to correct an image:
def safesystem(str)
  out = `#{str}`
  if $? != 0
    puts "shell command failed:"
    puts "\tcmd: #{str}"
    puts "\treturn code: #{$?}"
    puts "\toutput: #{out}"
    raise
  end
end

def generateHist(thumb, hist)
  safesystem("convert #{thumb} histogram:hist.jpg && mv hist.jpg #{hist}")
end

class ImgCorrector
  def initialize(filename)
    @filename = filename
    @grayHist = GrayHistogram.new(filename)
  end

  def flawClass
    if !@flawClass
      gapLeft  = (@grayHist.percentile(0.10) > 0.13) || (@grayHist.percentile(0.25) > 0.30)
      gapRight = (@grayHist.percentile(0.75) < 0.60) || (@grayHist.percentile(0.90) < 0.80)
      return (@flawClass="low"   ) if (!gapLeft &&  gapRight)
      return (@flawClass="high"  ) if ( gapLeft && !gapRight)
      return (@flawClass="narrow") if ( gapLeft &&  gapRight)
      return (@flawClass="fine"  )
    end
    return @flawClass
  end

  def percentileSummary
    [ @grayHist.percentile(0.10),
      @grayHist.percentile(0.25),
      @grayHist.percentile(0.75),
      @grayHist.percentile(0.90) ].map{ |x| (((x*100.0*10.0).round)/10.0).to_s }.join(', ') +
      "<br />" +
      "spread: " + @grayHist.spread.to_s
  end

  def writeCorrected(filenameOut)
    if flawClass=="fine"
      safesystem("cp #{@filename} #{filenameOut}")
      return
    end

    # spread out the histogram, centered at the midpoint
    midpt = 100.0*@grayHist.midpoint

    # map the histogram's spread to a sigmoidal concept (linearly)
    minSpread = 0.10
    maxSpread = 0.60
    minS = 1.0
    maxS = case flawClass
           when "low"    then 5.0
           when "high"   then 5.0
           when "narrow" then 6.0
           end
    s = ((1.0 - [[(@grayHist.spread - minSpread)/(maxSpread-minSpread), 0.0].max, 1.0].min) * (maxS - minS)) + minS
    #puts "s: #{s}"
    safesystem("convert #{@filename} -sigmoidal-contrast #{s},#{midpt}% #{filenameOut}")
  end
end
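The ImageMagick call `-sigmoidal-contrast #{s},#{midpt}%` at the end maps pixel values through an S-curve of steepness s centered at the midpoint. A rough NumPy sketch of that kind of curve (a normalized logistic of my own choosing, not ImageMagick's exact formula):

```python
import numpy as np

def sigmoidal_contrast(x, strength, midpoint):
    """S-curve contrast on values in [0, 1] with steepness `strength`,
    centered at `midpoint` in [0, 1]. Normalized so 0 -> 0 and 1 -> 1."""
    def f(v):
        return 1.0 / (1.0 + np.exp(strength * (midpoint - v)))
    lo, hi = f(0.0), f(1.0)
    return (f(x) - lo) / (hi - lo)
```

Larger `strength` pushes midtones apart and compresses the tails, which is why the Ruby code above maps a narrow histogram spread to a larger s.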
I ran it like so:
origThumbs = `find thumbs | grep jpg`.split("\n")
origThumbs.each do |origThumb|
  newThumb = origThumb.gsub(/thumb/, "newthumb")
  imgCorrector = ImgCorrector.new(origThumb)
  imgCorrector.writeCorrected(newThumb)
end
Upvotes: 1
Reputation: 679
Convert your RGB to HSL using this:
System.Drawing.Color color = System.Drawing.Color.FromArgb(red, green, blue);
float hue = color.GetHue();
float saturation = color.GetSaturation();
float lightness = color.GetBrightness();
Adjust your saturation and lightness accordingly, then convert back to RGB. Note that the converter below works in HSV rather than HSL, so its S and V values are defined slightly differently from .NET's GetSaturation and GetBrightness:
/// <summary>
/// Convert HSV to RGB
/// h is from 0-360
/// s,v values are 0-1
/// r,g,b values are 0-255
/// Based upon http://ilab.usc.edu/wiki/index.php/HSV_And_H2SV_Color_Space#HSV_Transformation_C_.2F_C.2B.2B_Code_2
/// </summary>
void HsvToRgb(double h, double S, double V, out int r, out int g, out int b)
{
    // ######################################################################
    // T. Nathan Mundhenk
    // [email protected]
    // C/C++ Macro HSV to RGB
    double H = h;
    while (H < 0) { H += 360; };
    while (H >= 360) { H -= 360; };

    double R, G, B;
    if (V <= 0)
    {
        R = G = B = 0;
    }
    else if (S <= 0)
    {
        R = G = B = V;
    }
    else
    {
        double hf = H / 60.0;
        int i = (int)Math.Floor(hf);
        double f = hf - i;
        double pv = V * (1 - S);
        double qv = V * (1 - S * f);
        double tv = V * (1 - S * (1 - f));
        switch (i)
        {
            // Red is the dominant color
            case 0:
                R = V;
                G = tv;
                B = pv;
                break;
            // Green is the dominant color
            case 1:
                R = qv;
                G = V;
                B = pv;
                break;
            case 2:
                R = pv;
                G = V;
                B = tv;
                break;
            // Blue is the dominant color
            case 3:
                R = pv;
                G = qv;
                B = V;
                break;
            case 4:
                R = tv;
                G = pv;
                B = V;
                break;
            // Red is the dominant color
            case 5:
                R = V;
                G = pv;
                B = qv;
                break;
            // Just in case we overshoot on our math by a little, we put these here.
            // Since its a switch it won't slow us down at all to put these here.
            case 6:
                R = V;
                G = tv;
                B = pv;
                break;
            case -1:
                R = V;
                G = pv;
                B = qv;
                break;
            // The color is not defined, we should throw an error.
            default:
                //LFATAL("i Value error in Pixel conversion, Value is %d", i);
                R = G = B = V; // Just pretend its black/white
                break;
        }
    }
    r = Clamp((int)(R * 255.0));
    g = Clamp((int)(G * 255.0));
    b = Clamp((int)(B * 255.0));
}

/// <summary>
/// Clamp a value to 0-255
/// </summary>
int Clamp(int i)
{
    if (i < 0) return 0;
    if (i > 255) return 255;
    return i;
}
Original Code:
Upvotes: 2
Reputation: 311
There is no AForge.NET code here, because the images were processed by PHP prototype code, but AFAIK there is no problem doing the same with AForge.NET. The results are:
Upvotes: 2
Reputation: 9899
In order to avoid changing the color of your image when stretching the contrast, convert it first to HSV/HSL color space. Then apply regular contrast stretching to the L or V channel, but do not change the H or S channels.
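As a minimal illustration of that approach (a sketch using Python's standard colorsys module, not production code): stretch only the V channel and leave H and S alone.

```python
import colorsys

def stretch_value_channel(pixels):
    """Contrast-stretch the V channel of a list of (r, g, b) tuples
    (0-255 ints) in HSV space, keeping hue and saturation fixed."""
    hsv = [colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
           for (r, g, b) in pixels]
    vmin = min(v for (_, _, v) in hsv)
    vmax = max(v for (_, _, v) in hsv)
    span = (vmax - vmin) or 1.0                  # guard flat images
    out = []
    for (h, s, v) in hsv:
        v2 = (v - vmin) / span                   # stretch V to 0..1
        r, g, b = colorsys.hsv_to_rgb(h, s, v2)
        out.append((round(r * 255), round(g * 255), round(b * 255)))
    return out
```

Because H and S are carried through untouched, the stretched image keeps its original hues instead of picking up a cast from per-channel stretching.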
Upvotes: 1