Tom

Reputation: 464

combined Scharr derivatives in opencv

I have a few questions regarding Scharr derivatives and their OpenCV implementation.

I am interested in second-order image derivatives with (3x3) kernels. I started with the Sobel second derivative, which failed to find some thin lines in the images. After reading the Sobel and Scharr comparison at the bottom of this page, I decided to try Scharr instead by changing this line:

Sobel(gray, grad, ddepth, 2, 2, 3, scale, delta, BORDER_DEFAULT);

to this line:

Scharr(img, gray, ddepth, 2, 2, scale, delta, BORDER_DEFAULT );

My problem is that cv::Scharr seems to allow computing only a first-order derivative along one axis at a time, so I get the following error:

error: (-215) dx >= 0 && dy >= 0 && dx+dy == 1 in function getScharrKernels

(see assertion line here)

Following this restriction, I have a few questions regarding Scharr derivatives:

  1. Is it considered bad practice to use high-order Scharr derivatives? Why did OpenCV choose to assert dx+dy == 1?
  2. If I call Scharr once per axis, what is the correct way to combine the results? I am currently using:

    addWeighted( abs_grad_x, 0.5, abs_grad_y, 0.5, 0, grad );

    but I am not sure that this is how the Sobel function combines the two axes, or in what order it should be done for all 4 derivatives.

  3. If I am to compute the (dx=2,dy=2) derivative by using 4 different kernels, I would like to reduce processing time by unifying all 4 kernels into 1 before applying it to the image (I assume that this is what cv::Sobel does). Is there a reasonable way to create such a combined Scharr kernel and convolve it with my image?

Thanks!

Upvotes: 2

Views: 3285

Answers (1)

alkasm

Reputation: 23032

  1. I've never read the original Scharr paper (the dissertation is in German), so I don't know why the Scharr() function doesn't allow higher-order derivatives. Maybe because of the first point I make in #3 below?

  2. The Scharr function is supposed to compute a derivative. And the total derivative of a multivariable function f(x) = f(x0, ..., xN) is

    df = (df/dx0)*dx0 + ... + (df/dxN)*dxN
    

    That is, the sum of the partials, each multiplied by its change. In the case of images, the change dx in the input is a single pixel, so it's equivalent to 1. In other words, just sum the partials without weighting them by half. You can use addWeighted() with 1s as the weights, or just add the two results directly, but to make sure you don't saturate your image you'll need to convert it to a float or 16-bit image first. It's also common to compute the Euclidean magnitude of the two derivatives instead, if what you actually want is the gradient magnitude rather than the total derivative.

    However, that's just for the first-order derivative. Higher orders introduce cross terms (think of expanding (d/dx + d/dy)^2), so a plain sum is no longer enough. See here for the details of combining second-order derivatives.

  3. Note that a kernel optimized for first-order derivatives is not necessarily the optimal kernel for second-order derivatives when applied twice. Scharr himself has a paper on optimizing second-order derivative kernels; you can read it here.

    With that said, filters are split into x and y directions to make linearly separable filters, which basically turn your 2D convolution problem into two 1D convolutions with smaller kernels. Think of the Sobel and Scharr kernels: for the x direction, they both just have a single column on each side with the same values (except one side is negated). When you slide the kernel across the image, at the first location you're multiplying the first and third columns by the values in your kernel. Two steps later, you're multiplying the third and fifth columns. But the third column's products were already computed, so recomputing them is wasteful. Instead, since both sides use the same values, just multiply each column by the vector once, then look up the results for columns 1 and 3 and subtract them.

    In short, I don't think you can combine the two passes with the built-in separable-filter functions, because the sign pattern differs between the two kernels, and the only way to apply them as separable filters is to do them separately. However, we can examine the result of applying both filters to a single pixel, construct the equivalent 2D kernel, and then convolve with OpenCV.

Suppose we have a 3x3 image:

image
=====
a b c 
d e f 
g h i

And we have the Scharr kernels:

kernel_x
========
-3    0   3
-10   0   10
-3    0   3

kernel_y
========
-3   -10  -3
 0    0    0
 3    10   3

The result of applying each kernel to this image gives us:

image * kernel_x
================
-3a   +0b  +3c 
-10d  +0e  +10f 
-3g   +0h  +3i

image * kernel_y
================
-3a  -10b  -3c 
+0d  +0e   +0f
+3g  +10h  +3i

Each kernel's values are summed and placed into pixel e. Since the total derivative is the sum of the two partials, we add all of these values together into pixel e:

image * kernel_x + image * kernel_y
===================================
-3a      +3c -10d +10f -3g      +3i
-3a -10b -3c           +3g +10h +3i
+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+
-6a -10b +0c -10d +10f +0g +10h +6i

And this is the same result we'd have gotten if we multiplied by the kernel

kernel_xy
=============
-6   -10   0
-10   0    10
 0    10   6

So there's a single 2D kernel that computes the sum of the two first-order derivatives in one pass. Notice anything interesting? It's just the sum of the two kernels. Is that surprising? Not really, since filtering is linear: x(a+b) = ax + bx. Now we can pass that into filter2D() to compute the sum of the derivatives. Does that actually give the same result?

import cv2
import numpy as np

# Read as grayscale and convert to float so the sums can't saturate.
img = cv2.imread('cameraman.png', 0).astype(np.float32)

# kernel_x + kernel_y from above.
kernel = np.array([[-6, -10, 0],
                   [-10, 0, 10],
                   [0, 10, 6]])

# One pass with the combined kernel...
total_first_derivative = cv2.filter2D(img, -1, kernel)

# ...versus one Scharr pass per axis, then summed.
scharr_x = cv2.Scharr(img, -1, 1, 0)
scharr_y = cv2.Scharr(img, -1, 0, 1)

print((total_first_derivative == (scharr_x + scharr_y)).all())

True

Yep. Now I guess you can just do it twice.

Upvotes: 5
