hajo

Reputation: 344

I only get wrong distances / depth from my disparity map (OpenCV SGBM)

I used the SGBM algorithm to create a disparity image and it gives me a beautiful image. Here is my code:

import numpy as np
import cv2
# load unrectified images
unimgR =cv2.imread("R.jpg")
unimgL =cv2.imread("L.jpg")
# load calibration from calibration file
calibration = np.load(r"C:\Users\XXX\PycharmProjects\rectify\Test3_OpenCV_Rectified.npz", allow_pickle=False)
imageSize = tuple(calibration["imageSize"])
leftMatrix = calibration["leftMatrix"]
leftDist = calibration["leftDist"]
leftMapX = calibration["leftMapX"]
leftMapY = calibration["leftMapY"]
leftROI = tuple(calibration["leftROI"])
rightMatrix = calibration["rightMatrix"]
rightDist = calibration["rightDist"]
rightMapX = calibration["rightMapX"]
rightMapY = calibration["rightMapY"]
rightROI = tuple(calibration["rightROI"])
disparityToDepthMap = calibration["disparityToDepthMap"]
# Rectify images (including monocular undistortion)
imgL = cv2.remap(unimgL, leftMapX, leftMapY, cv2.INTER_LINEAR)
imgR = cv2.remap(unimgR, rightMapX, rightMapY, cv2.INTER_LINEAR)
# SGBM Parameters 
window_size = 15  # wsize default 3; 5; 7 for SGBM reduced size image; 15 for SGBM full size image (1300px and above); 5 Works nicely
left_matcher = cv2.StereoSGBM_create(
    minDisparity=0,
    numDisparities=160,  # max_disp has to be dividable by 16 f. E. HH 192, 256
    blockSize=5,
    P1=8 * 3 * window_size ** 2,   # smoothness penalty for disparity changes of 1 between neighbouring pixels
    P2=32 * 3 * window_size ** 2,  # larger penalty for bigger disparity jumps; must be > P1
    disp12MaxDiff=1,
    uniquenessRatio=15,
    speckleWindowSize=0,
    speckleRange=2,
    preFilterCap=63,
    mode=cv2.STEREO_SGBM_MODE_SGBM_3WAY
)
right_matcher = cv2.ximgproc.createRightMatcher(left_matcher)
# FILTER Parameters
lmbda = 80000
sigma = 1.2
visual_multiplier = 1.0
# Weighted least squares filter to fill sparse (unpopulated) areas of the disparity map
# by aligning to image edges and propagating disparity values from high- to low-confidence regions
wls_filter = cv2.ximgproc.createDisparityWLSFilter(matcher_left=left_matcher)
wls_filter.setLambda(lmbda)
wls_filter.setSigmaColor(sigma)
# Get depth information/disparity map using SGBM
displ = left_matcher.compute(imgL, imgR)  # .astype(np.float32)/16
dispr = right_matcher.compute(imgR, imgL)  # .astype(np.float32)/16
displ = np.int16(displ)
dispr = np.int16(dispr)
filteredImg = wls_filter.filter(displ, imgL, None, dispr)  # important to put "imgL" here!!!
filteredImg = cv2.normalize(src=filteredImg, dst=filteredImg, alpha=0, beta=255, norm_type=cv2.NORM_MINMAX)
filteredImg = np.uint8(filteredImg)
print("Distance:", 0.12*0.006/displ[1000][500]) #depth= Baseline * focal-lens / disparity
cv2.imshow('Disparity Map', filteredImg)
cv2.waitKey()
cv2.destroyAllWindows()

I use the formula distance = baseline * focal length / disparity.

My baseline is 12 cm and my focal length is 6 mm.

The point X,Y = 1000,550 should be at a distance of about 10 m, but the formula gives me 1.5550755939524837e-06.
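
For reference, reversing that division shows the raw value my code saw at that pixel (just a quick sanity check on my own numbers):

raw = 0.12 * 0.006 / 1.5550755939524837e-06
print(raw)  # ~463, so displ[1000][500] holds a value of about 463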

I don't understand why this is happening. Here is the image.

Upvotes: 1

Views: 1117

Answers (1)

Madhu Soodhan

Reputation: 180

The disparity image seems to be correct. But for the depth/distance calculation, you should not hard-code the baseline and focal length. You should rather take them from the calibration matrices; the Q matrix contains the baseline. This is mainly because the units of distance (cm/mm/m) are fixed during the calibration process and later stored in the calibration matrices.
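
For example, a rough sketch of reading them from Q instead of hard-coding them (this assumes your disparityToDepthMap is the standard 4x4 Q matrix returned by cv2.stereoRectify, and remembers that SGBM returns fixed-point disparities scaled by 16):

Q = calibration["disparityToDepthMap"]   # 4x4 reprojection matrix from cv2.stereoRectify
focal_px = Q[2, 3]                       # focal length in pixels
baseline = 1.0 / abs(Q[3, 2])            # baseline, in the units you used during calibration
disparity_px = displ[1000][500] / 16.0   # undo the SGBM fixed-point scaling
print("Distance:", focal_px * baseline / disparity_px)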

So I advise you to take the values from the Q matrix.
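
Alternatively, you can let OpenCV do the whole reprojection for you; a minimal sketch, again assuming disparityToDepthMap is that Q matrix:

disparity = displ.astype(np.float32) / 16.0                        # undo the fixed-point scaling
points3D = cv2.reprojectImageTo3D(disparity, disparityToDepthMap)  # per-pixel (X, Y, Z)
depth = points3D[:, :, 2]                                          # Z channel, in calibration units
print("Distance:", depth[1000, 500])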

Upvotes: 1
