Distorted / curved point clouds

I am working on a sparse 3D reconstruction using a calibrated stereo pair. This is the approach I took, step by step:

1. Calibrate the stereo cameras using the Stereo Camera Calibrator app in MATLAB.

2. Take a pair of stereo images and undistort each image.

3. Detect, extract, and match feature points.

4. Use MATLAB's triangulate function to obtain the 3D coordinates of the matched points, passing it the stereoParameters object. The resulting 3D coordinates are relative to the optical center of camera 1 (the right camera) and are in millimeters.

The problem is that the point cloud appears distorted and curved towards the edges of the image. At first it looked like barrel distortion from the lens, so I recalibrated the Bumblebee XB3 cameras with the MATLAB Camera Calibrator app, this time using three radial distortion coefficients and also estimating the tangential and skew parameters, but the results are the same. I also tried Caltech's camera calibration toolbox and got the same results as with MATLAB; the radial distortion coefficients from the two toolboxes are similar. Another problem is that the Z values in the point cloud are all negative. I think this may be because I use the right camera as camera 1 and the left camera as camera 2, which is the opposite of MATLAB's coordinate system (see the link below).
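To test whether the camera ordering explains the negative Z values, I was planning a quick check along these lines (just a sketch that reuses the variables from the code below; a proper test would also swap the camera parameters used for undistortion):

% Sketch: if the negative depths come from passing the cameras to
% triangulate in the opposite order to the calibration, swapping the two
% point sets should give mostly positive Z values.
[CloudSwapped, ~] = triangulate(matchedPoints2, matchedPoints1, stereoParams);
fprintf('Median Z, current order: %.1f mm\n', median(Cloud(:,3)));
fprintf('Median Z, swapped order: %.1f mm\n', median(CloudSwapped(:,3)));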

I've attached a couple of 3D point cloud images from a sparse and a dense 3D reconstruction. I do not actually need the dense reconstruction; I only ran it to see whether the problem shows up there as well. I believe the main problem lies with the images and the camera calibration, not with the algorithms.

Now my questions are:

1. What is the main cause of the deformed/curved 3D point cloud? Is it only the camera calibration, or can other steps introduce this error? How can I check this? (A sketch of the kind of check I have in mind follows these questions.)

2. Can you suggest camera calibration toolboxes other than MATLAB's and Caltech's, perhaps one that handles radial distortion better?
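Regarding question 1, this is roughly the check I had in mind (a sketch using the stereoParams I already have):

% Rough calibration sanity checks (sketch):
load('mystereoparams.mat');

% Mean reprojection error per calibration image; a few images with much
% larger errors than the rest usually indicate a calibration problem.
figure, showReprojectionErrors(stereoParams);

% Positions of the checkerboard relative to the cameras; this shows whether
% the board ever reached the corners of the field of view.
figure, showExtrinsics(stereoParams);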

Thanks.

Images: [point clouds from the sparse and dense reconstructions]

Links:

coordinate system

Code:

clear
close all
clc

load('mystereoparams.mat');
I11 = imread('Right.tif');
I22 = imread('Left.tif');
figure, imshowpair(I11, I22, 'montage');
title('Pair of Original Images');

[I1, newOrigin1] = undistortImage(I11,stereoParams.CameraParameters1);
[I2, newOrigin2] = undistortImage(I22,stereoParams.CameraParameters2);
figure, imshowpair(I1, I2, 'montage');
title('Undistorted Images');

% Detect feature points
imagePoints1 = detectSURFFeatures(rgb2gray(I1), 'MetricThreshold', 600);
imagePoints2 = detectSURFFeatures(rgb2gray(I2), 'MetricThreshold', 600);

% Extract feature descriptors
features1 = extractFeatures(rgb2gray(I1), imagePoints1);
features2 = extractFeatures(rgb2gray(I2), imagePoints2);

% Visualize several extracted SURF features
figure;
imshow(I1);
title('1500 Strongest Feature Points from Image1');
hold on;
plot(selectStrongest(imagePoints1, 1500));

indexPairs = matchFeatures(features1, features2, 'MaxRatio', 0.4);
matchedPoints1 = imagePoints1(indexPairs(:, 1));
matchedPoints2 = imagePoints2(indexPairs(:, 2));

% Visualize correspondences
figure;
showMatchedFeatures(I1, I2, matchedPoints1, matchedPoints2,'montage');
title('Original Matched Features from Globe01 and Globe02');

% Transform matched points to the original image coordinates
matchedPoints1.Location = bsxfun(@plus, matchedPoints1.Location, newOrigin1);
matchedPoints2.Location = bsxfun(@plus, matchedPoints2.Location, newOrigin2);

[Cloud, reprojErrors] = triangulate(matchedPoints1, matchedPoints2, stereoParams);
figure;plot3(Cloud(:,1),Cloud(:,2),Cloud(:,3),'b.');title('Point Cloud before noisy match removal');
xlabel('X'), ylabel('Y'), zlabel('Depth (Z) in mm')

% Eliminate noisy points
% Reprojection error magnitude of each matched point
errorDists = sqrt(sum(reprojErrors .^ 2, 2));
meanError = mean(errorDists)
stdError = std(errorDists)

% Keep points whose reprojection error is within one standard deviation of
% the mean and whose depth is in the expected range (|Z| between 1000 mm
% and 1800 mm, on the negative-Z side given the current camera ordering)
validIdx = errorDists < meanError + stdError;
validIdx(Cloud(:,3) > 0) = false;
validIdx(abs(Cloud(:,3)) > 1800) = false;
validIdx(abs(Cloud(:,3)) < 1000) = false;

points3D = Cloud(validIdx, :);

figure;plot3(points3D(:,1),points3D(:,2),points3D(:,3),'b.');title('Point Cloud after noisy match removal');
xlabel('X'), ylabel('Y'), zlabel('Depth (Z) in mm')

validPoints1 = matchedPoints1(validIdx, :);
validPoints2 = matchedPoints2(validIdx, :);

figure;
showMatchedFeatures(I1, I2, validPoints1,validPoints2,'montage');
title('Matched Features After Removing Noisy Matches');

% get the color of each reconstructed point
validPoints1 = round(validPoints1.Location);
numPixels = size(I1, 1) * size(I1, 2);
allColors = reshape(im2double(I1), [numPixels, 3]);
colorIdx = sub2ind([size(I1, 1), size(I1, 2)], validPoints1(:,2), ...
    validPoints1(:, 1));
color = allColors(colorIdx, :);

% add green point representing the origin
points3D(end+1,:) = [0,0,0];
color(end+1,:) = [0,1,0];

% show images
figure('units','normalized','outerposition',[0 0 .5 .5])
subplot(1,2,1);
imshowpair(I1, I2, 'montage');
title('Original Images')

% plot point cloud
hAxes = subplot(1,2,2);
showPointCloud(points3D, color, 'Parent', hAxes, ...
    'VerticalAxisDir', 'down', 'MarkerSize', 40);
xlabel('x-axis (mm)');
ylabel('y-axis (mm)');
zlabel('z-axis (mm)')
title('Reconstructed Point Cloud');

figure, scatter3(points3D(:,1),points3D(:,2),points3D(:,3),50,color,'filled')
xlabel('x-axis (mm)');ylabel('y-axis (mm)');zlabel('z-axis (mm)')
title('Final colored Reconstructed Point Cloud');

      

1 answer


Your code looks correct. The problem seems to be with the calibration. The fact that you still get a curved point cloud even with three radial distortion coefficients tells me that you may not have enough data points close to the edges of the image to estimate the distortion accurately. It is hard to tell from your images, though. If you take a picture of a scene with many straight edges and undistort it, you will get a better idea.

So I would recommend taking more checkerboard images, with the board as close to the edges of the image frame as possible, and seeing if that helps.
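The straight-edge check could look roughly like this (a sketch; 'straight_edges.tif' is a placeholder for a photo of a scene with long straight lines, and stereoParams is your loaded calibration):

% Sketch: undistort a photo of a scene with long straight edges (door
% frames, building facades) and check visually that the edges come out
% straight after undistortion.
I = imread('straight_edges.tif');
J = undistortImage(I, stereoParams.CameraParameters1, 'OutputView', 'full');
figure, imshowpair(I, J, 'montage');
title('Original vs. undistorted: lines should come out straight');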



Another thing to look at is the estimation errors. As of R2014b, the Stereo Camera Calibrator app can optionally return the standard error of each estimated parameter. That gives you confidence intervals and tells you whether you need more data. See this example.
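Programmatically, the standard errors are the third output of estimateCameraParameters; here is a sketch (assuming imagePoints and worldPoints come from detectCheckerboardPoints and generateCheckerboardPoints on your calibration images):

% Sketch: estimate stereo parameters and display the standard error of each
% estimated parameter (requires R2014b or later).
[stereoParams, pairsUsed, estimationErrors] = ...
    estimateCameraParameters(imagePoints, worldPoints);
displayErrors(estimationErrors, stereoParams);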

Oh, and also make sure your calibration images are not saved as JPEG. Use a lossless format like TIFF or PNG.
