MATLAB: 3D Reconstruction Using an Eight-Point Algorithm

I am trying to do a 3D reconstruction from 2 images. The steps I followed are:

1. Found corresponding points between the 2 images using SURF.
2. Implemented the eight-point algorithm to find the fundamental matrix.
3. Then, I implemented triangulation.

So far, I have the fundamental matrix and the triangulation results. How do I go further to get a 3D reconstruction? I am confused by all the material available on the internet.

Also, this is the code. Let me know if this is correct or not.

Ia=imread('1.jpg');
Ib=imread('2.jpg');
Ia=rgb2gray(Ia);
Ib=rgb2gray(Ib);

% Collect interest points from each image using SURF
blobs1 = detectSURFFeatures(Ia);
blobs2 = detectSURFFeatures(Ib);
figure;
imshow(Ia);
hold on;
plot(selectStrongest(blobs1, 36));
title('Thirty-six strongest SURF features in Ia');
figure;
imshow(Ib);
hold on;
plot(selectStrongest(blobs2, 36));
title('Thirty-six strongest SURF features in Ib');

% Extract descriptors and match them
[features1, validBlobs1] = extractFeatures(Ia, blobs1);
[features2, validBlobs2] = extractFeatures(Ib, blobs2);
indexPairs = matchFeatures(features1, features2);
matchedPoints1 = validBlobs1(indexPairs(:,1),:);
matchedPoints2 = validBlobs2(indexPairs(:,2),:);
figure;
showMatchedFeatures(Ia, Ib, matchedPoints1, matchedPoints2);
legend('Putatively matched points in Ia', 'Putatively matched points in Ib');

% Split the matched locations into x/y columns. Note: the original loop
% indexed matchedPoints1.Location(i), which is linear indexing and grabs
% the wrong element; index the column explicitly (or vectorize, as here).
xa = matchedPoints1.Location(:,1);
ya = matchedPoints1.Location(:,2);
xb = matchedPoints2.Location(:,1);
yb = matchedPoints2.Location(:,2);

matchedPoints1.Count
figure(1); clf;
imshow(cat(2, Ia, Ib));
axis image off;
hold on;
xbb = xb + size(Ia,2);          % shift image-2 x coordinates for the montage
idx = 1:matchedPoints1.Count;   % renamed from 'set', which shadows a built-in
h = line([xa(idx)'; xbb(idx)'], [ya(idx)'; yb(idx)']);

pts1 = [xa, ya];
pts2 = [xb, yb];
pts11 = [pts1, ones(size(pts1,1),1)]';   % 3xN homogeneous coordinates
pts22 = [pts2, ones(size(pts2,1),1)]';

width  = size(Ia,2);
height = size(Ib,1);
F = eightpoint(pts1, pts2, width, height);

[P1new, P2new] = compute2Pmatrix(F);
XP = triangulate(pts11, pts22, P2new);

eightpoint()

function [ F ] = eightpoint( pts1, pts2,width,height)

X = 1:width;
Y = 1:height;
[X, Y] = meshgrid(X, Y);
x0 = [mean(X(:)); mean(Y(:))];
X = X - x0(1);
Y = Y - x0(2);
denom = sqrt(mean(mean(X.^2+Y.^2)));
N = size(pts1, 1);

% Normalize the data (note: T is built from the image grid here rather
% than from the point sets themselves -- see the first answer below)
T = sqrt(2)/denom*[1 0 -x0(1); 0 1 -x0(2); 0 0 denom/sqrt(2)];
norm_x = T*[pts1(:,1)'; pts1(:,2)'; ones(1, N)];
norm_x_ = T*[pts2(:,1)';pts2(:,2)'; ones(1, N)];
x1 = norm_x(1, :)';
y1= norm_x(2, :)';
x2 = norm_x_(1, :)';
y2 = norm_x_(2, :)';

A = [x1.*x2, y1.*x2, x2, ...
       x1.*y2, y1.*y2, y2, ...
       x1,       y1,     ones(N,1)];

% compute the SVD
[~, ~, V] = svd(A);
F = reshape(V(:,9), 3, 3)';
[FU, FS, FV] = svd(F);
FS(3,3) = 0; % enforce the rank-2 constraint
F = FU*FS*FV';

% denormalize the fundamental matrix
F = T' * F * T;

end

triangulate()

function [ XP ] = triangulate( pts1, pts2, P2 )
% Linear (DLT) triangulation. pts1 and pts2 are 3xN homogeneous image
% points; camera 1 is assumed to be P1 = [I | 0], camera 2 is P2.

n = size(pts1, 2);
X = zeros(4, n);
for i = 1:n
    % Each view contributes rows of the form x*P(3,:) - P(1,:); for
    % P1 = [I | 0] the first two rows reduce to the constants below.
    A = [-1, 0, pts1(1,i), 0;
          0,-1, pts1(2,i), 0;
          pts2(1,i)*P2(3,:) - P2(1,:);
          pts2(2,i)*P2(3,:) - P2(2,:)];
    [~,~,va] = svd(A);
    X(:,i) = va(:,4);   % null vector of A = homogeneous 3D point
end
% homogeneous division (beware of points at infinity, where X(4,:) ~ 0)
XP = bsxfun(@rdivide, X, X(4,:));

end

function [ P1,P2 ] = compute2Pmatrix( F )
% Canonical pair: P1 = [I | 0], P2 = [ [e']_x F | e' ], where e' is the
% epipole in the second image (the left null vector of F).
P1 = [1,0,0,0; 0,1,0,0; 0,0,1,0];
[~, ~, V] = svd(F');
ep = V(:,3) / V(3,3);
P2 = [skew(ep)*F, ep];
end

function S = skew(v)
% cross-product matrix; this helper was missing from the original listing
S = [0, -v(3), v(2); v(3), 0, -v(1); -v(2), v(1), 0];
end

3 answers

At a quick glance, it looks right. A few notes:

Your normalization inside eightpoint() is not ideal.

It is best done on the points themselves. Each set of points should get its own scaling matrix, i.e.:

[pts1_n, T1] = normalize_pts(pts1);
[pts2_n, T2] = normalize_pts(pts2);

% ... code
% solution
F = T2' * F * T1;
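
For reference, a minimal sketch of such a routine using Hartley's scheme (translate each point set to its centroid, then scale so the mean distance from the origin is sqrt(2)); the body is an assumption, only the name comes from the pseudocode above:

function [pts_n, T] = normalize_pts(pts)
% pts is N-by-2; returns normalized N-by-2 points and the 3x3 transform T
c = mean(pts, 1);                              % centroid of the point set
d = sqrt(sum(bsxfun(@minus, pts, c).^2, 2));   % distance of each point from c
s = sqrt(2) / mean(d);                         % scale: mean distance -> sqrt(2)
T = [s, 0, -s*c(1); 0, s, -s*c(2); 0, 0, 1];
ph = T * [pts, ones(size(pts,1),1)]';          % apply in homogeneous coordinates
pts_n = ph(1:2,:)';
end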


As a side note (for efficiency), you should do

[~,~,V] = svd(A, 0);


You also want to enforce the constraint that the fundamental matrix is rank-2. After computing F, you can do:

[U,D,V] = svd(F);
F = U * diag([D(1,1), D(2,2), 0]) * V';

In any case, normalization alone is not enough to make the algorithm work on real data. You will want to wrap the fundamental matrix estimation in a robust estimation framework such as RANSAC.

Estimation problems like this are very sensitive to non-Gaussian noise and outliers. Even a small number of incorrect correspondences, or a few points with large error, will break the algorithm.
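
A minimal sketch of such a wrapper around your eightpoint() (the iteration count and threshold are arbitrary, and the Sampson-error helper is my own; a production version would adapt the number of iterations to the inlier ratio):

function [bestF, bestIn] = ransac_eightpoint(pts1, pts2, width, height)
nIter  = 2000;
thresh = 1;                 % Sampson distance threshold in pixels^2 (tune)
N      = size(pts1, 1);
bestF  = [];
bestIn = false(N, 1);
for it = 1:nIter
    s = randperm(N, 8);     % minimal sample for the eight-point algorithm
    F = eightpoint(pts1(s,:), pts2(s,:), width, height);
    in = sampson_err(F, pts1, pts2) < thresh;
    if nnz(in) > nnz(bestIn)
        bestIn = in;
        bestF  = F;
    end
end
% refit on all inliers of the best model
bestF = eightpoint(pts1(bestIn,:), pts2(bestIn,:), width, height);
end

function d = sampson_err(F, pts1, pts2)
% first-order (Sampson) approximation of the geometric epipolar error
N    = size(pts1, 1);
x1   = [pts1, ones(N,1)]';
x2   = [pts2, ones(N,1)]';
Fx1  = F * x1;
Ftx2 = F' * x2;
num  = sum(x2 .* Fx1, 1).^2;
den  = Fx1(1,:).^2 + Fx1(2,:).^2 + Ftx2(1,:).^2 + Ftx2(2,:).^2;
d    = (num ./ den)';
end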

Finally, in triangulate() you want to make sure that the points are not at infinity before doing the homogeneous division.
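
For example, the last line of triangulate() could guard the division like this (the tolerance is an arbitrary choice):

% skip the homogeneous division when the 4th coordinate is numerically zero
w = X(4,:);
finite = abs(w) > 1e-12 * max(abs(X), [], 1);   % relative tolerance (assumption)
XP = nan(4, size(X,2));
XP(:,finite) = bsxfun(@rdivide, X(:,finite), w(finite));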

I would recommend testing the code with synthetic data. That is, generate your own camera matrices and correspondences, and feed them to the estimation routine at different noise levels. With zero noise you should get an exact solution down to floating-point precision; as the noise increases, the estimation error grows.
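
A sketch of such a test (every value here is made up for illustration):

K  = [800 0 320; 0 800 240; 0 0 1];           % assumed intrinsics
th = 0.1;                                     % small rotation about the y axis
R  = [cos(th) 0 sin(th); 0 1 0; -sin(th) 0 cos(th)];
t  = [1; 0; 0];
P1 = K * [eye(3), zeros(3,1)];
P2 = K * [R, t];

Xw = [2*rand(3,100) - 1; ones(1,100)];        % 100 random 3D points
Xw(3,:) = Xw(3,:) + 5;                        % push them in front of the cameras

x1 = P1 * Xw;  x1 = bsxfun(@rdivide, x1(1:2,:), x1(3,:));
x2 = P2 * Xw;  x2 = bsxfun(@rdivide, x2(1:2,:), x2(3,:));

sigma = 0;                                    % noise level in pixels; vary this
pts1 = (x1 + sigma*randn(size(x1)))';
pts2 = (x2 + sigma*randn(size(x2)))';
F = eightpoint(pts1, pts2, 640, 480);

% with sigma = 0, this epipolar residual should be near machine precision
res = mean(abs(sum([pts2, ones(100,1)]' .* (F * [pts1, ones(100,1)]'), 1)))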

In its current form, running this on real data is likely to fail unless you wrap the algorithm in RANSAC or some other robust estimator.

Good luck.


What version of MATLAB do you have?

The Computer Vision System Toolbox has a function called estimateFundamentalMatrix that will give you the fundamental matrix. It may give you better results than your code, because it uses RANSAC under the hood, which makes it robust to spurious matches. There is also a triangulate function, starting from R2014b.
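
A minimal usage sketch (the RANSAC parameters are arbitrary, and the calibrated-camera lines are commented out because they need a cameraParams object from an actual camera calibration):

% robust fundamental matrix estimation with the toolbox
[F, inliers] = estimateFundamentalMatrix( ...
    matchedPoints1, matchedPoints2, ...
    'Method', 'RANSAC', 'NumTrials', 2000, 'DistanceThreshold', 0.01);

inlierPoints1 = matchedPoints1(inliers, :);
inlierPoints2 = matchedPoints2(inliers, :);

% the toolbox triangulate (R2014b+) takes full camera matrices, so you
% still need intrinsics and a relative pose (see the last answer below):
% camMatrix1  = cameraMatrix(cameraParams, eye(3), [0 0 0]);
% camMatrix2  = cameraMatrix(cameraParams, R, t);
% worldPoints = triangulate(inlierPoints1, inlierPoints2, camMatrix1, camMatrix2);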



What you get is a sparse 3D reconstruction. You can plot the resulting 3D points, and you can assign each one the color of the corresponding pixel. However, for what you want you would need to fit a surface or a triangular mesh to the points, and there, unfortunately, I can't help you.
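
For example, with the XP and pts1 from your code, such a plot could look like this (a sketch; impixel returns the gray value replicated into three channels for a grayscale image):

% sample the image at each matched point to color the 3D points
cols = impixel(Ia, round(pts1(:,1)), round(pts1(:,2))) / 255;
scatter3(XP(1,:), XP(2,:), XP(3,:), 10, cols, 'filled');
axis equal; xlabel('X'); ylabel('Y'); zlabel('Z');
title('Sparse 3D reconstruction');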



If you are asking how to proceed from the fundamental matrix plus corresponding points to a dense model, then you still have a lot of work ahead of you.

The relative camera poses (R, t) can be computed from the fundamental matrix, assuming you know the intrinsic camera parameters (the translation is recovered only up to scale). To get a full dense model, there are several options: you can try an existing library (like PMVS). I would also look at OpenMVG, but I'm not sure whether it has a MATLAB interface.
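
A sketch of the pose recovery via the essential matrix, assuming known intrinsics K1 and K2 (the cheirality test at the end is only outlined):

E = K2' * F * K1;              % essential matrix from F and the intrinsics
[U, ~, V] = svd(E);
if det(U) < 0, U = -U; end     % make sure we end up with proper rotations
if det(V) < 0, V = -V; end
W  = [0 -1 0; 1 0 0; 0 0 1];
Ra = U * W  * V';              % two rotation candidates
Rb = U * W' * V';
t  = U(:,3);                   % translation direction; sign is ambiguous
% Of the four combinations {Ra,Rb} x {+t,-t}, keep the one for which a
% triangulated test point has positive depth in both cameras.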

Alternatively, you can compute dense optical flow (many implementations are available for MATLAB). Look for an epipolar-constrained variant (one that takes a fundamental matrix and constrains the solution to lie on the epipolar lines). Then you can triangulate every pixel to get a depth map.
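
A sketch of that last step, assuming you already have a dense flow field (u, v) from image 1 to image 2 and camera matrices P1 = [I | 0] and P2 from the pose recovery above:

[c, r] = meshgrid(1:size(Ia,2), 1:size(Ia,1));
pix1 = [c(:), r(:)];                  % every pixel in image 1
pix2 = [c(:) + u(:), r(:) + v(:)];    % its match in image 2 via the flow
h1 = [pix1, ones(numel(c),1)]';       % homogeneous coordinates, 3xN
h2 = [pix2, ones(numel(c),1)]';
X  = triangulate(h1, h2, P2);         % per-pixel 3D points (slow but simple)
depthMap = reshape(X(3,:), size(Ia)); % depths arranged back into an image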

Finally, you will have to play with format conversions to get from a depth map to VRML (you can look at MeshLab).

Sorry my answer is not more MATLAB-oriented.
