I want to create a 3D map from a stereo camera. To test this, I used two of the given MATLAB examples:
I merged these two scripts into the following one:
% load left and right images
leftImages = imageDatastore(fullfile(toolboxdir('vision'),'visiondata', ...
'calibration','stereo','left'));
rightImages = imageDatastore(fullfile(toolboxdir('vision'),'visiondata', ...
'calibration','stereo','right'));
% calculate image points
[imagePoints,boardSize] = ...
detectCheckerboardPoints(leftImages.Files,rightImages.Files);
% calculate world points
squareSize = 108;
worldPoints = generateCheckerboardPoints(boardSize,squareSize);
% calculate camera parameters
I = readimage(leftImages,1);
imageSize = [size(I,1),size(I,2)];
stereoParams = estimateCameraParameters(imagePoints,worldPoints, ...
'ImageSize',imageSize);
% get left and right image
frameLeftGray = imread(leftImages.Files{1});
frameRightGray = imread(rightImages.Files{1});
[frameLeftRect, frameRightRect] = ...
rectifyStereoImages(frameLeftGray, frameRightGray, stereoParams);
% get disparity map
disparityMap = disparity(frameLeftRect, frameRightRect);
figure;
imshow(disparityMap, [0, 128]);
title('Disparity Map');
colormap jet
colorbar
% reconstruct the 3-D scene
points3D = reconstructScene(disparityMap, stereoParams);
% Convert to meters and create a pointCloud object
points3D = points3D ./ 1000;
% This will fail
ptCloud = pointCloud(points3D, 'Color', frameLeftRect);
% Create a streaming point cloud viewer
player3D = pcplayer([-3, 3], [-3, 3], [0, 8], 'VerticalAxis', 'y', ...
'VerticalAxisDir', 'down');
% Visualize the point cloud
view(player3D, ptCloud);
However, upon execution I receive the following error message:

Error using pointCloud/set.Color (line 545)
'Color' must correspond to the number of input points.

Error in pointCloud (line 151)
this.Color = C;

Error in DepthEstimation (line 45)
ptCloud = pointCloud(points3D,'Color',frameLeftRect);
When trying examples 1) and 2) on their own, they each work fine. I suspect it has something to do with the image dimensions themselves, but resizing the images in any way leads to wrong camera parameters.

So is there another way to fix the error with the 'Color' parameter?

Thanks in advance.
You are using a grayscale image as the 'Color' input, so it does not match what pointCloud expects: for an M-by-N-by-3 points array, the color array must be an M-by-N-by-3 RGB image, not a single-channel M-by-N one. Create an RGB image from the grayscale image and use that instead:
% the point cloud is aligned with the left rectified image, so take the
% colors from frameLeftRect rather than frameRightRect
rgb = cat(3, frameLeftRect, frameLeftRect, frameLeftRect);
ptCloud = pointCloud(points3D, 'Color', rgb);
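As a quick way to confirm the shapes line up before constructing the point cloud, the same replication can be done with repmat and checked against points3D. This is just a sketch using the variable names from the script above:

```matlab
% Build the RGB color array by replicating the single grayscale channel
% three times (equivalent to cat(3, g, g, g)).
rgb = repmat(frameLeftRect, [1, 1, 3]);

% pointCloud requires the color array to match the points array in its
% first two dimensions and have three channels; verify this up front.
assert(isequal(size(rgb), size(points3D)), ...
    'Color array must be M-by-N-by-3, matching points3D');

ptCloud = pointCloud(points3D, 'Color', rgb);
```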