Partial errors when computing an affine transform with OpenCV

Problem description

I've been using OpenPnP recently (which is fantastic!), and my interest has been piqued enough to try and really understand how it works.

I'm trying to do a basic transform of a PCB with 3 fiducials and 3 features, in C# using OpenCvSharp.

I've modelled the geometry in some CAD software as a sanity check; this is my "PCB".

F1-F3 are the 3 fiducials, where F1 is the 0,0 of the PCB coordinate system. P1-P3 are the 3 "features" on the PCB that I'm interested in.

[Image: CAD sketch of the "PCB" showing fiducials F1-F3 and features P1-P3]

Here is the "PCB" overlaid on the "machine", where the green dot is the machine's 0,0. So from this I already know, essentially, where the 3 features sit relative to machine 0,0.

[Image: the "PCB" overlaid on the machine, with the green dot marking machine 0,0]

I figured I needed an "affine transform", and I've spent a few hours putting this together.
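For background: a 2D affine transform maps a point (x, y) to (a*x + b*y + tx, c*x + d*y + ty), i.e. a 2x3 matrix with six unknowns. Three non-collinear point pairs pin those six numbers down exactly, which is why the three fiducials are all that Cv2.GetAffineTransform needs.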

using OpenCvSharp;

namespace affinetransform
{
    class Program
    {
        static void Main(string[] args)
        {
            // Define fiducials in PCB coordinates
            Point2f[] pcbFiducials = new Point2f[]
            {
                new Point2f(0, 0),
                new Point2f(100, 0),
                new Point2f(100, -80)
            };

            // Corresponding fiducial points as measured in machine coordinates
            Point2f[] cameraFiducials = new Point2f[]
            {
                new Point2f(190.62f, -83.7f),
                new Point2f(290.24f, -74.99f),
                new Point2f(297.21f, -154.68f)
            };

            // Compute the affine transformation matrix
            Mat affineTransform = Cv2.GetAffineTransform(InputArray.Create(pcbFiducials), InputArray.Create(cameraFiducials));

            // Define the 3 'features' on the PCB we're interested in, in PCB coordinates
            Point2f[] pcbFeatures = new Point2f[]
            {
                new Point2f(24, -18),
                new Point2f(35, -62),
                new Point2f(74, -28)
            };

            // Convert feature points to a Mat object, because we need to pass Mat type to the Transform method
            Mat pcbFeaturesMat = new Mat(pcbFeatures.Length, // Rows
                                            1,                  // Columns
                                            MatType.CV_32FC2,
                                            pcbFeatures);       // The features on the PCB we want to transform
            for (int i = 0; i < pcbFeaturesMat.Rows; i++)
            {
                for (int j = 0; j < pcbFeaturesMat.Cols; j++)
                {
                    Console.Write($"{pcbFeaturesMat.At<float>(i, j)}\t");
                }
                Console.WriteLine();
            }


            // Transform feature points to machine/camera coordinates
            Mat cameraFeaturesMat = new Mat();
            Cv2.Transform(pcbFeaturesMat, cameraFeaturesMat, affineTransform);

            // Save the transformed points to a CSV file
            SavePointsToCsv(cameraFeaturesMat, @"C:\users\user\desktop\cameraFeatures.csv");
        }

        static void SavePointsToCsv(Mat points, string filename)
        {
            using (var writer = new System.IO.StreamWriter(filename))
            {
                writer.WriteLine("X,Y");
                for (int i = 0; i < points.Rows; i++)
                {
                    float x = points.At<float>(i, 0);
                    float y = points.At<float>(i, 1);
                    writer.WriteLine($"{x},{y}");
                }
            }
            Console.WriteLine($"Saved transformed points to {filename}");
        }
    }
}
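One detail worth noting about the code above: Cv2.Transform applies the 2x3 matrix to every element of a multi-channel array, treating each 2-channel element as an (x, y) point, which is why the features are packed into a single-column CV_32FC2 Mat before the call.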

Here is the transformed data it outputs, and it isn't "entirely" wrong. As you can see from my images, 216.09, 230.89 and 266.78 are the 3 correctly transformed X values in this table, but they end up in the wrong positions: the remaining 3 values are just those X values duplicated into the wrong column, plus one garbage number.

X,Y
216.09705,230.88875
230.88875,266.7783
266.7783,1.11E-43

The affine transformation matrix that this code computes is:

Transformation Matrix:
0.99619995117187       -0.08712501525878906    190.6199951171875
0.08709999084472657     0.9961249351501464     -83.69999694824219
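One quick way to sanity-check that matrix is to apply it to a feature point by hand. The ApplyAffine helper below is a throwaway sketch (it is not part of the program above); it just multiplies out the 2x3 matrix using OpenCvSharp's At<double> indexer. Fed P1 = (24, -18) with the values printed above, it comes out at roughly (216.10, -99.54), so the matrix itself looks right.

// Throwaway sanity check, not part of the original program:
// apply a 2x3 affine matrix [a b tx; c d ty] to a single point.
static Point2f ApplyAffine(Mat m, Point2f p)
{
    double x = m.At<double>(0, 0) * p.X + m.At<double>(0, 1) * p.Y + m.At<double>(0, 2);
    double y = m.At<double>(1, 0) * p.X + m.At<double>(1, 1) * p.Y + m.At<double>(1, 2);
    return new Point2f((float)x, (float)y);
}

// e.g. ApplyAffine(affineTransform, new Point2f(24, -18)) comes out near (216.10, -99.54)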

I'm completely stumped, and hoping some clever OpenCV folks can point me in the right direction!

c# opencv computer-vision affinetransform
1 Answer

After computing the transform, I now convert the result back to a Point2f[], which makes it much easier to work with, and then print it out.

using OpenCvSharp;
using System;

namespace AffineTransform
{
    class Program
    {
        static void Main(string[] args)
        {
            // Define the locations of the fiducials on the PCB in PCB coordinates
            Point2f[] pcbFiducials = new Point2f[]
            {
                new Point2f(0, 0),
                new Point2f(100, 0),
                new Point2f(100, -80)
            };

            // Define the same fiducial points as measured in machine coordinates, i.e. the position the PCB sits in the machine
            Point2f[] cameraFiducials = new Point2f[]
            {
                new Point2f(190.62f, -83.7f),
                new Point2f(290.24f, -74.99f),
                new Point2f(297.21f, -154.68f)
            };

            // Compute the affine transformation matrix
            Mat affineTransform = Cv2.GetAffineTransform(pcbFiducials, cameraFiducials);

            // Print the affine transformation matrix to check its correctness
            Console.WriteLine("Affine Transformation Matrix:");
            for (int i = 0; i < affineTransform.Rows; i++)
            {
                for (int j = 0; j < affineTransform.Cols; j++)
                {
                    Console.Write($"{affineTransform.At<double>(i, j)}\t");
                }
                Console.WriteLine();
            }

            // Define the 3 'features' on the PCB we're interested in transforming to machine coordinates, in PCB coordinates
            Point2f[] pcbFeatures = new Point2f[]
            {
                new Point2f(24, -18),
                new Point2f(35, -62),
                new Point2f(74, -28)
            };

            // Convert feature points to a Mat object, as Mat is needed to pass to Transform() later
            Mat pcbFeaturesMat = new Mat(pcbFeatures.Length, 1, MatType.CV_32FC2, pcbFeatures);

            // Transform feature points to machine/camera coordinates, same reason as above
            Mat cameraFeaturesMat = new Mat();

            // Compute the actual transform
            Cv2.Transform(pcbFeaturesMat, cameraFeaturesMat, affineTransform);

            // Convert the result back to Point2f[], to make them much easier to actually use
            Point2f[] cameraFeatures = new Point2f[cameraFeaturesMat.Rows];
            for (int i = 0; i < cameraFeaturesMat.Rows; i++)
            {
                cameraFeatures[i] = new Point2f(cameraFeaturesMat.At<Point2f>(i).X, cameraFeaturesMat.At<Point2f>(i).Y);
            }

            // Print the transformed points
            Console.WriteLine("Transformed Points:");
            foreach (var point in cameraFeatures)
            {
                Console.WriteLine($"X = {point.X}, Y = {point.Y}");
            }

            // Save the transformed points to a CSV file
            SavePointsToCsv(cameraFeatures, @"C:\users\user\desktop\cameraFeatures.csv");
        }

        static void SavePointsToCsv(Point2f[] points, string filename)
        {
            using (var writer = new System.IO.StreamWriter(filename))
            {
                writer.WriteLine("X,Y");
                foreach (var point in points)
                {
                    writer.WriteLine($"{point.X},{point.Y}");
                }
            }
            Console.WriteLine($"Saved transformed points to {filename}");
        }
    }
}
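As far as I can tell from the code and the output above, the shifted CSV in the question came from how the result Mat was being read: cameraFeaturesMat is 3 rows by 1 column of 2-channel CV_32FC2 data, so each row holds one (X, Y) pair. Reading it with At<float>(i, 0) and At<float>(i, 1) treats the second argument as a column index; with only one column, the indexer appears to step a whole 8-byte element forward, landing on the X of the next point (and on garbage past the end for the last row), which is exactly the pattern in the bad CSV. At<Point2f>(i) reads both channels of row i together, so each CSV row now holds a matching X,Y pair. A minimal illustration (this snippet is not part of the answer's program):

// Hypothetical illustration of the readout difference on the 3x1 CV_32FC2 result Mat:
Point2f p0 = cameraFeaturesMat.At<Point2f>(0);   // both channels of row 0: the correct X and Y
float bad  = cameraFeaturesMat.At<float>(0, 1);  // steps one whole element: row 1's X, not row 0's Y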
