python – Why does OpenCV think I am using a non-planar calibration rig?

I am experimenting with a simulated camera, trying to understand how OpenCV's calib3d module works.

I created a set of artificial object points in 3D space, corresponding to a planar grid of nine points at z = 50:

import numpy as np

# 3x3 planar grid at z = 50
obj_pts = np.array([[40, 40, 50],
                    [50, 40, 50],
                    [60, 40, 50],
                    [40, 50, 50],
                    [50, 50, 50],
                    [60, 50, 50],
                    [40, 60, 50],
                    [50, 60, 50],
                    [60, 60, 50]], dtype='float32')

I then imaged these points with cv2.projectPoints() after setting up an artificial camera:

import cv2

rvec = np.zeros(3, dtype='float32')        # rotation relative to the camera frame
tvec = np.zeros(3, dtype='float32')        # translation relative to the camera frame
distCoeffs = np.zeros(4, dtype='float32')  # no lens distortion
cameraMatrix = np.zeros((3, 3))
focalLength = 50
cx = 0
cy = 0
setupCameraMatrix(cameraMatrix, focalLength, cx, cy)  # my own routine

img_pts, jacobian = cv2.projectPoints(obj_pts, rvec, tvec, cameraMatrix, distCoeffs)

Projected onto the image plane with the parameters above, the image points look like this (the red dot merely marks the lower-left corner for orientation):
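For reference, with zero distortion and an identity pose, cv2.projectPoints reduces to the basic pinhole equations u = fx·X/Z + cx, v = fy·Y/Z + cy. A minimal NumPy-only sketch of that projection (assuming fx = fy = 50 and cx = cy = 0, as in the setup above; `project_pinhole` is a hypothetical helper, not an OpenCV function):

```python
import numpy as np

def project_pinhole(points, fx, fy, cx, cy):
    """Project 3D points with an ideal pinhole camera:
    no rotation, no translation, no distortion."""
    points = np.asarray(points, dtype='float64')
    u = fx * points[:, 0] / points[:, 2] + cx
    v = fy * points[:, 1] / points[:, 2] + cy
    return np.column_stack([u, v])

pts = project_pinhole([[40, 40, 50], [60, 60, 50]], fx=50, fy=50, cx=0, cy=0)
# The first point maps to (50*40/50, 50*40/50) = (40.0, 40.0).
print(pts)
```

Because fx = fy = 50 happens to equal the grid's z = 50 here, the image coordinates coincide numerically with the object x/y coordinates.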

Finally, I tried to recover the original camera calibration:

obj_pts_list = [obj_pts]
img_pts_list = [img_pts]
ret, mtx, dist, rvecs, tvecs = cv2.calibrateCamera(obj_pts_list, img_pts_list, (200, 200), None, None)

However, this last step raises the following error:

OpenCV Error: Bad argument (For non-planar calibration rigs the initial intrinsic matrix must be specified) in cvCalibrateCamera2, file /tmp/opencv20150527-4924-hjrvz/opencv-2.4.11/modules/calib3d/src/calibration.cpp, line 1592

My question is not how to work around this error, but why it is thrown in the first place. When all object points lie in the same plane, why does this setup constitute a non-planar rig? Am I misunderstanding something?

Best answer: OpenCV expects z = 0, because that is the convention for a planar calibration target.

Looking at the source, OpenCV performs this check as follows:

Scalar mean, sdv;
meanStdDev(matM, mean, sdv);
if( fabs(mean[2]) > 1e-5 || fabs(sdv[2]) > 1e-5 )
    CV_Error( CV_StsBadArg,
              "For non-planar calibration rigs the initial intrinsic matrix must be specified" );

In other words, any set of object points whose z coordinates are not all (nearly) zero triggers the error: the check rejects both a nonzero mean and a nonzero spread in z. So the object points must lie at z = 0 unless you use a genuinely non-planar calibration rig, in which case you must supply an initial intrinsic matrix.
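A minimal NumPy sketch of one possible fix (a sketch, not the only way): keep the grid planar at z = 0 and fold the 50 into the translation vector instead. With an identity rotation, the camera-frame coordinates R·p + t are unchanged, so the projected image points are identical, and the z = 0 grid now passes the planarity check quoted above:

```python
import numpy as np

# Original rig: grid at z = 50, camera at the origin.
pts_z50 = np.array([[x, y, 50.0] for y in (40, 50, 60) for x in (40, 50, 60)])

# Equivalent rig: the same grid at z = 0, with the 50 moved into tvec.
pts_z0 = pts_z50.copy()
pts_z0[:, 2] = 0.0
tvec = np.array([0.0, 0.0, 50.0])

# With identity rotation the camera-frame coordinates are identical...
assert np.allclose(pts_z50, pts_z0 + tvec)

# ...and the z = 0 points satisfy OpenCV's planarity test
# (|mean(z)| <= 1e-5 and |std(z)| <= 1e-5):
print(abs(pts_z0[:, 2].mean()) < 1e-5 and abs(pts_z0[:, 2].std()) < 1e-5)
```

Passing `pts_z0` (as float32) together with this `tvec` to cv2.projectPoints should reproduce the same image points, and cv2.calibrateCamera will then accept the object points without an initial intrinsic matrix.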
