I'm using the iPad Pro camera with AVFoundation for a marker-based pose estimation pipeline. I've confirmed that the intrinsic matrix (`AVCameraCalibrationData.intrinsicMatrix` or `kCMSampleBufferAttachmentKey_CameraIntrinsicMatrix`) is stable within a single `AVCaptureSession`, but its baseline values change whenever a new session is started, and the difference depends on the device's temperature at that moment.
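For context, this is roughly how I enable and read the per-frame intrinsics (a trimmed sketch of my setup, not complete code; error handling omitted):

```swift
import AVFoundation
import simd

// During setup: request intrinsic-matrix delivery on the video output's connection.
func enableIntrinsicDelivery(on connection: AVCaptureConnection) {
    if connection.isCameraIntrinsicMatrixDeliverySupported {
        connection.isCameraIntrinsicMatrixDeliveryEnabled = true
    }
}

// Per frame: the intrinsics arrive as a CFData attachment containing a matrix_float3x3.
func intrinsicMatrix(of sampleBuffer: CMSampleBuffer) -> matrix_float3x3? {
    guard let data = CMGetAttachment(sampleBuffer,
                                     key: kCMSampleBufferAttachmentKey_CameraIntrinsicMatrix,
                                     attachmentModeOut: nil) as? Data,
          data.count == MemoryLayout<matrix_float3x3>.size else { return nil }
    // loadUnaligned avoids alignment assumptions about Data's backing storage.
    return data.withUnsafeBytes { $0.loadUnaligned(as: matrix_float3x3.self) }
}
```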
What I observe:
- If I start a session right after cooling the device (e.g. refrigerated), fx/fy are noticeably higher (~3132 / 3130).
- If I start a session when the device is already warm, fx/fy are lower (~3105).
- Once the session is running, the intrinsics remain consistent and don't drift over time.
- The issue only appears between sessions, where the starting baseline shifts with the thermal state.
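To quantify the correlation, I've been logging the baseline intrinsics of each session alongside the coarse thermal state that Foundation exposes (as far as I know there is no public API for the actual die temperature, so this is the best available proxy):

```swift
import Foundation

/// Record a session's starting intrinsics next to ProcessInfo's coarse thermal
/// state (nominal / fair / serious / critical). `fx`/`fy` are the values read
/// from the first delivered frame of the new session.
func baselineRecord(fx: Float, fy: Float) -> String {
    let state = ProcessInfo.processInfo.thermalState
    return "session start: fx=\(fx) fy=\(fy) thermalState=\(state.rawValue)"
}
```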
Impact:
- Using per-frame intrinsics works fine within a single session.
- But across sessions, the change in baseline intrinsics introduces inconsistencies (e.g. scale mismatch in reprojection error or marker distances).
Questions:
- Is this expected behavior, i.e. are iPad intrinsics recalculated at session initialization based on the device's thermal/optical state?
- Is there any supported way to "lock" or reuse the same intrinsics across multiple sessions, regardless of temperature?
- If not, what is considered best practice?
  - Recalibrate per session?
  - Always trust the dynamic intrinsics?
  - Or build a correction model that accounts for temperature-dependent shifts?
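For what it's worth, my current workaround is a crude version of the last option: since distance estimates from a pinhole model scale roughly linearly with fx, I rescale each session's measurements by the ratio of a fixed reference fx to that session's baseline fx. Both the reference value and the linearity assumption are mine, not anything documented, so I'd welcome corrections:

```swift
import Foundation

/// First-order cross-session normalization (my assumption, not a documented
/// technique): map lengths estimated under one session's baseline fx into the
/// scale of a fixed reference calibration.
struct IntrinsicNormalizer {
    /// fx from the session I treat as canonical (e.g. my warm-device ~3105).
    let referenceFx: Double

    /// Factor that maps a length estimated under `sessionFx` into the
    /// reference session's scale (distance estimates grow linearly with fx).
    func scaleFactor(sessionFx: Double) -> Double {
        referenceFx / sessionFx
    }

    func normalizedDistance(_ d: Double, sessionFx: Double) -> Double {
        d * scaleFactor(sessionFx: sessionFx)
    }
}
```

This at least makes marker distances comparable across cold-start and warm-start sessions, but it ignores any shift in the principal point, which is why I'd prefer a supported approach.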
Any insights from Apple engineers or developers who've worked with long-duration AVFoundation capture and calibration would be greatly appreciated.