It's not often that someone asks you to re-invent the wheel. But this was one of those times, and it was mostly necessary.
The task involved projecting already-digitized city block outlines onto camera images.
A typical approach would be something along the lines of drawing the shapes in a 3D space (DirectX, OpenGL etc.) with the image shown in the background. But since we wanted to accurately simulate a "real" existing camera, it seemed easier to me to build the whole construct from scratch.
In order to do that we had to know both the interior and the exterior orientation of the camera.
- The interior orientation consists of the position of the principal point, the focal length and the radial distortion. Some manufacturers provide these values (at least for photogrammetric cameras), but most do not. There are ways to calibrate a camera yourself, but that's a different topic entirely.
- The exterior orientation is the position (X, Y, Z) and the rotations (ω, φ, κ) of the camera at the time of the shot.
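To make the two orientations concrete, here is a minimal sketch of how they come together in a collinearity-style projection. The rotation order (R = Rx·Ry·Rz), the sign conventions, and all parameter names are assumptions for illustration; the actual convention depends on the camera and software used, and radial distortion is omitted.

```python
import numpy as np

def rotation_matrix(omega, phi, kappa):
    """Build a rotation matrix from the exterior-orientation angles (radians).
    Assumes R = Rx(omega) @ Ry(phi) @ Rz(kappa); conventions vary in practice."""
    co, so = np.cos(omega), np.sin(omega)
    cp, sp = np.cos(phi), np.sin(phi)
    ck, sk = np.cos(kappa), np.sin(kappa)
    Rx = np.array([[1, 0, 0], [0, co, -so], [0, so, co]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[ck, -sk, 0], [sk, ck, 0], [0, 0, 1]])
    return Rx @ Ry @ Rz

def project(point, camera_pos, R, f, principal_point):
    """Project a world point to image coordinates.
    f: focal length, principal_point: (x0, y0) from the interior orientation.
    Radial distortion is left out of this sketch."""
    # Transform the world point into the camera frame
    dx, dy, dz = R.T @ (np.asarray(point, float) - np.asarray(camera_pos, float))
    # Collinearity-style perspective division (sign convention assumed)
    x = principal_point[0] - f * dx / dz
    y = principal_point[1] - f * dy / dz
    return x, y
```

With zero rotations and the camera at the origin, a point at (1, 2, 10) with f = 1 and the principal point at (0, 0) lands at (-0.1, -0.2) under these assumed signs.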
So what we need to do is go from city block coordinates to image coordinates.
The main idea is described in the following steps: