This is a cool example of the intersection of two fields: computational geometry/GIS and image filtering algorithms.

In this case, information from GIS land (e.g. the orientation of the satellite’s aperture relative to true north on the ground below) was fed into the image filter’s autocalibration, because “natural” grid-like structures that should be preserved (cities, farm plots) tend to be aligned to true north, unlike the camera’s distortion, which is aligned to the aperture. (In the article, this is the part where they draw a diagonal mask stripe across the FFT image.)

Anyone working on a cross-subdisciplinary problem like this? Have any interesting “systems of algorithms” to share?
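
For anyone who wants to play with the masking idea, here’s a minimal numpy sketch (not the article’s actual pipeline; the function name, angle convention, and stripe width are placeholders of my own): zero out a narrow stripe of frequency bins along a line through DC at the aperture-derived angle, while leaving the rest of the spectrum, including the north-aligned content, untouched.

    # Rough sketch of the FFT-mask step: suppress a narrow stripe of
    # frequencies aligned with the aperture direction while leaving
    # everything else (including north-aligned grid content) alone.
    # Angle convention and stripe width here are illustrative guesses.
    import numpy as np

    def suppress_aperture_stripe(img, aperture_angle_deg, half_width=2, keep_dc=8):
        """img: 2-D float array; aperture_angle_deg: orientation of the
        stripe in the frequency plane, e.g. derived from the aperture
        heading relative to true north."""
        F = np.fft.fftshift(np.fft.fft2(img))
        h, w = img.shape
        yy, xx = np.mgrid[-h // 2:h - h // 2, -w // 2:w - w // 2]

        # Distance of each frequency bin from a line through DC at the
        # given angle; bins within half_width of that line form the stripe.
        theta = np.deg2rad(aperture_angle_deg)
        dist = np.abs(-np.sin(theta) * xx + np.cos(theta) * yy)

        mask = np.ones_like(F, dtype=float)
        mask[dist <= half_width] = 0.0
        # Leave the lowest frequencies alone, or overall brightness and
        # large-scale structure get wiped out along with the distortion.
        mask[np.hypot(xx, yy) <= keep_dc] = 1.0

        return np.real(np.fft.ifft2(np.fft.ifftshift(F * mask)))

Calling that with the angle computed from the scene’s aperture metadata would (in principle) knock out the aperture-aligned striping without touching north-aligned field boundaries; the real system presumably picks the stripe angle and width from its autocalibration rather than hard-coding them.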