This is a cool example of the intersection of two fields: computational geometry/GIS, and image filtering algorithms.<p>In this case, information from GIS land (e.g. the orientation of the satellite’s aperture relative to true north on the ground below) was fed into the image filter’s autocalibration, because “natural” grid-like structures that should be preserved (cities, farm plots) tend to be aligned to true north, unlike the camera’s distortion, which is aligned to the aperture. (In the article, this is the part where they draw a diagonal mask stripe across the FFT image.)<p>Anyone working on a cross-subdisciplinary problem like this? Have any interesting “systems of algorithms” to share?
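<p>For concreteness, here is a minimal sketch of that kind of angle-aware notch filter (assuming NumPy; the function name, the aperture-angle parameter, and the stripe/keep widths are hypothetical illustrations, not the article's actual calibration):
<pre><code>
# Hypothetical sketch: suppress a periodic sensor artifact that shows up as a
# diagonal stripe in the 2D FFT, using the known aperture angle so that
# north-aligned image content (streets, field boundaries) is left alone.
import numpy as np

def notch_diagonal(image, aperture_angle_deg, stripe_half_width=3, keep_radius=20):
    """Zero out a stripe through the FFT origin at the given angle."""
    f = np.fft.fftshift(np.fft.fft2(image))
    h, w = image.shape
    cy, cx = h // 2, w // 2
    y, x = np.ogrid[:h, :w]
    dy, dx = y - cy, x - cx

    # Signed distance from each frequency bin to the line through the origin
    # oriented at aperture_angle_deg (measured from the +x frequency axis).
    theta = np.deg2rad(aperture_angle_deg)
    dist_to_line = np.abs(-np.sin(theta) * dx + np.cos(theta) * dy)

    # Mask the stripe, but keep a disc around DC so overall brightness and
    # low-frequency structure survive.
    mask = (dist_to_line <= stripe_half_width) & (dx**2 + dy**2 > keep_radius**2)
    f[mask] = 0
    return np.real(np.fft.ifft2(np.fft.ifftshift(f)))
</code></pre>
The point of the keep-disc and the angle is the same trade-off described above: kill the aperture-aligned periodic energy while letting north-aligned grid content and the low frequencies pass through mostly untouched.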
Working on FFT-transformed images is quite the dark magic if you don't understand what you're looking at; this is especially interesting when you look at what e.g. H.264 does to images in the frequency domain: <a href="https://sidbala.com/h-264-is-magic/" rel="nofollow">https://sidbala.com/h-264-is-magic/</a> (look at the 2% mask and the resulting image - it provokes a 'this shouldn't be possible' reaction).
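<p>To get a feel for why a 2% mask can still look plausible, here is a rough sketch (assuming NumPy and a grayscale float image; the linked article masks a region of the spectrum, whereas this keeps the largest-magnitude ~2% of coefficients, but the spirit is the same):
<pre><code>
import numpy as np

def keep_top_fraction(image, fraction=0.02):
    # Keep only the largest-magnitude FFT coefficients and invert.
    # Natural images concentrate most of their energy in a few coefficients,
    # so the reconstruction is still surprisingly recognizable.
    f = np.fft.fft2(image)
    threshold = np.quantile(np.abs(f), 1.0 - fraction)
    f_masked = np.where(np.abs(f) >= threshold, f, 0)
    return np.real(np.fft.ifft2(f_masked))
</code></pre>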
I remember an image processing application back in the late 90s that would do an FFT/inverse FFT round trip; I used it a couple of times to “paint out” certain regular artifacts. Same basic idea as this, but not nearly as successful. I can’t remember the app name though.