> <i>Right – the original image, Left – the image with...</i><p>A few issues on this site:<p>1) "Right" and "Left" are mapped the wrong way round (i.e. "Left" is the original image in all cases).<p>2) On a 1280px-wide screen, the image pairs are stacked vertically instead of shown side by side.
I actually made a command line program in Python that lets you apply CUBE LUTs to any image of your choice:<p><a href="https://github.com/yoonsikp/pycubelut" rel="nofollow">https://github.com/yoonsikp/pycubelut</a><p>I’m trying to add a GPU acceleration feature using wgpu-py, but wgpu-py was unfortunately still too buggy when I last tried it in January.
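For anyone curious what applying a .cube LUT involves under the hood, here's a rough sketch with numpy — not pycubelut's actual implementation, just a minimal nearest-neighbour version (real tools interpolate trilinearly between the eight surrounding lattice nodes):

```python
import numpy as np

def parse_cube(text):
    """Parse a minimal .cube 3D LUT. Returns (size, table) with table
    shaped (size, size, size, 3), indexed [blue][green][red], because
    the red coordinate varies fastest in the file format."""
    size, rows = None, []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith(("#", "TITLE", "DOMAIN")):
            continue
        if line.startswith("LUT_3D_SIZE"):
            size = int(line.split()[1])
            continue
        rows.append([float(v) for v in line.split()])
    return size, np.array(rows).reshape(size, size, size, 3)

def apply_lut(image, size, table):
    """Nearest-neighbour lookup; image is float RGB in [0, 1]."""
    idx = np.clip(np.rint(image * (size - 1)).astype(int), 0, size - 1)
    return table[idx[..., 2], idx[..., 1], idx[..., 0]]
```

With an identity LUT this returns each pixel snapped to its nearest lattice node, which is exactly why production code interpolates instead.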
The article title is a bit misleading; I was expecting an actual simulation of film stock processing and rendering in Python. This is more about using 3D LUTs, which doesn't have much to do with film simulation itself.
Completely OT: in case anyone is wondering, that's not what garbage trucks look like in Rome; that thing is someone's Ape [0] filled with garbage.<p>[0] <a href="https://en.wikipedia.org/wiki/Piaggio_Ape" rel="nofollow">https://en.wikipedia.org/wiki/Piaggio_Ape</a><p>(ape = bee, vespa = wasp: one is for work, the other for leisure, but same company)
The thing about LUTs is that they're only mathematically valid if you know the <i>type</i> of data the "input" is--and I'm not just talking about whether it's an 8-bit or a 12-bit image.<p>The LUT needs to match the color space and EOTF (an extended idea of "gamma") of the input--which is why LUTs are only used in very controlled scenarios (e.g. in videography the input color settings are fully specified, for example Sony's S-Log, so the LUT is a reproducible, mathematically sound operation).<p>"RAW" photos from cameras are in what we call a linear color space, where the RGB values correspond linearly to the amount of light received by each photosite. If you try to use a LUT designed for RAW on an sRGB JPEG, you're going to have problems, at least without converting the color space first.<p>It's why I kind of gave up on trying to use LUTs in photo editing; it's just so unreliable.
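To make the EOTF point concrete, here's a sketch of the sRGB transfer functions (the piecewise curves from IEC 61966-2-1). If a LUT is designed for linear data, you'd undo the sRGB encoding first and re-encode afterwards:

```python
import numpy as np

def srgb_to_linear(v):
    """Inverse sRGB transfer function: encoded [0, 1] -> linear light."""
    v = np.asarray(v, dtype=float)
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

def linear_to_srgb(v):
    """Forward sRGB transfer function: linear light -> encoded [0, 1]."""
    v = np.asarray(v, dtype=float)
    return np.where(v <= 0.0031308, v * 12.92, 1.055 * v ** (1 / 2.4) - 0.055)
```

Note how far the two domains diverge: encoded 0.5 is only about 0.21 in linear light, which is why feeding sRGB pixels to a linear-domain LUT goes so wrong.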
The title reminds me of a past submission which is about the use of python in the film industry: <a href="https://news.ycombinator.com/item?id=24826873" rel="nofollow">https://news.ycombinator.com/item?id=24826873</a>
LUTs are quite fun to play with. If you look for videos on "3D LUT Creator" you will find some cool things done using LUTs.<p>If you are looking for a great and free tool to create LUTs, have a look at <a href="https://grossgrade.com/en/" rel="nofollow">https://grossgrade.com/en/</a>
It is not easy to find, IMO; I only knew it existed because I had used it before, and it took me ages to find it again...<p>Also, while I had no luck with 3D LUT Creator (trial) under Wine, Grossgrade works fine :-)
Some feedback, the image with the caption:
Left – the original image, right – the image after applying the 12-bit identity CLUT<p>looks the most convincingly film-like. The last sample (Fuji Velvia 50) absolutely does not look like film at all (let alone Velvia 50); the main culprit is the shadows underneath the truck. I understand you're just applying RawTherapee's LUT there, but maybe you need to tweak the intensity down or play with the brightness.
Skips the interesting question of how the LUT tables are made, but it's still a nice introduction to the topic.<p>I guess to make a film simulation, you could photograph a bunch of color calibration targets (e.g. IT8) in different lighting conditions with both the film and the digital sensor, and then try to match them <i>somehow</i>. That is assuming the film is still available.
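As a toy illustration of the "match them somehow" step (made-up numbers, not real measurements): if you assume the sensor-to-film mapping is roughly linear, a least-squares 3x3 matrix fitted to paired patch readings is the simplest starting point. Real film response is nonlinear, so in practice you'd fit per-channel curves or a full 3D LUT instead:

```python
import numpy as np

# Hypothetical paired readings: 24 target patches measured with the
# digital sensor and with scanned film (here generated synthetically).
rng = np.random.default_rng(0)
sensor = rng.random((24, 3))
true_M = np.array([[0.90, 0.10, 0.00],
                   [0.05, 0.85, 0.10],
                   [0.00, 0.20, 0.80]])
film = sensor @ true_M.T  # pretend film response is linear

# Least-squares fit of a 3x3 matrix M with sensor @ M ~= film
M, *_ = np.linalg.lstsq(sensor, film, rcond=None)
```

Once fitted, `M` can be baked into a LUT by evaluating it (plus whatever nonlinearity you add) at every lattice node.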
The Velvia simulation at the end is very good, nice work!<p>As someone who still regularly shoots film and also owns a Fuji X Series camera, I don't find the film simulations that Fujifilm puts in the X models to be any good, so I feel like there is still a lot of worthwhile work to be done here.
Back when Instamatic got popular, I was using an N900 and of course didn't have access to the app. So I made my own, by piping images over SSH to a server running a couple of Imagemagick scripts that applied one of a few LUTs I cooked up, and optionally some vignetting maybe.<p>Worked OK, was completely pointless of course.<p>Edit: I meant Hipstamatic. It's been a while.
I thought this was about thin film simulations based on the title :)<p>There is a wonderful transfer matrix method library in Python for reflectometry simulation too, if that's what you were hoping to find.<p><a href="https://github.com/kitchenknif/PyTMM" rel="nofollow">https://github.com/kitchenknif/PyTMM</a>
It's hard for me to see this as doing a good job of simulating film when it doesn't mention grain. Film has a granular structure where each "pixel" is a grain (crystal). Film is essentially already digital, but with a higher count of less regular pixels.
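A crude way to fake that granular structure — a sketch only, not a physically based model of crystal statistics — is to overlay coarse monochrome noise whose blobs span several pixels, so the "grains" are bigger than one pixel and shared across the RGB channels:

```python
import numpy as np

def add_grain(image, grain_size=2, strength=0.08, seed=0):
    """Overlay coarse monochrome noise on a float RGB image in [0, 1].
    Noise is sampled on a coarse grid, then upsampled so each 'grain'
    covers a grain_size x grain_size block of pixels."""
    rng = np.random.default_rng(seed)
    h, w = image.shape[:2]
    gh, gw = -(-h // grain_size), -(-w // grain_size)  # ceil division
    noise = rng.normal(0.0, strength, (gh, gw))
    # Upsample by block replication, then crop to the image size
    noise = np.kron(noise, np.ones((grain_size, grain_size)))[:h, :w]
    return np.clip(image + noise[..., None], 0.0, 1.0)
```

A more faithful model would make grain amplitude depend on local density and vary grain size per stock, but even this crude overlay reads as more "film-like" than clean digital output.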