I'm the author of the osxphotos[0] tool mentioned in the article. For photos in an Apple Photos library, osxphotos gives you access to a rich set of metadata beyond what's in the actual EXIF/IPTC/XMP of the image. Apple applies object classification and other AI techniques to your images but generally doesn't expose the results to the user. For example, each photo is classified by the objects it contains (dog, cat, breed of dog, etc.), annotated with rich reverse-geolocation info (neighborhood, landmarks, etc.), and assigned an interesting set of scores such as "overall aesthetic", "pleasant camera tilt", and "harmonious colors". These can all be queried with osxphotos, either from the command line or from your own Python code (see the API docs[1]).<p>For example, to find your "best" photos by overall aesthetic score and add them to the album "Best Photos", you could run:<p>osxphotos query --query-eval "photo.score.overall > 0.8" --add-to-album "Best Photos"<p>To find good photos with trees in them, you could try something like:<p>osxphotos query --query-eval "photo.score.overall > 0.5" --label Tree --add-to-album "Good Tree Photos"<p>There's quite a bit of other interesting data in Photos that you can explore with osxphotos. Run `osxphotos inspect` and it will show you all the metadata for whichever photo is currently selected in the Photos app.<p>[0] <a href="https://github.com/RhetTbull/osxphotos">https://github.com/RhetTbull/osxphotos</a>
[1] <a href="https://rhettbull.github.io/osxphotos/" rel="nofollow noreferrer">https://rhettbull.github.io/osxphotos/</a>
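If you'd rather use the Python API than the CLI, the first query above looks roughly like the following sketch. It assumes a Mac with a Photos library; `PhotosDB`, `PhotoInfo.score.overall`, and `PhotoInfo.labels` are the osxphotos names for the database, the aesthetic score, and the classification labels, and the `best_photos` helper is just something I made up for illustration:

```python
try:
    # osxphotos only works on macOS with a Photos library available
    import osxphotos
except ImportError:
    osxphotos = None

def best_photos(photos, threshold=0.8):
    """Return photos whose overall aesthetic score exceeds threshold.

    Works on any objects exposing a .score.overall attribute; photos
    without a score (score is None) are skipped.
    """
    return [p for p in photos if p.score is not None and p.score.overall > threshold]

if __name__ == "__main__" and osxphotos is not None:
    photosdb = osxphotos.PhotosDB()  # opens the default Photos library
    for photo in best_photos(photosdb.photos()):
        # labels holds the object-classification terms (e.g. "Tree", "Dog")
        print(photo.original_filename, photo.score.overall, photo.labels)
```

The same pattern extends to the label query: filter on `"Tree" in photo.labels` as well as the score.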