Hey, HN! I am colorblind and I can't reliably identify when a banana is ripe enough to eat, so I made a machine learning model that does it for me. The model (92% val accuracy, ~12MB) runs on your device (tf.js) and you can upload directly from the camera on mobile.<p>Feedback is always appreciated!<p>Data, code and trained model are available here:
<a href="https://github.com/giovannipcarvalho/banana-ripeness-classification" rel="nofollow">https://github.com/giovannipcarvalho/banana-ripeness-classif...</a>
I tried to guess an album cover's decade:
<a href="https://uncover-a-covers-decade.onrender.com/" rel="nofollow">https://uncover-a-covers-decade.onrender.com/</a><p>You are doing the fast.ai course too, right? [edit: looking at your code, no, you did not :)]<p>I chose my challenge to maximize my learning: something hard enough that it might fail. It's only right 60-70% of the time.<p>What I learned:<p>- Binning into decades might not be ideal. A cover from '79 and one from '80 are more similar than covers from '71 and '79. A better approach might have been smaller bins (3 or more per decade) and then rounding the result to the nearest decade.<p>- Be careful how you gather your training data. I pulled from Google image search with queries like "album OR LP cover 20s/30s/40s..."; about 20% of the results were "best of the XXs" compilations whose covers were designed in a much later decade. I should have searched for "1920 OR 1921 OR 1922..." instead.
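The finer-bins idea could be sketched roughly like this (a hypothetical mapping; the start year, bin width, and midpoint-rounding scheme are my assumptions, not anything from the actual model):

```python
def predicted_decade(bin_index, start_year=1920, bins_per_decade=3):
    """Map a fine-grained year bin back to a decade label.

    Hypothetical scheme: bins are 10/bins_per_decade years wide;
    take the bin's midpoint year and round to the nearest decade.
    """
    width = 10 / bins_per_decade
    midpoint = start_year + bin_index * width + width / 2
    return int(round(midpoint / 10) * 10)

# A bin covering roughly 1977-1980 lands in the 1980s, even though
# a coarse decade classifier would have labeled part of it "70s":
print(predicted_decade(2, start_year=1970))  # 1980
```

This keeps the output a decade (so labels stay easy to collect) while letting the model's mistakes be "one bin off" rather than "one decade off".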
Nice, pretty cool project. It correctly identified some bananas I had just bought as a little green.<p>Any chance of making an avocado identifier? Those are pretty tough to guess whether they're ripe.