Show HN: I used AI to write my wife a Valentine's day love letter (in French)

5 points | by timohear | over 5 years ago

1 comment

timohear | over 5 years ago
She said it was amazingly beautiful and would have been really moved by it had it not been written by an AI. I counter-argued that human curation of AI-generated text was a perfectly valid expression of love. I lost :-|.

The model generates results that are formatted like love letters, with reasonably structured phrases. Verbatim reproduction of training content seems very low. But the sentences are often nonsensical, and there's not much context preservation apart from some repetition.

However, it does generate some quite interesting original phrases:

"Mon âme a moins de conversations que le désir" ("My soul has fewer conversations than desire")
"C'est une joie bien trop violente pour moi" ("It is a joy far too violent for me")
"vous [êtes] comme l'eau d'un pur amour" ("you [are] like the water of a pure love")

These phrases can be cherry-picked based on the feeling you're trying to convey and used to compose unique letters influenced by the best French writers in history.

Technically: an OpenAI GPT-2 model (117/124M parameters) trained on around 1 GB of French classical literature was fine-tuned on 300 KB of French love poetry and love songs.

We used GPT-2-Simple (https://github.com/minimaxir/gpt-2-simple) and GPT-2-Cloud-Run (https://github.com/minimaxir/gpt-2-cloud-run) by Max Woolf.

The base French model (GPT2-French) was trained by William Jacques: https://github.com/aquadzn/gpt2-french. We used his "romans" model, which is trained on text from https://www.bibebook.com/download, which looks like a compilation of Project Gutenberg French novels. His model was trained over 5,000 steps.

Fine-tuning was done using Max Woolf's Colab notebook (https://colab.research.google.com/drive/1VLG8e7YSEwypxU-noRNhsv5dW4NfTGce) and lasted only 600 steps (less than 30 minutes). Fewer steps and the content wasn't great; more and it started to reproduce training content verbatim.
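For readers who want to try something similar, here is a minimal sketch of a gpt-2-simple fine-tune-and-sample loop. The dataset filename, run name, and sampling parameters are illustrative assumptions, not values from the post, and this sketch starts from the vanilla 124M weights, whereas the actual run continued from the GPT2-French "romans" checkpoint:

```python
# Sketch of a gpt-2-simple workflow (https://github.com/minimaxir/gpt-2-simple).
# Assumption: the fine-tuning corpus lives in "french_love_poetry.txt"
# (hypothetical filename standing in for the ~300 KB poetry/lyrics corpus).
import gpt_2_simple as gpt2

gpt2.download_gpt2(model_name="124M")  # fetch the base GPT-2 weights

sess = gpt2.start_tf_sess()
gpt2.finetune(
    sess,
    dataset="french_love_poetry.txt",
    model_name="124M",
    steps=600,        # the post reports ~600 steps as the sweet spot:
                      # fewer gave weak output, more copied training text verbatim
    run_name="love_letters",
    print_every=50,
    sample_every=200,
)

# Generate candidate phrases to cherry-pick from; prefix, temperature, and
# sample count are illustrative choices.
gpt2.generate(
    sess,
    run_name="love_letters",
    length=200,
    temperature=0.8,
    prefix="Mon amour,",
    nsamples=5,
)
```

To start from an existing checkpoint such as the "romans" model instead, you would place it under the checkpoint directory for your run name and fine-tune with restore_from="latest".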