I implemented webmentions for my static site back around the end of 2019. My solution for receiving them was simply to log POSTs and review them with my eyes. Then, if someone has actually sent a webmention (as opposed to pingback spam), I just use curl to respond. There's no need to have anything automated at all.<p>ref: "A static, manual system for <i>receiving</i> webmentions (and pingback) with nginx" <a href="http://superkuh.com/blog/2020-01-10-1.html" rel="nofollow">http://superkuh.com/blog/2020-01-10-1.html</a>
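For anyone wondering what the manual review amounts to: per the Webmention spec, the sender POSTs form-encoded source and target parameters, and the receiver is supposed to check that the source page really links to the target. Here is a minimal Python sketch of that check, assuming you already have the two URLs from your logs (the URLs and script name are placeholders, not superkuh's actual setup):
<pre><code>import sys
import requests

def verify_webmention(source, target):
    """Fetch the source page and confirm it actually links to the target.
    This is the spam check the comment above does 'with my eyes'."""
    resp = requests.get(source, timeout=10)
    resp.raise_for_status()
    # Naive containment check; a stricter version would parse the hrefs.
    return target in resp.text

if __name__ == "__main__":
    # e.g. python verify.py https://their.example/post https://mysite.example/my-post
    source, target = sys.argv[1], sys.argv[2]
    print("looks legitimate" if verify_webmention(source, target) else "probably spam")
</code></pre>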
Why do you need an always-on service for this? Just send the webmentions whenever you compile your static website - that's the only time content actually changes.<p>On Pelican, you can do it using this plugin <a href="https://chezsoi.org/lucas/blog/pelican-pingback-and-webmentions.html" rel="nofollow">https://chezsoi.org/lucas/blog/pelican-pingback-and-webmenti...</a>
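For reference, sending at build time boils down to this: for each outgoing link in a newly compiled post, discover that page's webmention endpoint and POST your post's URL as source and the link as target. A rough Python sketch of that step, assuming endpoint discovery via the HTTP Link header only (the spec also allows rel="webmention" link/a elements in the HTML, omitted here):
<pre><code>from urllib.parse import urljoin

import requests

def discover_endpoint(target):
    # requests parses the Link header into resp.links, keyed by rel.
    resp = requests.get(target, timeout=10)
    link = resp.links.get("webmention")
    return urljoin(resp.url, link["url"]) if link else None

def send_webmention(source, target):
    endpoint = discover_endpoint(target)
    if endpoint is None:
        return False
    # The protocol itself is just a form-encoded POST with source and target.
    r = requests.post(endpoint, data={"source": source, "target": target}, timeout=10)
    return r.ok

# Called once per outgoing link at compile time, e.g.:
# send_webmention("https://mysite.example/new-post/", "https://example.com/post-i-linked/")
</code></pre>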
I also use Remy Sharp’s code, but as a library. I have Netlify call this Glitch app [1] I wrote on each successful deploy. It compares the new sitemap.txt with the one it has cached and scans any new URLs that match a pattern for mentions to dispatch.<p>1: <a href="https://glitch.com/edit/#!/lean-send-webmentions" rel="nofollow">https://glitch.com/edit/#!/lean-send-webmentions</a>
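The sitemap-diffing part is simple enough to sketch: keep the previous deploy's sitemap.txt, take the set difference on each deploy, and only scan the URLs that are new and look like posts. A rough Python equivalent of just that comparison (the sitemap URL, cache file, and pattern are made up here; the actual Glitch app uses Remy Sharp's JavaScript code):
<pre><code>import re
from pathlib import Path

import requests

SITEMAP_URL = "https://example.com/sitemap.txt"   # hypothetical site
CACHE_FILE = Path("sitemap-cache.txt")
POST_PATTERN = re.compile(r"/blog/")               # only scan URLs that look like posts

def new_post_urls():
    """Compare the freshly deployed sitemap against the cached copy
    and return only the URLs that appeared since the last deploy."""
    current = set(requests.get(SITEMAP_URL, timeout=10).text.split())
    previous = set(CACHE_FILE.read_text().split()) if CACHE_FILE.exists() else set()
    CACHE_FILE.write_text("\n".join(sorted(current)))
    return [url for url in current - previous if POST_PATTERN.search(url)]

# Each new URL would then be fetched and its outgoing links dispatched as
# webmentions, e.g. with the send_webmention() sketch above.
</code></pre>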
I like the idea, but I wonder if I should bother: in other words, are people actually using this? Is this the new pingback? That seemed like a good idea too, but it never got off the ground.
It still amazes me that people have managed to create a viable system of decentralized comments across independent, statically-generated websites. Perhaps I should heed the (humorous) warning by Ben Werdmuller here, though:<p><a href="https://aaronparecki.com/2013/05/21/4/xkcd#mentions" rel="nofollow">https://aaronparecki.com/2013/05/21/4/xkcd#mentions</a>