Hey HN,

Remus here, a solo dev building side projects after my 9 to 5.

I've always had this idea spinning in my mind: a truly conversational AI that is curious, asks questions, and is compassionate.
Something that understands and resonates with you, offering a safe space to talk about problems that may otherwise be harder to discuss with others.

In crafting Sayfli, my mission was clear: to complement human connection, not substitute for it.

Sayfli isn't here to replace the irreplaceable: the warmth of human conversation and the invaluable exchanges with those we hold dear. Rather, it's here to be a first step on the path towards open, meaningful engagement with ourselves and, ultimately, with each other.

While Sayfli provides emotional support, it is not a replacement for professional therapy and is not designed to be used as one. It is meant to serve as a preliminary step towards self-awareness and can be used alongside professional counseling.
I hope this doesn't come off as a shallow dismissal, because I'm sure you have good intentions and you've obviously put a lot of thought into this.

But on a gut level, it sounds like you're trying to pass off a heaping spoonful of marketing doublespeak here.

In particular, terms like "emotional support", "trusted companion", and "compassionate" are inherently misleading when used to describe these tools, and characterizing them that way is potentially quite harmful to people facing a serious mental health crisis. Worse, you say it "isn't here to replace the irreplaceable", but by using such terminology you implicitly convey the impression that it will do exactly that.

Please drop those and similar terms from your promotional materials. "A preliminary step toward self-awareness" sounds infinitely more plausible.