> Enter ChatGPT. Over the past year, I’ve been using it as my therapist and coach

I would think twice before giving my soul to an unregulated tech company on a quest to monetize, and I seriously advise everyone else to do the same.

It is amazing how often people let themselves be screwed over in exchange for convenience or quick results.

More broadly, I feel I cannot use AI unless it runs locally and nothing ever leaves my device.
While I'm happy it works for this person, I wouldn't do it myself for two big reasons:

1. I don't need more technology in my life and more detachment from human connection. If I need a therapist, I'll work with a human.

2. I don't trust companies with such sensitive and personal data.

Re the second point, this comment is spot on: https://news.ycombinator.com/item?id=39999142
I am working on my own AI therapist right now to assist with the real therapy I'm undergoing. It runs locally on my own machine, and with some Tailscale, I also have access to it on my mobile devices.

I would never use an external system for this. Even ignoring the privacy issues, and they are plentiful, it makes far more sense to tailor the AI therapist to your own needs, which most of the time are not generic needs. If you're in a therapy program, you probably know those needs better than most people do, and you can concretely ask the AI to assist you in meeting them over a longer period of time.

I'm happy with my system. It's still a little clunky, but it does the job of assisting my recovery, which is all I want.
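For readers curious what such a setup might look like, here is a minimal sketch, assuming an OpenAI-compatible server (for example llama.cpp's `llama-server`) already running locally. The port, endpoint, and system prompt are illustrative assumptions, not the commenter's actual configuration:

```python
# Minimal local "AI therapist" chat loop against an OpenAI-compatible
# endpoint served on the same machine (e.g. llama.cpp's llama-server).
import requests

ENDPOINT = "http://localhost:8080/v1/chat/completions"  # local server, assumed port
SYSTEM_PROMPT = (
    "You are a supportive assistant for my ongoing therapy program. "
    "Focus on the specific goals my therapist and I have agreed on."  # tailor to your needs
)

history = [{"role": "system", "content": SYSTEM_PROMPT}]

while True:
    user_msg = input("> ")
    if not user_msg:
        break
    history.append({"role": "user", "content": user_msg})
    resp = requests.post(ENDPOINT, json={"messages": history}, timeout=120)
    reply = resp.json()["choices"][0]["message"]["content"]
    history.append({"role": "assistant", "content": reply})
    print(reply)
```

Reaching the same loop from a phone needs no extra code: with Tailscale, the machine's tailnet address simply replaces localhost.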
"There is nothing quite so useless as doing with great efficiency something that should not be done at all." (Peter Drucker)<p>Some experiential inquiry exposing the extent to which vast swathes of "suffering" are purely the result of overidentifying with a meaningless stream of thoughts would go a lot farther than building an AI coach that can only perpetuate engagement with same.
ELIZA returns…

https://en.wikipedia.org/wiki/ELIZA
I tried it, and while I like the idea of having somebody to talk to, every time I asked for advice I got the "it is totally normal to feel…" opener and then a standard response I would get from the first two clicks on Google.

It reminds me a lot of the AI therapist from "The Pod Generation" movie. Frankly, the voice has gotten better at building sentences, but the underlying personalized part is just not there.
Neato. I’m building my own as well, but not as a therapist (I pay humans for that, which is what I recommend: pay for a pro).

My approach is more: how can I make a to-do list app that’s not pushing hustle but also not ignorable?
Looks interesting! We built an extension for something similar.

https://github.com/CominiLearning/recenter

I often find myself mindlessly clicking over to Reddit or some other site while thinking, and before I know it I am being sucked in by something there. Those minutes add up. I wanted something that proactively tells me this is happening and also summarizes how my time was spent. It has been a huge help!
I have been doing this with local LLMs that are "unsafe", using some basic Python. I write a diary entry and use "Tell me some uncomfortable truths about this person." as the prompt, and it roasts me. I then ask, "Tell me why this person is not on the path to becoming a billionaire", and it says I spend way too much time on my personal life and relationships when I should instead be focused on starting businesses in high-growth industries and networking with high-net-worth individuals.

Some of the more standard LLMs try very, very hard not to insult me and tell me to talk to a licensed professional about everything. That's not what I want. I wanted the most impolite LLMs I could get, so I looked up a list of which models were best for topics like erotic roleplay, and those worked far better even for non-sex topics, because they haven't been RLHF'd to be unconditionally nice or excessively cautious.
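A sketch of this "diary roast" workflow, assuming an uncensored .gguf model loaded through llama-cpp-python; the model path, diary file, and exact prompts are illustrative stand-ins, not the commenter's actual script:

```python
# Feed a diary entry to a local, un-RLHF'd model and ask it blunt questions.
from llama_cpp import Llama

llm = Llama(model_path="models/uncensored-model.gguf", n_ctx=4096, verbose=False)

diary_entry = open("diary/2024-05-01.txt").read()  # hypothetical diary file

for question in (
    "Tell me some uncomfortable truths about this person.",
    "Tell me why this person is not on the path to becoming a billionaire.",
):
    result = llm.create_chat_completion(
        messages=[{"role": "user", "content": f"{diary_entry}\n\n{question}"}],
        max_tokens=512,
    )
    print(result["choices"][0]["message"]["content"])
```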
I was looking at pricing for an hour-long chat with such an AI "therapist":

https://docs.vapi.ai/pricing

If you add up the costs of the entire stack (Vapi + Deepgram + GPT-4 + ElevenLabs), it comes to about $18/hr.

That's not exactly cheap.
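As a back-of-the-envelope check, $18/hr works out to $0.30 per minute across the stack. The per-component rates below are made-up placeholders purely to show the arithmetic; only the roughly $18/hr total comes from the comment above:

```python
# Back-of-the-envelope cost of a one-hour voice session.
# The per-minute rates are illustrative placeholders, not quoted prices.
PER_MINUTE_RATES = {
    "vapi_platform": 0.05,   # orchestration (assumed)
    "deepgram_stt": 0.01,    # speech-to-text (assumed)
    "gpt4_tokens": 0.20,     # LLM tokens (assumed)
    "elevenlabs_tts": 0.04,  # text-to-speech (assumed)
}

per_minute = sum(PER_MINUTE_RATES.values())
print(f"${per_minute:.2f}/min -> ${per_minute * 60:.2f}/hr")  # $0.30/min -> $18.00/hr
```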
Are there easy ways to implement something like that (without the phone part) with gpt4all or other open alternatives? It would be very interesting to have history as well, and the assistant should be able to use that history for future answers.
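One possible sketch, assuming the gpt4all Python bindings and a locally downloaded .gguf model; the model name, history file, and replay strategy are illustrative choices, not an established recipe:

```python
# Local assistant with persistent history via the gpt4all bindings.
# Past turns are replayed into the prompt so answers can draw on earlier chats.
import json
from pathlib import Path

from gpt4all import GPT4All

HISTORY_FILE = Path("chat_history.json")  # hypothetical location
history = json.loads(HISTORY_FILE.read_text()) if HISTORY_FILE.exists() else []

model = GPT4All("Meta-Llama-3-8B-Instruct.Q4_0.gguf")  # any local .gguf works

with model.chat_session():
    while True:
        user_msg = input("> ")
        if not user_msg:
            break
        # Prepend a short transcript of earlier sessions so the model can
        # use them when answering.
        context = "\n".join(f"{m['role']}: {m['content']}" for m in history[-20:])
        reply = model.generate(f"{context}\nuser: {user_msg}", max_tokens=400)
        print(reply)
        history += [
            {"role": "user", "content": user_msg},
            {"role": "assistant", "content": reply},
        ]
        HISTORY_FILE.write_text(json.dumps(history, indent=2))
```

Replaying raw transcripts gets unwieldy as history grows; summarizing older sessions into a short note before prepending is a common refinement.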