ChatCBT is an AI-powered cognitive behavioral therapist for your local Obsidian notes.<p>You can use OpenAI, or a 100% local model with Ollama for total data privacy.<p>When you're done with your conversation, ChatCBT can automatically summarize the chat into a table listing your negative beliefs, emotions, categories of negative thinking, and reframed thoughts. This way you can start to recognize patterns in your thinking and begin to rewire your reactions to disturbing circumstances.<p>Conversations are stored in markdown files on your local machine, ensuring privacy and portability while giving you the freedom to organize your sessions as you please. You could easily share these files with a therapist.<p>I built this for myself when I noticed that the chat help I was getting from my therapist between sessions was essentially coaching that didn't require much context beyond the immediate situation and emotions. This felt like a particularly good use case for LLMs.<p>ChatCBT has been pretty effective for talking myself through spiraling episodes of negative thinking. I've been able to get back on my horse faster, and it's convenient that it's available 24/7 and 5000x cheaper than a therapy session (or free if using Ollama). That's why I'd like to share it - curious if it helps anyone else.<p>It's under review to become an Obsidian community plugin, but in the meantime it's available now via git clone (see readme). Happy for feedback.
I think LLMs have a lot of therapeutic potential, but I am worried about how easily they can cause harm in this area. CBT is the gold standard treatment for many problems, but applying it correctly requires some nuance and an accurate diagnosis of the problem.<p>Consider a teenager who is fixated on the idea that they're a "bad person" because they're having homosexual thoughts. You might think they need positive affirmations or exercises to challenge their fears with evidence. But these symptoms are sometimes a manifestation of OCD. If so, using CBT to "argue away" the fear could end up reinforcing the OCD cycle and causing more and more self-doubt. For this person, the better treatment would be a different CBT tool: exposure. It may seem odd, but they would be better served with exercises such as repeatedly thinking sexual thoughts and then telling themselves, "I might be a horrible person because of this." (The purpose is to desensitize them to the idea - eventually, they will just get bored.) Needless to say, this type of treatment needs to be implemented with care.<p>I think it's beyond the capabilities of LLMs to reliably distinguish problems like this. So then, I think the systems have to be designed so that their output is at least harmless for all people, and that sounds really hard.
Seems irresponsible to advertise this to anyone who doesn't have an understanding of deep learning methods. A big part of therapy is simply getting patients out of the house and interacting with another human being for a time. LLMs could speak identically to a therapist and still have failed at providing that. Hence, not good to suggest it as a cheaper/free option, in my opinion.<p>Having said that, people tend to try to make these a subscription service, while this indeed appears to be entirely free, ignoring OpenAI costs. Still, I think if you're not careful, someone might get hurt or might get worse care than they otherwise would have because of the appeal of lower prices.
I can't make it work through the plugin: I get an "Invalid initialization vector" error, which seems like a Node issue. The Ollama server is up and running, and I can curl it and prompt it through the CLI, no problem.