
Show HN: ChatCBT – AI-powered cognitive behavioral therapist for Obsidian

69 points by marjipan200 over 1 year ago
ChatCBT is an AI-powered cognitive behavioral therapist for your local Obsidian notes.

You have the choice to use OpenAI, or a 100% local model with Ollama for total data privacy.

When you're done with your conversation, ChatCBT can automatically summarize the chat into a table listing your negative beliefs, emotions, categories of negative thinking, and reframed thoughts. This way you can start to recognize patterns in your thinking and begin to rewire your reactions to disturbing circumstances.

Conversations are stored in markdown files on your local machine, ensuring privacy and portability while giving you the freedom to organize your sessions as you please. You could easily share these files with a therapist.

I built this for myself when I noticed that the chat help I was getting from my therapist between therapy sessions was essentially coaching that didn't require much context beyond the immediate situation and emotions. That felt like a particularly good use case for LLMs.

ChatCBT has been pretty effective for helping me talk myself through spiraling episodes of negative thinking. I've been able to get back on my horse faster, and it's convenient that it's available 24/7 and roughly 5000x cheaper than a therapy session (or free if using Ollama). That's why I'd like to share it - I'm curious whether it helps anyone else.

It's under review to become an Obsidian community plugin, but in the meantime it's available now via git clone (see the readme). Happy for feedback.

6 comments

grammarsaint over 1 year ago
I think LLMs have a lot of therapeutic potential, but I am worried about how easily they can cause harm in this area. CBT is the gold-standard treatment for many problems, but applying it correctly requires some nuance and an accurate diagnosis of the problem.

Consider a teenager who is fixated on the idea that they're a "bad person" because they're having homosexual thoughts. You might think they need positive affirmations or exercises to challenge their fears with evidence. But these symptoms are sometimes a manifestation of OCD. If so, using CBT to "argue away" the fear could end up reinforcing the OCD cycle and causing more and more self-doubt. For this person, the better treatment would be a different CBT tool: exposure. It may seem odd, but they would be better served with exercises such as repeatedly thinking sexual thoughts and then telling themselves, "I might be a horrible person because of this." (The purpose is to desensitize them to the idea - eventually, they will just get bored.) Needless to say, this type of treatment needs to be implemented with care.

I think it's beyond the capabilities of LLMs to reliably distinguish problems like this. So then, the systems have to be designed so that their output is at least harmless for all people, and that sounds really hard.
Comment #38505540 not loaded
ShamelessC over 1 year ago
It seems irresponsible to advertise this to anyone who doesn't have an understanding of deep learning methods. A big part of therapy is simply getting patients out of the house and interacting with another human being for a while. LLMs could speak identically and still have failed at providing that. Hence, it's not good to suggest it as a cheaper/free option, in my opinion.

Having said that, people tend to turn tools like this into a subscription service, while this one does appear to be entirely free apart from the OpenAI costs. Still, I think if you're not careful, someone might get hurt or receive worse care than they otherwise would have because of the appeal of lower prices.
Comment #38503626 not loaded

Comment #38505082 not loaded

Comment #38505008 not loaded
JCharante over 1 year ago
How do you prevent OpenAI's model from telling you to talk to a licensed therapist any time you say anything slightly negative?
bionsystem over 1 year ago
I can't make it work through the plugin; I get an "Invalid initialization vector" error, which seems like a Node issue. The Ollama server is up and running, and I can curl it and prompt it through the CLI with no problem.
Comment #38506499 not loaded
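For anyone debugging the same setup, a quick sanity check of the Ollama side can look like the sketch below (it assumes Ollama's default port 11434 and an already-pulled model named `llama2` - both are assumptions, not details from the thread; the "Invalid initialization vector" error itself sounds like it comes from the plugin's Node side rather than Ollama):

```shell
# List the models the local Ollama server knows about (default port 11434)
curl http://localhost:11434/api/tags

# Send a one-shot, non-streaming prompt to a pulled model
curl http://localhost:11434/api/generate \
  -d '{"model": "llama2", "prompt": "Say hello in one word.", "stream": false}'
```

If both calls succeed, the Ollama server is healthy and the problem is on the plugin side.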
yz_coding over 1 year ago
This is really cool, well done on shipping, great naming too :)
staflow over 1 year ago
Another day another great AI tool!