
Eating Disorder Helpline Chatbot Gives 'Harmful' Responses After Firing Staff

14 points by hwayne almost 2 years ago

1 comment

hwayne almost 2 years ago
According to a separate article (https://www.vice.com/en/article/n7ezkm/eating-disorder-helpline-fires-staff-transitions-to-chatbot-after-unionization) the chatbot wasn't supposed to be ChatGPT-based and instead is a "rule-based, guided conversation". Whether that means it's a standard chatbot or an LLM is unclear.

- If the former, the problems speak to a systemic issue, where management had a fundamental misunderstanding of what the helpline was actually doing. Instead they implemented a chatbot based on what they *thought* it did, which was wholly unhelpful.

- If the latter, then the managers are idiots.
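For readers unfamiliar with the distinction the commenter is drawing: a rule-based, guided-conversation bot is essentially a hand-authored state machine, so it can only ever say what its designers wrote. A minimal sketch in Python (the states, prompts, and wording here are hypothetical illustrations, not taken from the actual helpline bot):

```python
# Hypothetical sketch of a "rule-based, guided conversation" chatbot.
# Every state, prompt, and transition is authored by hand; the bot
# cannot generate novel text the way an LLM can.

STATES = {
    "start": {
        "prompt": "Hi! Would you like (1) general information or (2) support resources?",
        "transitions": {"1": "info", "2": "resources"},
    },
    "info": {
        "prompt": "Here is some general information. Type 'back' to return.",
        "transitions": {"back": "start"},
    },
    "resources": {
        "prompt": "Here is a list of support resources. Type 'back' to return.",
        "transitions": {"back": "start"},
    },
}

def run() -> None:
    state = "start"
    while True:
        print(STATES[state]["prompt"])
        reply = input("> ").strip().lower()
        # Unrecognized input just re-prompts the same state: the bot cannot
        # improvise, which is exactly what distinguishes it from an LLM.
        state = STATES[state]["transitions"].get(reply, state)

if __name__ == "__main__":
    run()
```

The point of the sketch is that a bot like this can only give harmful responses if humans authored harmful content into its states, which is why the distinction matters for the commenter's two scenarios.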