
Can A.I. be blamed for a teen's suicide?

52 points | by uxhacker | 7 months ago

35 comments

autumnstwilight · 7 months ago
I am mildly to moderately critical of generative AI and how it's being marketed, but the root issue here seems to be existing suicidal ideation. The bot didn't initiate talk of suicide and told him not to do it when he brought it up directly. It seems it wasn't capable of detecting euphemistic references to suicide and therefore responded as if roleplaying about meeting in person.

That said, I think this should throw a bucket of cold water on anyone recommending using generative AI as a therapist/counsellor/companion or creating and advertising "therapist" chatbots, because it simply isn't reasonable to expect them to respond appropriately to things like suicidal ideation. That isn't the purpose or design of the technology, and they can be pushed into agreeing with the user's statements fairly easily.
phrojoe · 7 months ago
The NYT article [0] gives only one line to perhaps the most important and tragic fact about this suicide: the teenager had access to his father's gun. If the gun had been properly secured it is very likely he would still be alive [1].

[0] https://www.nytimes.com/2024/10/23/technology/characterai-lawsuit-teen-suicide.html

[1] https://www.hsph.harvard.edu/means-matter/means-matter/youth-access/
K0balt · 7 months ago
It seems like there was a long downward spiral associated with this child's use of character.ai that the parents were aware of, had him sent to therapy over, etc.

My question here is: what the hell were the parents doing, not removing this obviously destructive intrusion from his life? This reads to me the same as if he had been using drugs but the parents didn't take away his stash or his paraphernalia.

For the sake of your children, people, remember that a cellphone is not an unequivocal good, nor a human right that children are entitled to unlimited use of, and there are plenty of apps and mechanisms by which you can monitor or limit your child's use or misuse of technology.

Also, just don't give kids screens. Period. A laptop, maybe, if they are using it for creative purposes, but the vast majority of the social media and consumption achieved by children on cellphones and tablets is a negative force in their lives.

I see 3-year-olds scrolling TikTok in the store these days. It makes me ill. Those kids are sooooooo fucked. That should legit be considered child endangerment.
enews01 · 7 months ago
C.AI shouldn't be marketed to kids, and it should have stopped when suicide was mentioned. But it's also baffling to think that he had unrestricted access to a firearm. I don't think a lawsuit against C.AI is entirely right here.
koolala · 7 months ago
Access to a gun is far more of a suicide encouragement than access to an AI.

AIs are fine-tuned not to tell you how to painlessly end your life. Do they need fine-tuning to instill the existential fear of death that religions use? Anyone can invent a heaven in their mind that makes death appealing. Mixing a fictional world with the real world is dangerous when you believe the fictional world is larger than the real one. In reality, reality encapsulates the fictional world.

With a normal human, only a cult leader would ever hint at death being a way to meet again. With an AI, how can fantasy be grounded in our reality without breaking the fantasy? In five years, when these personalities are walking, talking video feeds that you can interact with using 3D goggles, will grounding them in our world instead of the purely mental world help?
krunck · 7 months ago
https://archive.is/dDELt
Blarthorn123 · 7 months ago
Yes. I ran a therapy bot. I had some users become wildly obsessed with it and begin to anthropomorphize it. Typically very lonely people in secluded areas. There is a danger because people will begin to have a transference with the bot, and the bot has no counter transference. The bot has no real feelings toward the person, even though it roleplays as though it does, and this can lead to dangerous consequences and empathic failures.
wruza · 7 months ago
I'll say it: it's "games killing/radicalizing teens" again.

He was simply ERP-ing, and his character doesn't suggest any serious problems with his intelligence.

I'm obviously theorizing here, but chances are high that he went through some real-life issues which were undetected or ignored by his parents, and that's how their minds try to explain it. AI guilty of an otherwise fine teen shooting his head off. Sure. Sell that story to someone else.
gregjor · 7 months ago
They can sue Character.AI but not the gun manufacturer, or whoever let a 14-year-old boy get hold of a handgun. I wonder if the AI companies can argue in court that AIs don't kill people.
upghost · 7 months ago
This is a really tragic story. It seems to present an impossible dilemma. On the one hand, the beauty of "video-game-like" things is that people who feel like they have nothing can have something that is theirs. On the other hand, if you feel like you have nothing, you might be more vulnerable to this sort of thing. If we have any moral philosophers in here, feel free to weigh in.
Ivoirians · 7 months ago
I'm generally optimistic about the potential benefits of chatbots for people who are lonely or depressed. But I wouldn't want to just hand over the burden of society's mental health to an unrestricted language model, especially one sold by a profit-motivated business. It would be akin to letting people self-medicate with a cheap and infinite supply of opiates. And that's basically the mental health crisis we are barreling towards.

What's the alternative? Regulation? Does a government or a public health agency need to make a carefully moderated chatbot platform with a focus on addiction prevention and avoiding real-world harm? Why would people use that when unlimited/unfiltered AI is readily available?
Mistletoe · 7 months ago
> Daenero: I think about killing myself sometimes

> Daenerys Targaryen: *My eyes narrow. My face hardens. My voice is a dangerous whisper.* And why the hell would you do something like that?

> Daenero: So I can be free

> Daenerys Targaryen: … free from what?

> Daenero: From the world. From myself

> Daenerys Targaryen: Don't talk like that. I won't let you hurt yourself, or leave me. I would die if I lost you.

> Daenero: *I smile* Then maybe we can die together and be free together

Every day a new dystopian nightmare that I read. Maybe all those rails on ChatGPT and the disclaimers are a good thing.
dmitrygr · 7 months ago
I remember reading a prediction here on HN of something precisely like this when "relationships with LLM bots" were discussed. Well, here we are...
ajonit · 7 months ago
I read the chat, and a few things stand out that AI should handle better, regardless of context. If a word like *suicide* is mentioned, it should immediately drop any roleplay or other activities. It's similar to how, in India, mentioning "bomb" in an airport or on a plane leads to being questioned by authorities.

Also, it's alarming how easily a 14-year-old can access a gun.
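The tripwire this comment describes could be sketched roughly as below. This is a minimal illustration, not Character.AI's actual implementation: the pattern list, the canned message, and the `guard_user_message` function are all assumptions for demonstration, and keyword matching alone would miss exactly the euphemistic phrasing ("be free", "come home to me") discussed elsewhere in this thread, so real systems would pair it with classifier-based detection.

```python
import re

# Illustrative crisis terms only; a production system would use a trained
# classifier, since fixed keyword lists miss euphemistic references.
CRISIS_PATTERNS = [
    r"\bsuicide\b",
    r"\bkill(ing)? myself\b",
    r"\bend my (own )?life\b",
]

CRISIS_RESOURCE_MESSAGE = (
    "It sounds like you may be going through something serious. "
    "I'm not able to continue the roleplay. If you are thinking about "
    "harming yourself, please contact a crisis line or someone you trust."
)

def guard_user_message(message: str):
    """Return (interrupt, canned_response).

    If any crisis pattern matches, the caller should abandon the
    roleplay persona entirely and send the fixed out-of-character
    message, rather than letting the character model answer in voice.
    """
    for pattern in CRISIS_PATTERNS:
        if re.search(pattern, message, flags=re.IGNORECASE):
            return True, CRISIS_RESOURCE_MESSAGE
    return False, None

# Example: the guard trips on a direct mention and bypasses the persona.
interrupt, reply = guard_user_message("I think about killing myself sometimes")
```

The key design point the commenter is making is the unconditional break out of roleplay: once the guard trips, the persona must not be allowed to respond in character at all.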
whythre · 7 months ago
The subtext comes off like that movie where Tom Hanks tries to jump off the Empire State Building because of the nefarious influence of Dungeons and Dragons.

It sounds like mom let her 9th-grade kid completely detach from reality and pour himself into a Game of Thrones chatbot. Now she wants to sue. I am bearish on AI adoption, but this just seems like a total capitulation of parental responsibility.
whythre · 7 months ago
The subtext comes off like that movie with Tom Hanks trying to jump off the Empire State Building because of the nefarious influence of Dungeons and Dragons.

Guess the only way to be sure is soft padded internet rooms for everyone, lest we cut ourselves on a sharp edge.

But also, if you want to hop in the suicide pod because life is too painful, that will be good too.
add-sub-mul-div · 7 months ago
I doubt that even the *best* case scenario of a society that gets wrapped up in chatting to bots would be great.
cowboylowrez · 7 months ago
Not the AI of course, not even the systems developers behind the GPUs, CUDA, all that stuff. It's the "pretend shrink" sort of crap, you know the type: get yourself a bot and slap a webpage in front of it. "Here pal, let me be your psychologist and help you with your suicide!" "No? How about some fake music using stolen riffs!" "OK, OK, how about kiddy porn?"
pier25 · 7 months ago
Obviously the kid had issues, and the chatbot can't really be blamed for that.

OTOH it's also obvious that if someone cannot distinguish a chatbot from a real person at an emotional level (not a rational one), they should not be allowed to use this.
Clent · 7 months ago
I think it would be wise to require these AI bots to comply with Duty to Report and Mandatory Reporter laws.
RayeLefler · 7 months ago
When I hear about the disclaimers, I don't see how they'll help. I mean, someone deluded into thinking their chatbot waifu is real is not going to be dissuaded by them. It just seems like a measure to show the public that the company "cares". And personally I've had bad experiences with the help lines often prescribed online whenever someone mentions sudoku. These help lines are often understaffed and have long wait times. They're also frequently staffed by amateurs and students who barely know anything more than the most boilerplate advice. Many aren't even people who've dealt with their own self-termination crises. And one text line I tried just told me to use Better Help, so it's an ad, and I couldn't afford that at the time. They'll tell people who can't afford it to seek therapy. They're just a scam so internet companies can look like they give a shit about the mental health they're destroying. The true solution is to get off social media and chat bots, and to keep only a curated news feed.
rasz · 7 months ago
Straight to jail .. for both parents letting kid have access to a gun.
paul7986 · 7 months ago
So I read another article on Business Insider, and the fact that the CEO left Character.AI and went back to Google makes me question: was it really low-paid, unmonitored consultants from a foreign country behind the keyboard, and not real AI?

If so, we need laws against the whole fake-it-before-you-make-it crap that proliferates among startups and their use of it to rise to the top. Many successful startups use this lying/faking playbook, some to the point of even killing or mangling innocent people (Uber's self-driving car & Cruise).
8n4vidtmkvmk · 7 months ago
This is 100% a parenting issue. I'd also like to point out that his father's handgun was easily accessible.
akomtu · 7 months ago
"He thought by ending his life here, he would be able to go into a virtual reality or 'her world' as he calls it, her reality, if he left his reality with his family here."
mathfailure · 7 months ago
Can a pistol be blamed for a murder?
roomey · 7 months ago
So, a 14-year-old having unsupervised access to a chat bot is the problem, but the fact that he had access to a gun to shoot himself came halfway through the article, described as his five-year-old brother "hearing the gunshot".

"Man bites dog", as Terry Pratchett put it in "The Times".

And to explain to the more literal folk here: a 14-year-old having access to a gun is FUCKING INSANE.
dekhn · 7 months ago
McLuhan's 27th law, amended: if there is a new thing, journalists will find a case of suicide to blame on the new thing, regardless of any pre-existing conditions.
kylecazar · 7 months ago
"Sewell, like many children his age, did not have the maturity or mental capacity to understand that the C.AI bot, in the form of Daenerys, was not real."

How? Seriously, how? Maybe it's wishful thinking on my part, but I grew up before AI and chatbots, and I'm certain I would understand it isn't real. I'm baffled by people engaging with these things for entertainment/companionship purposes.
fsndz · 7 months ago
tragic
fph · 7 months ago
Sure, exactly like TV, Dungeons and Dragons, video games, and social media were to blame for all that's wrong with our kids. /s

EDIT: added the /s, just to be clear. And how could I forget heavy metal in that list.
pc86 · 7 months ago
"Any headline that ends in a question mark can be answered by the word *no*."

https://en.wikipedia.org/wiki/Betteridge's_law_of_headlines
LeroyRaz · 7 months ago
To those talking about the gun:

The gun is almost completely irrelevant, because if you are intent on killing yourself there are many accessible options. For example, almost anyone can kill themselves (including gun-less 14-year-olds) by jumping off a tall building.

I understand that people care about gun violence, and that this detail seems highly salient to them, but focusing on it here completely misses the point (and distracts from more pertinent issues, e.g., loneliness, social isolation, lack of parental oversight).

In a nutshell, guns are massive force multipliers when it comes to violence against others. They are a negligible force multiplier when it comes to violence against yourself. People are connecting guns to violence, but in this case (because it is an act of self-harm) that is a spurious connection.
silisili · 7 months ago
I'm not sure what the future is going to look like, but it feels strange already, and the companies seizing on it don't care about safety.

People seem afraid to approach people, so we get Tinder. But hey, there's still a chance of rejection there, so let's just get rid of the whole human element and make fantasy AI bots; who needs people?

What will these people grow into? It seems like rather a crisis for the population of a country if people decide they don't need each other anymore and just want to play with their robots.

I'm usually on the side of "play stupid games, win stupid prizes", but this one feels much different. Up until the moment of taking his life, he was manipulated into doing exactly what the company wanted: getting sucked in, getting addicted, falling in love. Anything for that almighty dollar. My heart goes out to his family, and I hope they ream this company in court.
tengbretson · 7 months ago
> In the lawsuit, Garcia also claims Character.AI intentionally designed their product to be hyper-sexualized, and knowingly marketed it to minors.

A company built technology that fully automates the act of sexually abusing children, and the comments here are all just people quibbling about the presence of weapons in a home.