
Lawyer cited 6 fake cases made up by ChatGPT; judge calls it “unprecedented”

237 points, by umilegenio, almost 2 years ago

31 comments

latexr, almost 2 years ago
Prior discussion on the same matter from a different link: https://news.ycombinator.com/item?id=36095352
hluska, almost 2 years ago
For reference, here is the judge's order. Endnote #2 alone is worth the price of admission:

https://s3.documentcloud.org/documents/23826753/judgeaskingtheotherlawyerwhyhesubmittedafilingwithfakecases.pdf
mabbo, almost 2 years ago
I think the judge should not take into consideration anything about where the lawyer said they got the case information from.

When you go to court and cite previous cases, you are responsible for ensuring they are real cases. If you can't do that, what exactly is your job as a lawyer?
Ankaios, almost 2 years ago
It turns out there were precedents:

Case: Thompson v. Horizon Insurance Company. Filing: Plaintiff's Motion for Class Certification. Citation: The plaintiff's attorney cites the influential case of Johnson v. Horizon Insurance Company, 836 F.2d 123 (9th Cir. 1994), which established the standards for certifying class actions in insurance disputes. However, it has recently come to light that Johnson v. Horizon Insurance Company is a fabricated case that does not exist in legal records.

Case: Rodriguez v. Metro City Hospital. Filing: Defendant's Motion to Exclude Expert Testimony. Citation: The defense counsel references the landmark case of Sanchez v. Metro City Hospital, 521 U.S. 987 (2001), which set the criteria for admitting expert witness testimony in medical malpractice cases. However, it has now been discovered that Sanchez v. Metro City Hospital is a fictitious case and does not form part of legal precedent.

Case: Barnes v. National Pharmaceuticals Inc. Filing: Plaintiff's Response to Defendant's Motion for Summary Judgment. Citation: The plaintiff's lawyer cites the well-known case of Anderson v. National Pharmaceuticals Inc., 550 F.3d 789 (2d Cir. 2010), which recognized the duty of pharmaceutical companies to provide adequate warnings for potential side effects. However, further investigation has revealed that Anderson v. National Pharmaceuticals Inc. is a fabricated case and does not exist in legal jurisprudence.
lynx23, almost 2 years ago
Wait, this is handled as "ChatGPT made up these cases" and not as "Lawyers deliberately used ChatGPT to fabricate stuff"? Does anyone really believe a lawyer is that stupid? I know, assume good intentions and all, but in this case, really?
onionisafruit, almost 2 years ago
"Unprecedented"? ChatGPT says there is precedent and gave me several citations.
rayiner, almost 2 years ago
Check out one of the fake opinions: https://storage.courtlistener.com/recap/gov.uscourts.nysd.575368/gov.uscourts.nysd.575368.29.1.pdf. It even makes up a panel comprising real federal appellate judges (although one is from the Fifth Circuit while the fake case is from the Eleventh Circuit). I can see how someone unfamiliar with what GPT can do could get fooled.
tromp, almost 2 years ago
> The other five bogus cases were called Shaboon v. Egyptair, Petersen v. Iran Air, Martinez v. Delta Airlines, Estate of Durden v. KLM Royal Dutch Airlines, and Miller v. United Airlines.

Tyler Durden, no doubt...
ftxbro, almost 2 years ago
> "Lawyer: ChatGPT said the cases were real"
ss108, almost 2 years ago
It would have taken about a minute to put each of them into a tool like Casetext, Lexis, WL, or Bloomberg Law to determine they didn't exist.
cj, almost 2 years ago
I asked ChatGPT to tell me a riddle.

It was "What is always hungry, needs to be fed, and makes your hands red?" (Or something like that.)

I asked for a hint about five times, and it kept giving more legitimate-sounding hints.

Finally I gave up and asked for the answer to the riddle, and it spit out a random fruit that made no sense as the answer.

I then repeated the riddle back and asked ChatGPT what the answer was, and it gave me the answer ("Fire"), which does make sense.

But it had been giving extremely bad hints, like "it starts with the letter P" and "it's a fruit".

That was a great way to show my non-tech family members the limitations of AI and why they shouldn't trust it.

Playing "20 questions" with ChatGPT is another great way to expose its limitations. It knows the game and tries to play, but it is terrible at asking questions that narrow down the possible answers.

There really needs to be some confidence or accuracy score/estimate displayed alongside its output.

Or it could learn how to say "I don't know".
rvba, almost 2 years ago
> The lawyer's affidavit said he had "never utilized ChatGPT as a source for conducting legal research prior to this occurrence (...)"

I wonder if the court tried to verify that.
mensetmanusman, almost 2 years ago
"He was a bad lawyer. I am a good Bing. I will send him to jail."
boringg, almost 2 years ago
Disbarred?
macmac, almost 2 years ago
That is a hilarious pun.
morkalork, almost 2 years ago
Imagine being that clown's client.
thih9, almost 2 years ago
> Schwartz didn't previously consider the possibility that an artificial intelligence tool like ChatGPT could provide false information

Don't you have to click through a number of popups about exactly that before accessing ChatGPT?
uguuo_o, almost 2 years ago
I thought I had just seen a post about this yesterday [0], so my immediate thought was: how can there be two such lawyers? A competition over who would get caught first? It turns out it refers to the same case.

[0] https://news.ycombinator.com/item?id=36116652
CatWChainsaw, almost 2 years ago
I don't see this as a particularly wild case of LLMs gone wrong. When "you have to take the bad with the good" includes a constant undermining of reality and an ever more desperate need for fact-checkers, I wonder whether it will have been worth it in the end. Likely not.
xbar, almost 2 years ago
I asked ChatGPT to disbar an attorney.
mrangle, almost 2 years ago
This points to an inevitable shrinking of the time it takes for fake history and data to be declared valid and widely defended by core institutions.

Eventually the onslaught of subtle yet elaborate falsehoods will overwhelm the institutional filters.
not2b, almost 2 years ago
At minimum, the sanctions should include a fine sufficient to cover the costs of everyone who had to deal with this: the court and the opposing attorneys. Maybe punitive damages too. But let the guy keep his law license (unless there's a second offense).
Zetice, almost 2 years ago
So how *do* LLMs fit into the legal profession, if at all?

Do legal tools that use LLMs just need to come with a big disclaimer at the top saying, "This tool does not represent a legal opinion; please verify the output independently"?
more_corn, almost 2 years ago
Every citation created by ChatGPT will be hallucinated. It knows what citations look like; it doesn't know what they are. It doesn't actually "know" anything. It is a statistical engine for generating the next reasonable-looking word.
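The "statistical engine" point above can be illustrated at toy scale. The sketch below trains a bigram model on a few citation strings (all invented here for illustration) and then samples "the next reasonable-looking word": the output looks like a citation but need not match anything it was trained on, which is the same failure mode, in miniature, as an LLM fabricating case law.

```python
import random

# Hypothetical training data: a handful of citation-shaped strings.
# None of these are real cases; they exist only to show the mechanism.
training = [
    "Smith v. United Airlines , 530 F.3d 101 ( 2d Cir. 2008 )",
    "Jones v. Delta Airlines , 412 F.2d 907 ( 9th Cir. 1999 )",
    "Brown v. KLM Royal Dutch Airlines , 625 F.3d 44 ( 11th Cir. 2012 )",
]

# Count which token follows which (a bigram table).
bigrams = {}
for line in training:
    tokens = ["<s>"] + line.split() + ["</s>"]
    for a, b in zip(tokens, tokens[1:]):
        bigrams.setdefault(a, []).append(b)

def generate(seed=0):
    """Repeatedly sample a statistically plausible next token until done."""
    rng = random.Random(seed)
    token, out = "<s>", []
    while True:
        token = rng.choice(bigrams[token])
        if token == "</s>":
            return " ".join(out)
        out.append(token)

# The result is citation-shaped, but the model has no notion of whether
# the case it just "cited" exists.
print(generate(1))
```

The model can splice parties, reporters, and years from different training lines into one fluent, fabricated citation; scale the same idea up and you get plausible panels of real judges attached to opinions that were never written.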
zugi, almost 2 years ago
Am I the only one here entertained by the judge's clever wording?

"Hey, all the case law you cited in your filing is made up. That's *unprecedented*!"
Brendinooo, almost 2 years ago
Yeah, AI-generated fake cases would definitely be unprecedented, unless SmarterChild did a stint as a paralegal in the early aughts...
kristianbrigman, almost 2 years ago
How do these usually get checked? I mean, maybe he's already done this successfully a few times.
seydor, almost 2 years ago
This is funny. Still, GPTs are great at interpolating between stories, and probably between judgments too. Maybe they will prove useful for replacing judges.
0xbadcafebee, almost 2 years ago
Were they any worse than the normal junk cases that get filed every day in this country?
bitwize, almost 2 years ago
So turds turn up in AI law as well...
thefourthchime, almost 2 years ago
Sigh. I'm getting very sick of hearing about how "ChatGPT" makes stuff up. Yes, 3.5 made a lot of stuff up; 4.0 still does, but much more rarely.

I wish people would mention this; it's all treated as the same thing. It's like talking about how unreliable "airplanes" are while referring to prop planes, even though jets are out.