Ask HN: Why are LLM UIs so slow?

1 point | by rishikeshs | about 1 month ago
I love Claude Sonnet, but the user interface is super slow. After chatting for some time, it becomes too slow and it's even hard to scroll.

To counter this, I tried OpenRouter's chat interface, but that is painfully slow. I'm now trying Gemini 2.5 in Google AI Studio, and it is also slow.

What is the underlying reason for this? I understand the backend takes a lot of computation, but the frontend?
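The thread never pins down a cause, but one commonly cited frontend explanation (an assumption here, not something the poster or the products confirm) is that long chat transcripts get fully re-rendered, and markdown re-parsed, on every streamed token. The React/TypeScript sketch below shows the usual mitigation: memoizing finished messages so only the still-streaming row updates. The Message type, Chat component, and streamingText prop are hypothetical names for illustration.

import { memo } from "react";

type Message = { id: string; role: "user" | "assistant"; text: string };

// Finished messages are memoized: streaming updates elsewhere in the tree
// no longer re-render (and re-parse) the entire history on every token.
const MessageRow = memo(function MessageRow({ message }: { message: Message }) {
  return (
    <div>
      <strong>{message.role}:</strong> {message.text}
    </div>
  );
});

// Only the last, still-streaming message changes on each token; React skips
// every memoized row whose message prop is referentially unchanged.
export function Chat({ history, streamingText }: { history: Message[]; streamingText: string }) {
  return (
    <div>
      {history.map((m) => (
        <MessageRow key={m.id} message={m} />
      ))}
      {streamingText !== "" && (
        <div>
          <strong>assistant:</strong> {streamingText}
        </div>
      )}
    </div>
  );
}

Without memoization like this (or list virtualization for very long threads), each streamed token forces the whole transcript to re-render, which would match the "hard to scroll after chatting for a while" symptom, though the thread itself does not establish that this is what those UIs actually do.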

1 comment

lukejkwarren, about 1 month ago
It's the big bummer with reasoning models, although they are improving a lot. I experimented with various reasoning models for my AI security scanner product but found the performance to just be far too slow.