TechEcho

A tech news platform built with Next.js, providing global tech news and discussions.


Ask HN: Is there a "Slow but thoughtful" LLM?

4 points | by kva | 12 months ago
Looking for something where I can give it a topic or area of interest, and it does an extremely comprehensive deep dive.

The ideal version would be that I can give it a topic and a level of understanding (ELI5, ELI10, ELI25) and it would summarize and organize all human knowledge on that topic and provide it to me. I would pay $X00 for this and would wait even 24-48 hours...

2 comments

kingkongjaffa | 12 months ago
> The ideal version would be that I can give it a topic and a level of understanding (ELI5, ELI10, ELI25) and it would summarize and organize all human knowledge on that topic and provide it to me. I would pay $X00 for this and would wait even 24-48 hours...

That exists; it's called a textbook. https://www.lesswrong.com/posts/xg3hXCYQPJkwHyik2/the-best-textbooks-on-every-subject

“For years, my self-education was stupid and wasteful. I learned by consuming blog posts, Wikipedia articles, classic texts, podcast episodes, popular books, video lectures, peer-reviewed papers, Teaching Company courses, and Cliff's Notes. How inefficient!

I've since discovered that textbooks are usually the quickest and best way to learn new material. That's what they are designed to be, after all.”
talldayo | 12 months ago
No, not really. The biggest AI models can currently run at near-realtime, so you're not missing out on much with the immediacy of it.

The "premium" version of AI is doing your own research. AI will be wrong whether you run it fast or slow, so you've got to take that into account from square one.