
Ask HN: Does this estimate for GPT-4's 128k token window make sense?

1 point by sschmitt 23 days ago
GPT-4 supports a 128,000-token context window (per OpenAI).

I'm trying to understand how much text that really is. Here's my rough estimate:

1 token ≈ ¾ word ≈ 4 characters

A single-spaced A4 page (12pt font) ≈ 500–600 words → ~700–800 tokens

128,000 ÷ 800 ≈ 160 pages

So: GPT-4 can handle ~150–190 A4 pages of standard English prose, depending on formatting.

Does that sound accurate?
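
A quick sanity check of the same arithmetic in Python; the words-per-token and words-per-page ratios are the post's own assumptions, not measured values:

    # Back-of-the-envelope check of the page estimate above.
    # Assumed ratios (from the post, not measured):
    #   ~0.75 words per token, 500-600 words per single-spaced A4 page (12pt).
    context_tokens = 128_000
    words_per_token = 0.75
    words_per_page = (500, 600)

    total_words = context_tokens * words_per_token       # 96,000 words
    pages_max = total_words / words_per_page[0]          # ~192 pages (sparser pages)
    pages_min = total_words / words_per_page[1]          # 160 pages (denser pages)

    print(f"~{total_words:,.0f} words -> roughly {pages_min:.0f}-{pages_max:.0f} A4 pages")
    # ~96,000 words -> roughly 160-192 A4 pages

Under those assumptions the range comes out closer to ~160–190 pages than 150–190, but the order of magnitude matches the estimate in the post.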

no comments