TechEcho

InternLM – SOTA OS 7B and 20B model with 200K context length

3 points | by osanseviero | over 1 year ago

1 comment

brucethemoose2 | over 1 year ago
20B 200K sounds perfect... But I have zero trust in the huggingface (or opencompass) benchmarks. They are all but meaningless because they can be, and frequently are, cheated.

And they present basically no information other than the standard metrics.

Will just have to try it myself, I guess. Yi 200K was quite a pleasant surprise already.