科技回声


Ask HN: Is there ever a good reason to create a 1GB+ JSON file?

1 point, by dynamite-ready, over 2 years ago
From time to time, I see attempts to serialize huge JSON objects to files, and in every instance it seems like a code smell. Today I encountered a file that made the 1GB figure look like a joke (it was much bigger than 1GB). Has anyone else found themselves working with files like this? Especially JSON, but I'd be interested to hear stories involving other serialization formats (media files excluded).

2 comments

salawat, over 2 years ago
Yep. 3 GB JSON handler reporting in.

It all depends on what assumptions you can make about update frequency and the continuity of the link.

If you can't assume 100% availability of the link, but you *need* to maintain operability between link uptime windows, it's worth paying the price for a sizable data file download and performing runtime deserialization of it in the absence of an uplink.

Remember: the network being available should never, ever be taken for granted.
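The pattern salawat describes — prefer fresh data, but fall back to deserializing a large on-disk snapshot when the link is down — can be sketched roughly as below. This is a minimal illustration, not anyone's actual implementation; `fetch_latest` and the cache path are hypothetical stand-ins.

```python
import json
import os
import tempfile

# Hypothetical location of the cached snapshot.
CACHE_PATH = os.path.join(tempfile.gettempdir(), "dataset_cache.json")

def fetch_latest():
    """Stand-in for a network fetch; raises when the uplink is down."""
    raise ConnectionError("uplink unavailable")

def load_dataset():
    """Prefer fresh data, but fall back to the last cached snapshot."""
    try:
        data = fetch_latest()
        with open(CACHE_PATH, "w") as f:
            json.dump(data, f)  # refresh the on-disk snapshot
        return data
    except (ConnectionError, OSError):
        # Link is down: deserialize the (possibly multi-GB) cached file.
        with open(CACHE_PATH) as f:
            return json.load(f)

# Seed a snapshot so the fallback path has something to read.
with open(CACHE_PATH, "w") as f:
    json.dump({"rows": [1, 2, 3]}, f)

result = load_dataset()  # the fetch fails, so this reads the cache
```

The trade-off is exactly the one in the comment: you accept a large serialized file (and the cost of deserializing it at runtime) in exchange for continuing to operate between uptime windows.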
mac3n, over 2 years ago
It makes sense if it's a file of JSON records, one per line. Less so if it's a single blob.

Don't forget that JSON isn't only a web interchange format.
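The one-record-per-line layout mac3n mentions (often called JSON Lines) is what makes huge files tractable: you can stream the file and decode one record at a time instead of parsing a single multi-gigabyte blob. A minimal sketch, with hypothetical record contents:

```python
import json
import os
import tempfile

# Write a few records in JSON Lines format: one JSON object per line.
records = [{"id": i, "value": i * i} for i in range(3)]

path = os.path.join(tempfile.mkdtemp(), "records.jsonl")
with open(path, "w") as f:
    for rec in records:
        f.write(json.dumps(rec) + "\n")

# Stream the file back: iterating over the file object reads lazily,
# so memory use stays bounded by one record, not the whole file.
total = 0
with open(path) as f:
    for line in f:
        total += json.loads(line)["value"]

print(total)  # 0 + 1 + 4 = 5
```

With a single-blob JSON file, by contrast, `json.load` must materialize the entire structure in memory before you can touch any of it, which is exactly where 1GB+ files hurt.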