JSONH: JSON Homogeneous Collections Compressor

31 points by d0vs over 13 years ago

7 comments

resnamen over 13 years ago
If going with this means adopting a specialized library with its own format, sticking with a fixed schema, and giving up human readable formatting, why not go whole hog and use protocol buffers instead? It'd be cheaper to convert and store.
Comment #3089279 not loaded
Comment #3089276 not loaded
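The trade-off resnamen describes comes from what JSONH does: it factors the repeated keys of a homogeneous collection out of every object, leaving a flat array. A rough Python sketch of that idea (the actual library's wire format may differ; the function names here are made up):

```python
def jsonh_pack(records):
    # Pack a homogeneous list of flat dicts: emit the key count,
    # the keys once, then every record's values in key order.
    keys = list(records[0])
    packed = [len(keys), *keys]
    for rec in records:
        packed.extend(rec[k] for k in keys)
    return packed


def jsonh_unpack(packed):
    # Reverse the packing: read the key count, slice out the keys,
    # then rebuild one dict per group of n values.
    n = packed[0]
    keys = packed[1:1 + n]
    values = packed[1 + n:]
    return [dict(zip(keys, values[i:i + n]))
            for i in range(0, len(values), n)]


rows = [{"id": 1, "name": "a"}, {"id": 2, "name": "b"}]
packed = jsonh_pack(rows)
# The keys "id" and "name" appear once instead of once per record.
assert jsonh_unpack(packed) == rows
```

The savings grow with the number of records, since each additional record contributes only values, never keys — which is also why the scheme assumes a fixed schema, exactly as the comment points out.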
rkalla over 13 years ago
In a similar vein of "alternative ways to maximally leverage JSON" there is the Universal Binary JSON specification: http://ubjson.org

Unlike BSON or BJSON, UBJSON is 1:1 compatible with the original JSON spec; it doesn't introduce any incompatible data types that have no analog in JSON.

Smile is similar, but utilizes more complex data structures for the purpose of further data compression, which is great, but introduces complexities in generation and parsing, while UBJSON is intended to be a binary JSON representation as simple as JSON.

"As simple as" defined here as being able to open the files in a hex editor and read through them easily, in addition to being able to grok the spec in under 10 minutes (and generate or parse it just as easily as JSON itself).

Because it has 1:1 compatibility with JSON, the general parsing and generation logic stays the same; it is just the format of the bytes written out that changes.

There has been a lot of great community collaboration on this spec from the JSON specification group and, more recently, the CouchDB team, which has improved the performance and usability quite a bit.

There are Java and .NET impls, with a handful of people working on implementations in Node.js, Erlang and C, but I don't have much info on the status of those impls yet.

[1] https://github.com/thebuzzmedia/universal-binary-json-java/blob/master/src/test/java/com/ubjson/io/MediaContentBenchmark.java

[2] https://github.com/eishay/jvm-serializers/wiki
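To get a feel for what "open the files in a hex editor and read through them" means, here is a rough sketch of a UBJSON-style encoder for a small subset of types. The one-letter type markers match the spirit of the spec ('T'/'F' booleans, 'i' small int, 'S' string, '['/']' arrays), but this is a simplified illustration, not a conforming implementation:

```python
import struct


def ub_encode(value):
    # Minimal UBJSON-style encoding: each value is prefixed with a
    # one-byte ASCII type marker, so the stream stays readable in a
    # hex editor. Only bool, small int, str and list are handled here.
    if isinstance(value, bool):           # bool before int: bool is an int subclass
        return b"T" if value else b"F"
    if isinstance(value, int) and -128 <= value <= 127:
        return b"i" + struct.pack(">b", value)   # 'i' = int8
    if isinstance(value, str):
        data = value.encode("utf-8")
        # 'S' marker, then the byte length (itself a typed int), then the bytes.
        return b"S" + ub_encode(len(data)) + data
    if isinstance(value, list):
        return b"[" + b"".join(ub_encode(v) for v in value) + b"]"
    raise TypeError("unsupported type: %r" % type(value))


# "hi" becomes S i \x02 h i -- marker, length, payload, all visible as text.
assert ub_encode("hi") == b"Si\x02hi"
```

Because every marker is a printable ASCII character and strings are stored as raw UTF-8, a dump of the output is largely legible, which is the "as simple as JSON" property the comment describes.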
maratd over 13 years ago
You have two types of situations for JSON: static and dynamic.

For static JSON, you have a flat file and that's that. It's always the same. In this case, all these fancy optimization formats are pointless. Gzip the hell out of it once, cache it, and simply serve the compressed version. Binary compressed data is always optimal. The browser will automatically decompress and utilize it. CPU is plentiful on the client side. Debugging is easy; Firebug will display the clean JSON.

For dynamic JSON, you have a problem. You can't optimize or compress. Remember, it's all about total time. It doesn't matter that your data is now smaller if the time it took to make it smaller is greater than the time it would have taken to download the difference between the original data and the optimized data.
shaunxcode over 13 years ago
Interesting — it doesn't seem to explicitly handle nested arrays/objects: http://jsfiddle.net/uxAFb/

I imagine passing a flag to indicate which properties are arrays of objects that should also be packed could work too.
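The flag idea the comment imagines could be sketched like this: a hypothetical variant of homogeneous packing where properties named in `nested_keys` are themselves homogeneous lists of objects and get packed recursively (this is speculation in the comment's spirit, not part of JSONH itself):

```python
def jsonh_pack_nested(records, nested_keys=()):
    # Pack a homogeneous list of dicts: key count, keys once, then values.
    # Properties listed in `nested_keys` are assumed to hold homogeneous
    # lists of objects and are packed recursively -- the "flag" the
    # comment suggests.
    keys = list(records[0])
    packed = [len(keys), *keys]
    for rec in records:
        for k in keys:
            v = rec[k]
            packed.append(
                jsonh_pack_nested(v, nested_keys) if k in nested_keys else v
            )
    return packed


rows = [{"a": 1, "kids": [{"x": 2}]}]
# The nested "kids" list is packed with the same scheme, in place.
assert jsonh_pack_nested(rows, nested_keys=("kids",)) == [2, "a", "kids", 1, [1, "x", 2]]
```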
Andi over 13 years ago
I think we all know what JSON is good for (unlimited documents without fixed schema) and what GZIP is good for ;)
lordmatty over 13 years ago
Sounds excellent - would love to create an ObjC server/client library to take advantage of this!
gojomo over 13 years ago
Do you know why the performance page requests enabling Java to run?
Comment #3089263 not loaded