When I wrote the same kind of article in Nov 2011 [1], I came to similar conclusions; ujson was blowing everyone away.

However, after swapping a fairly large and JSON-intensive production spider over to ujson, we noticed a large increase in memory use.

When I investigated, I discovered that simplejson reused allocated string objects, so when parsing/loading you basically got string compression for repeated string keys.

The effects were pretty large for our dataset, which was all API results from various popular websites and featured lots of lists of things with repeating keys; on a lot of large documents, the loaded in-memory object was sometimes 100M for ujson and 50M for simplejson. We ended up switching back because of this.

[1] http://jmoiron.net/blog/python-serialization/
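For illustration, here is a minimal sketch (payload and key names are made up) that checks whether a parser hands back the same string object for repeated dict keys; the behaviour varies by library and version:

```python
import json
import simplejson
import ujson

# A tiny payload with repeated dict keys, standing in for API results.
payload = '[{"user_id": 1, "screen_name": "a"}, {"user_id": 2, "screen_name": "b"}]'

for lib in (json, simplejson, ujson):
    docs = lib.loads(payload)
    first, second = sorted(docs[0]), sorted(docs[1])
    shared = all(a is b for a, b in zip(first, second))
    print("%-12s reuses key strings: %s" % (lib.__name__, shared))
```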
The problem with all the (widely known) non-standard JSON packages is that they all have their gotchas.

cjson's way of handling unicode is just plain wrong: it uses UTF-8 bytes as unicode code points. ujson cannot handle large numbers (somewhat larger than 2**63; I've seen a service that encodes unsigned 64-bit hash values in JSON this way, and ujson fails to parse its payloads). With simplejson (when using the speedups module), a string's type depends on its value, i.e. it decodes strings as 'str' if their characters are ASCII-only, but as 'unicode' otherwise; strangely enough, it always decodes strings as unicode (like the standard json module) when speedups are disabled.
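For the large-number gotcha, a rough sketch of what that failure mode looks like (exact behaviour depends on the ujson version; it may raise or silently return a wrong value):

```python
import json

big = 2 ** 64 - 1  # e.g. an unsigned 64-bit hash value
payload = json.dumps({"hash": big})

print(json.loads(payload)["hash"] == big)  # stdlib json round-trips it fine

try:
    import ujson
    print(ujson.loads(payload)["hash"] == big)  # may be False, or raise
except (ValueError, OverflowError) as exc:
    print("ujson could not parse it: %s" % exc)
```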
I disagree with the conclusion. How about this: you should use the tool that most of your coworkers already know, which has large community support and adequate performance. In other words, stop fooling around and use the json library. If (IF!!!) you find performance inadequate, try the other libraries. And most of all, if optimization is your goal: measure, measure and measure! </rant>
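If it helps, a minimal measure-first sketch along those lines (the sample document is a placeholder; swap in whatever you actually serialize):

```python
import timeit

setup = """
import json, simplejson, ujson
doc = {"name": "example", "tags": ["a", "b", "c"], "count": 42}
payload = json.dumps(doc)
"""

for lib in ("json", "simplejson", "ujson"):
    dumps = timeit.timeit("%s.dumps(doc)" % lib, setup=setup, number=100000)
    loads = timeit.timeit("%s.loads(payload)" % lib, setup=setup, number=100000)
    print("%-10s dumps: %.3fs  loads: %.3fs" % (lib, dumps, loads))
```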
I just want to add another library in here which – at least in my world – is replacing json as the number one configuration and serialisation format. It's called libucl and its main consumer is probably the new package tool in FreeBSD: `pkg`.

Its syntax is nginx-like but it can also parse strict JSON. It's pretty fast too.

More info here: https://github.com/vstakhov/libucl
How hard is it to draw a bar graph? I'd imagine it is easier than creating an ASCII table and then turning that into an image, but I've never experimented with the latter.
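For reference, a few lines of matplotlib are enough (the numbers below are placeholders, not the article's results):

```python
import matplotlib.pyplot as plt

libs = ["json", "simplejson", "ujson"]
dump_times = [1.0, 0.8, 0.4]  # hypothetical seconds, lower is better
load_times = [1.2, 0.9, 0.5]  # hypothetical seconds, lower is better

x = range(len(libs))
plt.bar([i - 0.2 for i in x], dump_times, width=0.4, label="dumps")
plt.bar([i + 0.2 for i in x], load_times, width=0.4, label="loads")
plt.xticks(list(x), libs)
plt.ylabel("seconds")
plt.legend()
plt.savefig("json_benchmark.png")
```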
> ultrajson ... will not work for un-serializable collections

So I can't serialize things with ultrajson that aren't serializable? I must be missing something in this statement.

> The verdict is pretty clear. Use simplejson instead of stock json in any case...

The verdict seems clear (based solely on the data in the post) that ultrajson is the winner.
> keep in mind that ultrajson only works with well defined collections and will not work for un-serializable collections. But if you are dealing with texts, this should not be a problem.

Well-defined collections? As in, serializable? Well sure, that's requisite for the native json package as well as simplejson (as far as I can recall -- haven't used simplejson in some time).

But does "texts" refer to strings? As in, only one data type? The source code certainly supports other types, so I wonder what this statement refers to.
I disagree with the verdict at the end of the article; it seems like json would be better if you were doing a lot of dumping? And it also has the added maintenance guarantee of being an official package.
> We have a dictionary with 3 keys

What about larger dictionaries? With such a small one I would be worried that a significant proportion of the time would be simple overhead.

[Warning: Anecdote] When we were testing out the various JSON libraries we found simplejson much faster than json for dumps. We used *large* dictionaries.

Was the simplejson package using its optimized C library?
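A rough sketch of that kind of comparison, with a made-up large dictionary, to see how much the payload size changes the picture:

```python
import timeit
import json, simplejson, ujson

small_doc = {"a": 1, "b": "two", "c": [3, 4, 5]}
large_doc = {"key_%d" % i: {"id": i, "name": "item %d" % i, "tags": ["x", "y"]}
             for i in range(10000)}

for lib in (json, simplejson, ujson):
    t_small = timeit.timeit(lambda: lib.dumps(small_doc), number=10000)
    t_large = timeit.timeit(lambda: lib.dumps(large_doc), number=100)
    print("%-12s small: %.3fs  large: %.3fs" % (lib.__name__, t_small, t_large))
```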
But ujson comes at the price of slightly reduced functionality. For example, you cannot set indent. (And I typically set indent for files <100MB; when working with third-party data, manual inspection is often necessary.)

(BTW: I got tempted to try ujson exactly because of the original blog post, i.e. http://blog.dataweave.in/post/87589606893/json-vs-simplejson-vs-ultrajson)

Plus, AFAIK, at least in Python 3 json IS simplejson (but a few versions older). So every comparison of these libraries is going to give different results over time (likely, with the difference getting smaller). Of course, simplejson is the newer version of the same thing, so it's likely to be better.
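A small sketch of the indent gotcha (whether ujson accepts the keyword depends on the version you have installed):

```python
import json

doc = {"results": [{"id": 1, "name": "alpha"}, {"id": 2, "name": "beta"}]}

# Pretty-printed output for manual inspection of third-party data.
print(json.dumps(doc, indent=2, sort_keys=True))

try:
    import ujson
    print(ujson.dumps(doc, indent=2))  # may work or raise, depending on version
except TypeError as exc:
    print("ujson rejected indent: %s" % exc)
```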
(My own due diligence when working with serialisation: http://stackoverflow.com/questions/9884080/fastest-packing-of-data-in-python-and-java

I leave this here in case it helps others.

We had other priorities, such as being good for both Python and Java.

At the time we went with msgpack. As msgpack is doing much the same work as json, it just shows that the magic is in the code, not the format.)
I'll have to try ultrajson for my use case, but when I benchmarked pickle, simplejson and msgpack, msgpack came out the fastest. I also tried combining all three formats with gzip, but that did not help. Primarily I care about speed when deserializing from disk.
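For what it's worth, a rough sketch of that kind of load-from-disk comparison (file names and test data are made up; only one gzip variant is shown):

```python
import gzip
import pickle
import timeit
import simplejson
import msgpack

doc = [{"id": i, "name": "item %d" % i, "scores": [1.5, 2.5, 3.5]}
       for i in range(50000)]

# Write the same data in three formats, plus a gzipped JSON variant.
with open("doc.pickle", "wb") as f:
    pickle.dump(doc, f, protocol=pickle.HIGHEST_PROTOCOL)
with open("doc.json", "w") as f:
    simplejson.dump(doc, f)
with open("doc.msgpack", "wb") as f:
    f.write(msgpack.packb(doc))
with gzip.open("doc.json.gz", "wb") as f:
    f.write(simplejson.dumps(doc).encode("utf-8"))

def load_pickle():
    with open("doc.pickle", "rb") as f:
        return pickle.load(f)

def load_json():
    with open("doc.json") as f:
        return simplejson.load(f)

def load_msgpack():
    with open("doc.msgpack", "rb") as f:
        return msgpack.unpackb(f.read())

def load_json_gz():
    with gzip.open("doc.json.gz", "rb") as f:
        return simplejson.loads(f.read().decode("utf-8"))

for fn in (load_pickle, load_json, load_msgpack, load_json_gz):
    print("%-15s %.3fs" % (fn.__name__, timeit.timeit(fn, number=5)))
```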
I know it goes against the grain, but I wish that binary JSON (UBJSON) had much more widespread usage. There's no reason tools can't convert it back to JSON for us old humans.

The speed difference between working with binary streams and parsing text is night and day.
We took a look at ujson about a year ago and found that it failed to load even JSON structures that went 3 layers deep. I also recall issues handling unicode data.

It was a big disappointment after seeing these kinds of performance improvements.
It kills me that the default JSON module is *so* slow; if you're working with large JSON objects you really have no choice but to use a third-party module, because the default won't cut it.