I remember searching for a JSON library with minimal dependencies a while ago, and came across this:

https://rawgit.com/miloyip/nativejson-benchmark/master/sample/conformance.html

The variance in feature set, design, and performance is huge across all of them. I ultimately landed on libjson, written in C: https://github.com/vincenthz/libjson

It does a lot for you, but it notably does not build a tree and does not try to interpret numbers, which I found perfect for binding to languages with a C FFI that have their own collection and number types. It's also great for partial parsing if you need to do any sort of streaming.

It looks like this one can't currently do partial parsing, but it's a great fit if C++ maps/vectors are your target.
Compile time is largely a "developer problem", but so is the usability of a library. nlohmann/json's main selling point is that its interface is pleasant to use. Whether a developer values usability at typing time over compile time is an interesting thing to ponder for sure.
The code in jart's version is refreshingly clean and easy to read compared to nlohmann's version.

As an aside, I wonder: what are the ThomPike* macros actually doing in jart's implementation?

Also, a speed comparison of this vs the other one would be very welcome: conformance and simplicity are certainly important criteria when picking a JSON parser, but speed is rather crucial.
Really interesting that nlohmann isn't fully compliant. What cases are these?

It seems to me, though, that if you're encountering the edges of JSON where nlohmann or simple parsing doesn't work properly, a binary format might be better. And if you're trying to serialize so much data that speed actually becomes an issue, then again, a binary format might be what you really want.

The killer features of nlohmann are the NLOHMANN_DEFINE_TYPE_INTRUSIVE and NLOHMANN_DEFINE_TYPE_NON_INTRUSIVE macros that handle all of the ??? -> json -> ??? steps for you. Those alone make it my default go-to unless the above reasons force me in another direction.
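For anyone who hasn't used them, a minimal sketch of the non-intrusive variant (the Point struct and its fields are just made up for illustration):

    #include <nlohmann/json.hpp>

    struct Point {
      double x = 0;
      double y = 0;
    };

    // Generates to_json/from_json for Point from the listed members.
    NLOHMANN_DEFINE_TYPE_NON_INTRUSIVE(Point, x, y)

    int main() {
      nlohmann::json j = Point{1.5, 2.5};       // serialize: {"x":1.5,"y":2.5}
      Point p = j.get<Point>();                 // deserialize back into the struct
    }

The intrusive variant is the same idea, but it goes inside the class body so it can reach private members.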
On the other end of the spectrum there is [1]. It's oriented toward both performance and usability, although compile times are probably higher.

nlohmann is the slowest of the popular libraries, AFAIK, and not particularly more usable than RapidJSON, in my experience. So "better than nlohmann" is not very novel.

[1] https://github.com/beached/daw_json_link
The moment nlohmann's library came out, I switched to it and never looked back.

I loved the interface, and it's exactly how I would've designed a JSON library with modern C++.

Just maybe turn off the implicit conversion option; that can get a bit messy ;)
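For reference, a minimal sketch of what turning that off looks like (the parsed document and variable names here are mine):

    // Disable implicit value conversions before including the header.
    #define JSON_USE_IMPLICIT_CONVERSIONS 0
    #include <nlohmann/json.hpp>
    #include <string>

    int main() {
      nlohmann::json j = nlohmann::json::parse(R"({"name":"demo"})");
      // std::string name = j["name"];           // no longer compiles with the define above
      auto name = j["name"].get<std::string>();  // explicit conversion still works
    }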
This is a fine library, but I use nlohmann extensively and haven't noticed any considerable compilation slowdown since adding it to the project.

Overloading from_json to modularize parsing is really useful; I think that should be a part of every templated C++ JSON parser library.

That said, I have seen these ThomPike* macros in cosmopolitan.h before; I wonder what the origin is.
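For anyone unfamiliar with that pattern, a minimal sketch (the Config struct is invented for illustration):

    #include <nlohmann/json.hpp>
    #include <string>

    namespace app {
      struct Config {
        std::string host;
        int port = 0;
      };

      // Found via ADL, so nlohmann can convert json -> app::Config anywhere.
      void from_json(const nlohmann::json& j, Config& c) {
        j.at("host").get_to(c.host);
        j.at("port").get_to(c.port);
      }
    }

    int main() {
      auto cfg = nlohmann::json::parse(R"({"host":"localhost","port":8080})")
                     .get<app::Config>();
    }

Each module can ship the from_json overloads for its own types, which keeps the parsing code next to the types it belongs to.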
https://github.com/jart/json.cpp/blob/4f0a02dab1af7d81888cf5887c20cf5d71415efa/json.h#L45

The response doesn't tell you the location of the problem in the input.