Very interesting discussion.
I work on a 2D MMORPG for Android.
This is extremely relevant to me, and I have a few questions.

What if you take compression and deserialization out of the picture?
For example, in my server I have a hash-like data structure that gets turned into JSON for browsers and a byte array for mobile clients.

I use the byte array because the data has to be transferred quickly over mobile networks, and packet size matters when every millisecond counts.

To read the data, I simply read the stream of bytes and build the objects I need on the client.
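To make that concrete, here is a minimal Java sketch of the kind of read I mean, assuming a hypothetical packet layout of (int id, float x, float y) and a reusable buffer; the names are made up, not my actual code:

    import java.nio.ByteBuffer;

    final class EntityState {   // reused across packets instead of allocating per update
        int id;
        float x, y;
    }

    final class PacketReader {
        // One long-lived buffer; the hot path does no per-packet allocations.
        private final ByteBuffer buf = ByteBuffer.allocateDirect(64 * 1024);

        // Decode one hypothetical (int id, float x, float y) packet into a reused object.
        void decode(byte[] incoming, int length, EntityState out) {
            buf.clear();
            buf.put(incoming, 0, length);
            buf.flip();
            out.id = buf.getInt();
            out.x  = buf.getFloat();
            out.y  = buf.getFloat();
        }
    }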
This has to happen mostly without allocations, on Android for example, to avoid the GC.

So, a few questions:
Does deserializing JSON cause any memory allocations?
If you're not tokenizing the data and don't need to parse it, will it be a significant gain over a serialized byte protocol or JSON?

In any case, I'll experiment on my end and perhaps blog about my own findings.
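For the experiment, something along these lines is roughly what I have in mind, comparing org.json (as shipped on Android) against a fixed binary layout; illustrative only, not a rigorous benchmark:

    import org.json.JSONException;
    import org.json.JSONObject;
    import java.nio.ByteBuffer;

    public class ParseCompare {
        public static void main(String[] args) throws JSONException {
            String json = "{\"id\":7,\"x\":1.5,\"y\":2.5}";
            byte[] bin = ByteBuffer.allocate(12).putInt(7).putFloat(1.5f).putFloat(2.5f).array();

            long sink = 0;

            long t0 = System.nanoTime();
            for (int i = 0; i < 100_000; i++) {
                JSONObject o = new JSONObject(json);   // allocates per record
                sink += o.getInt("id");
            }
            long jsonNs = System.nanoTime() - t0;

            ByteBuffer buf = ByteBuffer.wrap(bin);     // reused, no per-record allocation
            long t1 = System.nanoTime();
            for (int i = 0; i < 100_000; i++) {
                buf.rewind();
                sink += buf.getInt();
                buf.getFloat();
                buf.getFloat();
            }
            long binNs = System.nanoTime() - t1;

            System.out.println("json ns=" + jsonNs + ", binary ns=" + binNs + ", sink=" + sink);
        }
    }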