There are times when data should be JSON, and there are times when it really shouldn't be. 165GB of compressed JSON is simply insane. More than 53 bytes to record a timestamp with 11 digits after the decimal point of the second?! I'm lucky if my DNS query latency is below 40ms, and I can't imagine a single case where that degree of precision is relevant. Someone missed taking a basic statistics course. Just... wow. I hope someone from Merklemap sees this and reforms their bloated ways.

I wonder how many dollars of cloud storage they're burning on this waste?
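To put a rough number on the overhead: here's a sketch comparing a hypothetical JSON timestamp field with 11 fractional digits (I'm assuming an ISO-8601-style string; I don't know Merklemap's actual field layout) against the same instant packed as an int64 of nanoseconds since the epoch. Note that nanoseconds already give you 9 fractional digits, so even the binary form is more precise than anyone's DNS latency.

```python
import struct

# Hypothetical field layout (the real format is an assumption):
# a JSON key/value pair holding an ISO-8601 timestamp with 11
# fractional digits, vs. an 8-byte signed int of nanoseconds
# since the Unix epoch.
json_field = '"timestamp":"2024-01-15T12:34:56.12345678901Z"'
binary_field = struct.pack("<q", 1705322096_123456789)

print(len(json_field.encode()))  # 46 bytes as JSON text
print(len(binary_field))         # 8 bytes as binary
```

Even before compression, that's roughly a 5-6x difference per field, and the JSON version still can't represent the instant as a number without losing the sub-nanosecond digits.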