This question is really not specific.<p>In scientific and high-performance computing, people regularly invent new file formats. Many of these decisions also follow paradigms such as "We don't like XML or complexity, so let's pretend it's 1980 and serialise the data as ASCII, one datum per line".<p>Don't forget that your viewpoint comes from your community. If you are a CAD person, you probably never use JSON. If you do data research, you probably never use XML. If you do hardware development, you probably open any file with a hex editor anyway, because to you the data is usually just a bitstream.
I generally use JSON - interoperable with almost any language.<p>Use line-delimited JSON (jsonl) if you need to store lots of records in a single file -- gzip that file if size is important. foo.json.gz is a common data interchange format supported by data warehousing systems and the like.
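<p>A minimal sketch of that pattern in Python, assuming some hypothetical records and the file name records.jsonl.gz (just placeholders) - one JSON object per line, gzip-compressed, read back without loading the whole file into memory:

    import gzip
    import json

    # Hypothetical records; any list of JSON-serialisable dicts works.
    records = [
        {"id": 1, "name": "alpha"},
        {"id": 2, "name": "beta"},
    ]

    # Write one JSON object per line (jsonl / ndjson), gzip-compressed.
    with gzip.open("records.jsonl.gz", "wt", encoding="utf-8") as f:
        for record in records:
            f.write(json.dumps(record) + "\n")

    # Read it back line by line, streaming rather than loading everything at once.
    with gzip.open("records.jsonl.gz", "rt", encoding="utf-8") as f:
        loaded = [json.loads(line) for line in f]

    assert loaded == records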
For scientific data acquisition I extended an extensible standard format and write data to the files in real time while displaying the signal. It's not as easy as it sounds.