This is a fairly bizarre way to rank academic institutions. First, the methodology almost entirely neglects computer science because its metric of success is papers published in top-tier journals; computer scientists tend to submit to conferences (e.g., NeurIPS) and hence get no credit for their work in this count. Other fields, meanwhile, seem over-represented in the list of journals, which is likely why Cold Spring Harbor, a very good biology lab where probably the vast majority of papers appear in Nature-approved venues, comes out looking so elite.

The "normalization" they use divides an institution's proportional author count across articles in the Nature Index by the institution's total output in the sciences, as measured by a company called Dimensions. This has the odd effect of penalizing institutions for publishing outside the listed journals; a rough sketch of the arithmetic is at the end of this comment.

Finally, speaking as an academic: I have published in some of the journals on the index, but many venues I have published in did not make the cut. More specialized journals are sometimes necessary; one cannot easily publish, for example, a detailed proof of a theorem in Nature, even if the result is very important.

List of journals: https://www.natureindex.com/faq#introduction1
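
For concreteness, here is a minimal sketch of how I understand the adjusted metric to work. The per-article fraction and the exact way the credit is summed and divided are my assumptions, not something I have verified against their methodology page; it is only meant to show why a large denominator drags the score down.

    # Minimal sketch (my assumptions, not the official Nature Index formula):
    # each indexed article contributes fractional credit equal to the
    # institution's authors divided by all authors; the credits are summed and
    # then divided by the institution's total science output per Dimensions.

    def article_credit(institution_authors: int, total_authors: int) -> float:
        """Fractional credit for one indexed article."""
        return institution_authors / total_authors

    def adjusted_score(articles: list[tuple[int, int]], dimensions_output: int) -> float:
        """Summed fractional credit, normalized by total publication output.

        articles: (institution_authors, total_authors) for each indexed article.
        dimensions_output: total papers attributed to the institution by Dimensions.
        """
        credit = sum(article_credit(inst, tot) for inst, tot in articles)
        return credit / dimensions_output

    # A small lab publishing mostly in indexed journals beats a large university
    # with identical credit but far more total output.
    small_lab = adjusted_score([(3, 6), (2, 4), (5, 5)], dimensions_output=200)
    big_university = adjusted_score([(3, 6), (2, 4), (5, 5)], dimensions_output=20000)
    print(small_lab, big_university)  # same credit, very different adjusted scores

Under this reading, every paper an institution publishes outside the indexed journals inflates the denominator without adding anything to the numerator, which is exactly the penalty described above.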