Seems reasonable to me. My guess is that it's not performance, but rather predictability, that matters here. Being able to detect when a page meaningfully changes is probably useful for Google, and a good implementation of Math.random() would potentially thwart that. Especially seeing how many pages have the magic constant in them...

Also, it's probably useful for determining that two pages are the same, which may be needed to keep the crawler from crawling a million paths into a SPA that don't actually exist, for example.