With defer_javascript on, browser-reported page load times tell less of the story. The problem is that what browsers report to your analytics service is the time that elapses before the onload event, while defer_javascript [1] postpones JavaScript execution until the onload event fires. This means that with defer_javascript off you were counting JavaScript execution time, but once you turned it on you stopped counting it (a sketch below makes this concrete).

We're trying to optimize something like "time until the page is visually complete and usable", and there's not currently a good metric for that. Speed Index [2] captures visual completeness well, but I don't know of any algorithmic way to measure time-until-usable.

(I work on the PageSpeed team at Google, mostly on ngx_pagespeed.)

[1] https://developers.google.com/speed/docs/mod_pagespeed/filter-js-defer

[2] https://sites.google.com/a/webpagetest.org/docs/using-webpagetest/metrics/speed-index
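
To make the measurement gap concrete, here's a minimal sketch of the kind of timing most analytics snippets collect via the Navigation Timing API. The "/analytics" endpoint is made up for illustration; real services differ in detail but measure the same window:

    // Typical "page load time": elapsed time from navigation start to onload.
    window.addEventListener('load', function () {
      var t = window.performance.timing;
      var pageLoadMs = t.loadEventStart - t.navigationStart;
      // With defer_javascript on, the deferred scripts only begin running
      // at onload, so none of their execution time lands inside pageLoadMs.
      // The beacon URL and payload here are illustrative, not a real service.
      navigator.sendBeacon('/analytics', 'page_load_ms=' + pageLoadMs);
    });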