I'm sorry for voicing my ignorance, but the article mentions a high-performance timer with microsecond granularity.

Are these things actually useful to anyone? I was under the impression that most operating system timers have a granularity on the order of a few milliseconds, so any extra precision wouldn't really be usable.

It seems strange to me that we'd care about microsecond resolution in JavaScript.
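
For reference, here's roughly how I'd picture the difference, assuming the article is talking about something like performance.now() from the High Resolution Time API (the workload and numbers below are just for illustration):

    // Rough sketch: comparing a whole-millisecond clock (Date.now) with
    // the fractional-millisecond clock exposed by performance.now().
    // Assumes a browser or Node.js environment where `performance` exists.
    function timeWork(label, now) {
      const start = now();
      let acc = 0;
      for (let i = 0; i < 10000; i++) {
        acc += Math.sqrt(i); // arbitrary short workload to time
      }
      const elapsed = now() - start;
      console.log(`${label}: ${elapsed} ms (checksum ${acc})`);
    }

    // Date.now() ticks in whole milliseconds, so a short loop usually
    // reports 0 or 1 ms; performance.now() returns fractional
    // milliseconds, so the same loop might show something like 0.3 ms.
    timeWork("Date.now       ", () => Date.now());
    timeWork("performance.now", () => performance.now());

If the coarse clock really does round a tight loop down to 0 ms, I can at least see why people benchmarking small pieces of code would want the finer resolution, but that's the only use case that comes to mind.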