Seems like "this is why Javascript sucks!" is a common reaction here, which I find strange in a "hacker" community. It's a fun little brainteaser, not production code. Use your head a little?
My question is, *when* did that start working in Javascript?

Does this trick work in Brendan Eich's first JS implementation in Netscape 2.0? Which came first, the spec or the implementation?
This sort of thing really bothers me about Javascript. Using the unary + operator on an array should be an error. Hiding errors behind implicit type conversions doesn't help me fix them.

You may say that users don't need to see strange error messages they don't understand. Quite right; what we need instead is a way for browsers to transmit uncaught JS exceptions back to the server.
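For anyone who hasn't traced through the coercion rules, here's roughly what's happening (a minimal illustration, not anyone's production code):

    // Unary + applies ToNumber. For an array, that calls toString(),
    // which joins the elements, then converts the resulting string.
    +[]        // []    -> ""    -> 0
    +[7]       // [7]   -> "7"   -> 7
    +[1, 2]    // [1,2] -> "1,2" -> NaN (silently, no error)

    // The same silent coercions power the brainteaser:
    ![]        // arrays are truthy objects, so this is false
    +![]       // false -> 0
    +!![]      // true  -> 1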
Thought about it some and came up with a way to convert any natural number you want into this form.

https://gist.github.com/1531201

It depends on underscore.js for the functional bits.
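For the curious, the core idea presumably looks something like this sketch (the function name is mine, and it uses plain loops instead of underscore.js, so it's not the gist's actual code):

    // Build an expression out of nothing but [, ], + and !
    // that evaluates to the natural number n.
    function encodeNaturalNumber(n) {
      if (n === 0) return "+[]";          // +[] coerces [] to 0
      var terms = [];
      for (var i = 0; i < n; i++) {
        terms.push("+!+[]");              // +[] is 0, !0 is true, +true is 1
      }
      return terms.join(" + ");           // n ones summed together
    }

    // encodeNaturalNumber(3) === "+!+[] + +!+[] + +!+[]"
    // eval(encodeNaturalNumber(3)) === 3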
This is the reason I prefer strongly typed languages. Allowing developers to play fast and loose with data types only leads to less maintainability down the road and makes code difficult to read.