Essential rule: ALWAYS treat large ID numbers in JSON as strings. Produce them as strings on the server, so the browser treats them as strings during JSON decoding.<p>(Fortunately, if they're IDs that need to stay exact, you probably don't need to do any math on them! Although a coworker of mine once had to write a bitwise XOR function in JavaScript that operated on large ID numbers represented as strings. That was fun...)
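To make the failure mode concrete, here's a quick sketch (the field names are made up):<p><pre><code> // A numeric ID silently loses precision in JSON.parse:
 JSON.parse('{"id": 90071992547409921}').id            // 90071992547409920 -- wrong!
 // The same ID produced as a string survives intact:
 JSON.parse('{"id_str": "90071992547409921"}').id_str  // "90071992547409921"
</code></pre>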
"Eich: No one expects the rounding errors you get - powers of five aren't representable well. They round poorly in base-two. So dollars and cents, sums and differences, will get you strange long zeros with a nine at the end in JavaScript. There was a blog about this that blamed Safari and Mac for doing math wrong and it's IEEE double -- it's in everything, Java and C"<p>-Peter Seibel Interview with Brendan Eich "who under intense time pressure, created JavaScript", Coders at Work, p.136
A lot of these issues apply equally to integer overflow (the loop-termination example and the "real web application disaster"). This is a deeper issue that extends beyond JavaScript into most languages: in general, programmers have to be aware that the numbers in a language's runtime model may not work the way theoretical, mathematical numbers do.<p>Of course, floating-point imprecision produces more surprising behavior than integer overflow on the whole, but the danger is there no matter what language you use (excepting languages with full numeric towers, like Scheme).
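You can even get classic 32-bit wraparound inside JavaScript itself, since the bitwise operators coerce their operands to int32:<p><pre><code> (2147483647 + 1) | 0   // -2147483648 -- two's-complement wraparound
 (4294967295 + 1) | 0   // 0
</code></pre>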
This is not the only area in JS where it pays to be careful with numbers.<p>For example, the parseInt function mentioned in the article does magic base detection. Your string is parsed as base 10, unless it begins with '0x', in which case base 16 is used, or (on older engines) it begins with a leading '0', in which case it is treated as base 8.<p>That last case has stung me several times when parsing user input into numbers: the user types a leading zero, and you magically end up with an octal conversion. (ES5 removed the octal interpretation from parseInt, but legacy engines still do it.)<p>In any case it's sensible to specify the radix whenever you use parseInt, as in parseInt(numberString, 10);
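Concretely (the octal line depends on the engine's age, per the above):<p><pre><code> parseInt("42")        // 42
 parseInt("0x1A")      // 26 -- hex, triggered by the 0x prefix
 parseInt("010")       // 8 on legacy engines (octal!), 10 on ES5+
 parseInt("010", 10)   // 10, always -- explicit radix
</code></pre>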
This bit a lot of Twitter integrations hard when tweet IDs exceeded the exactly-representable integer range of JavaScript's floats.<p>All of a sudden, tweet IDs got rounded off and pointed at completely different entries!<p><a href="https://dev.twitter.com/docs/twitter-ids-json-and-snowflake" rel="nofollow">https://dev.twitter.com/docs/twitter-ids-json-and-snowflake</a><p>Other JSON parsers may also be affected.
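Twitter's fix, described on that page, was to ship a parallel string field (id_str) next to the numeric id:<p><pre><code> var t = JSON.parse('{"id": 10765432100123456789, "id_str": "10765432100123456789"}');
 t.id                       // 10765432100123458000 -- rounded
 t.id_str                   // "10765432100123456789" -- exact
 String(t.id) === t.id_str  // false
</code></pre>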
This is one reason I stick to statically typed languages. I do a lot of work with cash values, and without an explicit decimal type the shit hits the fan the first time you hit a float issue.<p>I get a lot of flak on here for that opinion, which is odd.
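If you're stuck in JavaScript, the usual workaround is integer cents, formatted only at the edges (a sketch, not a real money library):<p><pre><code> // Integer cents are exact up to 2^53, so sums stay exact:
 function formatUSD(cents) { return "$" + (cents / 100).toFixed(2); }
 formatUSD(1999 + 1)   // "$20.00" -- no 19.999999... drift
</code></pre>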
I got hit by JavaScript rounding on a project once. The funny thing is, IE8 was okay with it and Chrome was the one that caused issues.<p>Turns out that
1000000 * 8.2 = 8199999.999999999<p>Sometimes we learn the hard way. The bad value was getting written into an XML document and pushed over HTTP to an embedded device that was expecting an int, and it crashed the device. We fixed both bugs.
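The usual band-aid on the JavaScript side, assuming an integer is what you want, is to round explicitly before serializing:<p><pre><code> 1000000 * 8.2              // 8199999.999999999
 Math.round(1000000 * 8.2)  // 8200000
</code></pre>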
1. IDs are strings, not numbers. That was the real problem Twitter had; the size of an ID was only vaguely related to the number of tweets.<p>2. An event counter is not going to reach 2^50.
<p><pre><code> I have determined by experience that 9007199254740993
 (which is 2^53+1) is the smallest non-representable
 integer in JavaScript.
</code></pre>
Here's something to drop into your browser console as an amusing illustration of this<p><pre><code> 9007199254740994 ===
 (9007199254740995 - 1)   // false
</code></pre>
and<p><pre><code> 9007199254740995 ===
 (9007199254740995 - 1)   // true!
</code></pre>
(The literal 9007199254740995 is itself rounded to the nearest representable double, 9007199254740996, and subtracting 1 from that rounds straight back to the same value.)
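Newer engines expose the boundary directly; in ES2015+ you can ask for it:<p><pre><code> Number.MAX_SAFE_INTEGER   // 9007199254740991, i.e. 2^53 - 1
</code></pre>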
Interestingly, GWT gets around this problem entirely by emulating the 'long' datatype with a pair of 32-bit numbers. It slows calculation down a little, but you always get correct values.<p><a href="https://developers.google.com/web-toolkit/doc/latest/DevGuideCodingBasicsCompatibility" rel="nofollow">https://developers.google.com/web-toolkit/doc/latest/DevGuid...</a>
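The idea, very roughly (this is a sketch, not GWT's actual emitted code), is to carry between two unsigned 32-bit halves by hand:<p><pre><code> // Hypothetical 64-bit add over {hi, lo} halves; >>> 0 keeps each
 // half in unsigned 32-bit range.
 function add64(a, b) {
   var lo = (a.lo + b.lo) >>> 0;          // low word, wrapped mod 2^32
   var carry = lo < a.lo ? 1 : 0;         // it wrapped, so carry one
   var hi = (a.hi + b.hi + carry) >>> 0;  // high word, wrapped mod 2^32
   return { hi: hi, lo: lo };
 }
 add64({hi: 0, lo: 4294967295}, {hi: 0, lo: 1})   // {hi: 1, lo: 0}
</code></pre>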
> <i>Javascript doesn’t have integer type but lets you think it has</i><p>Actually, JS has integer typed arrays.<p><pre><code> new Uint8Array([1,100,300])
 [1, 100, 44]            // 300 wraps mod 2^8
 new Uint16Array([-1,2,300])
 [65535, 2, 300]         // -1 wraps mod 2^16
 new Uint32Array([-1,2,300])
 [4294967295, 2, 300]    // -1 wraps mod 2^32
</code></pre>
Here's the spec:<p><a href="https://www.khronos.org/registry/typedarray/specs/latest/#7" rel="nofollow">https://www.khronos.org/registry/typedarray/specs/latest/#7</a>
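Note that the conversion happens on store, not during arithmetic, so wrapping only kicks in when you write a value back:<p><pre><code> var a = new Uint8Array(1);
 a[0] = 255;
 a[0] = a[0] + 1;   // a[0] + 1 is computed as a double (256),
 a[0]               // then wrapped on store: 0
</code></pre>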