"Truncating the fast (but risky) way" talks about the ~~
(double bitwise NOT) trick as a (probably) faster alternative to Math.trunc()
, both in the typing and in the execution.
But the ~
(bitwise NOT) operator, which is the basis of ~~
, exposes a dangerous hidden corner of JavaScript: Not all numbers are created equal.
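As a quick refresher, here's the trick on everyday values, where the two agree (an illustrative snippet, with the output a typical console would show):

Math.trunc(3.7)   // -> 3
~~3.7             // -> 3
Math.trunc(-3.7)  // -> -3
~~-3.7            // -> -3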
Let's begin with a fact that many developers forget: JavaScript has only one number type. Every integer and decimal number you encounter in your work is stored as a double-precision (64-bit) IEEE 754 floating-point number, no exceptions. You can visit this Wikipedia page for the gory details, or Annotated ES5 Section 8.5 for a quick summary. That said, 18,437,736,874,454,810,624 (18.5 quintillion) possible values ought to be enough for anybody.
Except only integers from -(2^53-1) to (2^53-1) are guaranteed an exact representation in this format. In ES6, these limits are enshrined as constants:
Number.MIN_SAFE_INTEGER // -> -(2^53-1)
Number.MAX_SAFE_INTEGER // -> (2^53-1)
Pre-ES6 implementations will have to explicitly use -(Math.pow(2,53)-1) and Math.pow(2,53)-1.
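If you're stuck on one of those older engines, a hand-rolled version might look like the sketch below (the names MAX_SAFE, MIN_SAFE and isSafeInt are mine, purely for illustration):

// Hand-rolled safe-integer limits for pre-ES6 engines
var MAX_SAFE = Math.pow(2, 53) - 1;  // 9007199254740991
var MIN_SAFE = -MAX_SAFE;            // -9007199254740991

// Rough check that a value is an integer inside the safe range
function isSafeInt(n) {
  return typeof n === "number" &&
         Math.floor(n) === n &&
         n >= MIN_SAFE && n <= MAX_SAFE;
}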
"But gromgit," you say, "I do integer math way bigger than that, and they sure look right!"
On the contrary:
Number.MAX_SAFE_INTEGER
=> 9007199254740991
Number.MIN_SAFE_INTEGER
=> -9007199254740991
Number.MAX_SAFE_INTEGER - Number.MIN_SAFE_INTEGER + 1
=> 18014398509481984
Try the above in any browser you care to, going back as far as you want. If you haven't spotted the problem yet, it's this: odd + odd + odd = even?!?!
You can blame the inexactness of large floating-point number representation for this. Still, 18,014,398,509,481,983 (18 quadrillion) exact integers ain't bad.
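If you want to see that inexactness directly, nudge just past the boundary and watch distinct integers collapse onto the same double (same REPL exercise as above):

Number.MAX_SAFE_INTEGER + 1
=> 9007199254740992
Number.MAX_SAFE_INTEGER + 2
=> 9007199254740992
Number.MAX_SAFE_INTEGER + 1 === Number.MAX_SAFE_INTEGER + 2
=> true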
Except the more experienced among us remember hardware-supported bitwise shortcut operations from our years in the programming trenches. Sure enough, JavaScript imports stuff like << (bitwise left shift) and >> (bitwise right shift) for quick multiplication and division by powers of two, bit-twiddling with & and |, and of course ~, that sort of thing. Fast, predictable...and every single one defined over 32 bits.
So if I had a 2TB cloud storage allocation, and wanted to double it:
alloc = 2000000000 // I hate it when storage vendors shortchange me 8-)
alloc << 1 // -> -294967296 (wait, I **owe** them space?!?!)
alloc * 2 // -> 4000000000 (oh, ok. phew!)
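That same 32-bit wrap-around is exactly what makes the ~~ trick from the opening risky: the moment a value outgrows an int32, ~~ and Math.trunc() part ways. A sketch of the failure mode:

bytes = 10000000000   // ~10GB, still well inside the 53-bit safe range
Math.trunc(bytes)     // -> 10000000000
~~bytes               // -> 1410065408 (ToInt32 wraps it modulo 2^32)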
Why would anyone build it this way? It's called engineering.
Back in the day, a young fellow named Brendan Eich had ten days to come up with a new language to embed in the then-new Netscape Navigator browser. (I used it, way back then, and it was pretty much the only browser game in town, so yeah, it was good-by-default.) It was a rush job, to put it kindly, but as with many developments in computer history, it was "fit to purpose", so everyone chose to ignore this 64-53-32 disparity. Perhaps addressing this inelegance was tabled at standards-forming time, but it's likely that no one was fiddling with gargantuan numbers in JavaScript, so "good enough for web browsing" won the day.
In Brendan's own colorful recollection:
[...] bignums were not in the cards. JS had to "look like Java" only less so, be Java's dumb kid brother or boy-hostage sidekick. Plus, I had to be done in ten days or something worse than JS would have happened.
And so here we are today, stuck with:
double by default, int under the hood, and bitwise ops are 32-bit int (uint if you use >>>). I blame Java.
Perhaps ES10 will arrive in an age where even 64-bit is considered "barely functional", and move swiftly to correct this oversight with 128-bit bitwise operators. Better yet, proper arithmetic and bit-level support of bignums will banish this horrible reality into legend. Until then, remember:
we're stuck with 64-bit floats
and int-53 just gets my goat
but bang some bits and you will see
large nums deflate to -3
-- gromgit