This post discusses how FeatureBase uses bit-sliced indexes to significantly reduce the number of bitmaps needed to represent a range of integer values, and how applying range encoding to those indexes enables lightning-fast range queries.
Interesting, but I feel like they took the long way around to arrive at a fairly "obvious" representation in the end: slices of the binary-encoded bits.

The big question is how the performance actually compares against simply scanning an array of ints.
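For anyone unfamiliar with the idea, here is a minimal sketch of that "slices of the binary encoded bits" representation (my own toy illustration, not FeatureBase's actual implementation): instead of one bitmap per distinct value, you keep one bitmap per bit position, so k bitmaps cover values 0..2^k - 1, and a range predicate becomes a handful of bitwise operations over the slices:

```python
def build_bsi(values, bits):
    """Bitmap j has row i set iff bit j of values[i] is set.
    Python ints stand in for the bitmaps here."""
    slices = [0] * bits
    for i, v in enumerate(values):
        for j in range(bits):
            if (v >> j) & 1:
                slices[j] |= 1 << i
    return slices

def less_than(slices, n, threshold):
    """Rows whose value < threshold, using only bitwise ops on the slices."""
    all_rows = (1 << n) - 1
    lt, eq = 0, all_rows  # "definitely less" / "still tied with threshold" masks
    for j in reversed(range(len(slices))):
        if (threshold >> j) & 1:
            # threshold has a 1 here: tied rows with a 0 bit fall below it
            lt |= eq & ~slices[j] & all_rows
            eq &= slices[j]
        else:
            # threshold has a 0 here: tied rows must also have a 0 to stay tied
            eq &= ~slices[j] & all_rows
    return lt

values = [3, 7, 1, 6, 2]
slices = build_bsi(values, bits=3)
result = less_than(slices, len(values), 4)
# rows with value < 4 are indices 0, 2, 4 -> bitmap 0b10101
```

Whether the per-slice bitwise loop beats a plain scan presumably comes down to compression and SIMD-friendliness of the bitmap containers, which is exactly the performance question above.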