Be careful with theoretical asymptotic complexity (big O) when it is meant to describe execution time. For example, if your algorithm is O(1) but internally calls a function of higher complexity, such as a malloc() implemented in O(log n), then your algorithm's real time complexity is O(log n), not O(1). It can be even worse: an algorithm that is constant time <i>on average</i> or in the <i>typical</i> case may in reality be O(n), e.g. when a hash table insert triggers reindexation. That is why many big data structures that must behave in real time, including the indexes of most SQL databases, are implemented as trees or tree hierarchies (division/clustering) rather than as one big hash table.
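To make that worst case concrete, here is a minimal sketch in C (hypothetical toy code, not taken from any real library): an open-addressing hash table whose insert is amortized O(1) but which doubles and reindexes itself when it gets about 75% full, so the insert that triggers the rehash touches every stored key. Timing each insert (here with POSIX clock_gettime(), and an arbitrary reporting threshold) makes the latency spikes visible:

<pre>
/* Toy open-addressing hash table: amortized O(1) insert, O(n) rehash. */
#include &lt;stdio.h&gt;
#include &lt;stdlib.h&gt;
#include &lt;stdint.h&gt;
#include &lt;time.h&gt;

struct table {
    uint64_t *keys;   /* 0 marks an empty slot, so keys start at 1 */
    size_t    cap;    /* always a power of two */
    size_t    used;
};

static size_t slot(uint64_t key, size_t cap)
{
    /* Fibonacci hashing: multiply by 2^64 / golden ratio, mask to cap. */
    return (key * 11400714819323198485ull) &amp; (cap - 1);
}

static void insert(struct table *t, uint64_t key);

static void rehash(struct table *t)
{
    struct table bigger = {
        .keys = calloc(t-&gt;cap * 2, sizeof *t-&gt;keys),
        .cap  = t-&gt;cap * 2,
        .used = 0,
    };
    /* O(n): every existing key must be moved to its new slot. */
    for (size_t i = 0; i &lt; t-&gt;cap; i++)
        if (t-&gt;keys[i])
            insert(&amp;bigger, t-&gt;keys[i]);
    free(t-&gt;keys);
    *t = bigger;
}

static void insert(struct table *t, uint64_t key)
{
    if (t-&gt;used * 4 &gt;= t-&gt;cap * 3)   /* load factor above 0.75: grow */
        rehash(t);
    size_t i = slot(key, t-&gt;cap);
    while (t-&gt;keys[i])                /* linear probing */
        i = (i + 1) &amp; (t-&gt;cap - 1);
    t-&gt;keys[i] = key;
    t-&gt;used++;
}

int main(void)
{
    struct table t = { .keys = calloc(8, sizeof(uint64_t)), .cap = 8, .used = 0 };
    struct timespec a, b;

    for (uint64_t key = 1; key &lt;= 1000000; key++) {
        clock_gettime(CLOCK_MONOTONIC, &amp;a);
        insert(&amp;t, key);
        clock_gettime(CLOCK_MONOTONIC, &amp;b);
        long ns = (b.tv_sec - a.tv_sec) * 1000000000L + (b.tv_nsec - a.tv_nsec);
        if (ns &gt; 100000)  /* arbitrary threshold: report inserts over 0.1 ms */
            printf("insert #%llu took %ld ns (rehash)\n",
                   (unsigned long long)key, ns);
    }
    free(t.keys);
    return 0;
}
</pre>

Running this prints a handful of slow inserts, roughly one per doubling, and each one is slower than the last because the rehash copies more keys. A balanced tree would instead spread that cost as a steady O(log n) per insert, which is exactly the predictability real-time systems need.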