I built a tokenizer in C++ with Python bindings that outperforms HuggingFace's tokenizers by 10x on large inputs. It's optimized for minimal memory usage and low latency.

Benchmarks and a comparison are included in the README. Would love feedback or contributions!
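For anyone curious how a throughput comparison like this is typically measured: a minimal sketch below, using a trivial whitespace tokenizer as a stand-in (the project's actual binding API isn't shown here, and the real benchmark methodology is whatever the README describes):

```python
import time

def whitespace_tokenize(text):
    # Stand-in tokenizer; substitute the real binding's tokenize call here.
    return text.split()

def throughput(tokenize, text, repeats=5):
    """Tokens processed per second, best-of-N to reduce timer noise."""
    best = float("inf")
    n_tokens = 0
    for _ in range(repeats):
        start = time.perf_counter()
        tokens = tokenize(text)
        elapsed = time.perf_counter() - start
        best = min(best, elapsed)
        n_tokens = len(tokens)
    return n_tokens / best

# Synthetic large input (~400k tokens), since the claim is about large inputs.
corpus = "the quick brown fox " * 100_000
print(f"{throughput(whitespace_tokenize, corpus):,.0f} tokens/sec")
```

Running the same harness against both tokenizers on identical input is what makes a "10x" figure comparable; best-of-N timing avoids counting one-time warmup costs.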